# Asymptotic Notations

Asymptotic notation is a mathematical tool used to describe the behavior and growth rate of functions. It is commonly used in computer science and algorithm analysis to analyze the efficiency and performance of algorithms. There are three commonly used asymptotic notations: Big O notation (O), Omega notation (Ω), and Theta notation (Θ). Let’s explore each notation in detail:

## Big O notation (O):

Big O notation provides an upper bound on the growth rate of a function. It describes the worst-case scenario or the upper limit of how the function behaves as the input size approaches infinity. In simple terms, it represents the maximum growth rate of a function.

Notation: O(g(n))

Explanation: O(g(n)) denotes that the growth rate of the function is no worse than g(n). It means that the function’s running time or space complexity grows at most as fast as g(n), disregarding constant factors and lower-order terms. Mathematically, if a function f(n) is said to be O(g(n)), there exist positive constants c and n₀ such that f(n) ≤ c * g(n) for all values of n ≥ n₀.

Example: If we say an algorithm has a time complexity of O(n^2), it means that the algorithm’s running time grows no faster than the square of the input size. Note that this is only an upper bound: an O(n^2) algorithm is at most quadratic, but may in practice run in linear or even constant time.
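The formal definition above can be checked numerically. The function f below is a hypothetical cost function chosen for illustration; the constants c = 4 and n₀ = 10 are one valid choice of witnesses for the definition of O(n^2):

```python
# Hypothetical cost function, assumed for illustration: f(n) = 3n^2 + 10n.
def f(n):
    return 3 * n**2 + 10 * n

def g(n):
    return n**2

# Definition of O(g): f(n) <= c * g(n) for all n >= n0.
# Here c = 4 and n0 = 10 work, because 10n <= n^2 once n >= 10.
c, n0 = 4, 10
assert all(f(n) <= c * g(n) for n in range(n0, 1000))
```

Any larger c (or larger n₀) would also work; the definition only requires that *some* pair of constants exists.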

## Omega notation (Ω):

Omega notation provides a lower bound on the growth rate of a function. It describes the lower limit of how the function behaves as the input size approaches infinity, i.e. the minimum growth rate of a function. (Lower bounds are often quoted for an algorithm's best case, but a lower bound and the best case are distinct concepts.)

Notation: Ω(g(n))

Explanation: Ω(g(n)) denotes that the growth rate of the function is no better than g(n). It means that the function’s running time or space complexity grows at least as fast as g(n), disregarding constant factors and lower-order terms. Mathematically, if a function f(n) is said to be Ω(g(n)), there exist positive constants c and n₀ such that f(n) ≥ c * g(n) for all values of n ≥ n₀.

Example: If we say an algorithm has a time complexity of Ω(n), it means that the algorithm’s running time grows at least linearly with the input size. The actual growth may be linear or faster; Ω(n) only rules out anything slower than linear.
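As with Big O, the Ω definition can be verified by exhibiting constants. The cost function here is again a hypothetical chosen for illustration:

```python
# Hypothetical cost function, assumed for illustration: f(n) = 2n + 1.
def f(n):
    return 2 * n + 1

# Definition of Ω(n): f(n) >= c * n for all n >= n0.
# Here c = 2 and n0 = 1 work, since 2n + 1 >= 2n for every n.
c, n0 = 2, 1
assert all(f(n) >= c * n for n in range(n0, 1000))
```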

## Theta notation (Θ):

Theta notation provides both an upper and a lower bound on the growth rate of a function. It describes the tightest bound, i.e. the exact growth rate of the function up to a constant factor. Note that a tight bound is not the same thing as average-case behavior: Θ can describe the best, worst, or average case, depending on which running time is being bounded.

Notation: Θ(g(n))

Explanation: Θ(g(n)) denotes that the growth rate of the function is bounded both from above and below by g(n). It means that the function’s running time or space complexity grows at the same rate as g(n), disregarding constant factors and lower-order terms. Mathematically, if a function f(n) is said to be Θ(g(n)), there exist positive constants c₁, c₂, and n₀ such that c₁ * g(n) ≤ f(n) ≤ c₂ * g(n) for all values of n ≥ n₀.

Example: If we say an algorithm has a time complexity of Θ(n), it means that the algorithm’s running time grows at the same rate as the input size, up to constant factors. In other words, it is linear: both O(n) and Ω(n) hold.
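The two-sided definition can be checked the same way, now with a pair of constants sandwiching the function. The cost function is again a hypothetical example:

```python
# Hypothetical cost function, assumed for illustration: f(n) = 3n + 7.
def f(n):
    return 3 * n + 7

# Definition of Θ(n): c1 * n <= f(n) <= c2 * n for all n >= n0.
# Here c1 = 3, c2 = 4, n0 = 7 work, because 3n + 7 <= 4n once n >= 7.
c1, c2, n0 = 3, 4, 7
assert all(c1 * n <= f(n) <= c2 * n for n in range(n0, 1000))
```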

## Properties of asymptotic notation:

**Transitivity**

Asymptotic notations follow the transitive property, which means if f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)). This property allows us to compare the complexities of different algorithms and their combinations.
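Transitivity also tells us what the combined constant is: if f(n) ≤ c₁·g(n) and g(n) ≤ c₂·h(n), then f(n) ≤ c₁·c₂·h(n). A small numeric check with illustrative functions (these are assumptions, not from the text):

```python
# Illustrative functions: f(n) = 2n, g(n) = n^2, h(n) = n^3.
f = lambda n: 2 * n
g = lambda n: n * n
h = lambda n: n ** 3

# f = O(g) with c1 = 2, and g = O(h) with c2 = 1 (both from n0 = 1 on).
# Transitivity: f = O(h), with witness constant c1 * c2 = 2.
c1, c2, n0 = 2, 1, 1
assert all(f(n) <= c1 * g(n) for n in range(n0, 500))
assert all(g(n) <= c2 * h(n) for n in range(n0, 500))
assert all(f(n) <= c1 * c2 * h(n) for n in range(n0, 500))
```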

**Non-negativity**

Asymptotic notations are always non-negative. Time and space complexities cannot be negative values.

**Symmetry**

Theta notation is symmetric: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)). Big O and Omega instead satisfy what is called transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)). In other words, an upper bound on f in terms of g is the same statement as a lower bound on g in terms of f.

**Reflexivity**

Asymptotic notations are reflexive, meaning f(n) = O(f(n)), f(n) = Ω(f(n)), and f(n) = Θ(f(n)). This property implies that any function is asymptotically bounded by itself.

**Lower Order Terms**

Asymptotic notations disregard constant factors and lower-order terms in the analysis. They focus on the dominant term that influences the growth rate as the input size approaches infinity. This property simplifies the analysis and allows for a broader classification of algorithms based on their growth rates.
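The dominance of the leading term is easy to see numerically. Using a hypothetical cost function f(n) = 3n^2 + 5n + 2 chosen for illustration, the ratio f(n) / n^2 approaches the constant factor 3 as n grows, showing that the lower-order terms 5n + 2 become negligible:

```python
# Hypothetical cost function: dominant term 3n^2 plus lower-order terms 5n + 2.
def f(n):
    return 3 * n**2 + 5 * n + 2

# As n grows, f(n) / n^2 approaches the constant factor 3,
# which Big O analysis discards anyway.
for n in (10, 100, 10_000):
    print(n, f(n) / n**2)
```

This is why 3n^2 + 5n + 2, 3n^2, and n^2 all belong to the same class, Θ(n^2).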