Growth of functions

Foundations of mathematics

Introduction

Understanding the growth of functions is essential to learning concepts such as computational complexity (in the case of algorithms) and convergence (in the case of random variables). In this article, we introduce the tools and notation to estimate and represent the relative rates of growth of functions.

Asymptotically tight bounds

Let us describe a set of functions of the variable \( n \) that grow at the same rate as the function \( g(n) \), up to constant factors, for all \( n \ge n_0 \). Say \( f(n) \) is one such function. For \( f(n) \) to grow at the same rate as \( g(n) \) up to constant factors, there must exist positive constants \( c_1 \), \( c_2 \), and \( n_0 \) such that for all \( n \ge n_0 \),

$$ 0 \le c_1 g(n) \le f(n) \le c_2 g(n). $$

The following figure shows an example of such a function. Notice how \( f(n) \) stays bounded between constant multiples of \( g(n) \).

There can be many functions that lie within such constant factors of \( g(n) \) for all \( n \ge n_0 \). We call the set of all such functions \( \Theta(g(n)) \). We express the fact that \( f(n) \) belongs to this set by writing \( f(n) = \Theta(g(n)) \). We say that \( g(n) \) is an asymptotically tight bound for \( f(n) \).
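As a quick sanity check, the definition can be tested numerically. The sketch below is only an illustration, not part of the formal definition: the functions \( f(n) = 3n^2 + 2n \) and \( g(n) = n^2 \), and the witness constants \( c_1 = 3 \), \( c_2 = 4 \), \( n_0 = 2 \), are all chosen by hand for this example.

```python
# Numerically verify f(n) = Theta(g(n)) for hand-picked witness constants.
# Here f(n) = 3n^2 + 2n and g(n) = n^2; the constants c1 = 3, c2 = 4,
# and n0 = 2 are chosen for this illustration.

def f(n):
    return 3 * n**2 + 2 * n

def g(n):
    return n**2

c1, c2, n0 = 3, 4, 2

# Check 0 <= c1*g(n) <= f(n) <= c2*g(n) for a range of n >= n0.
for n in range(n0, 10_000):
    assert 0 <= c1 * g(n) <= f(n) <= c2 * g(n)

print("f(n) = Theta(n^2) holds for all sampled n >= 2")
```

Of course, a finite check like this cannot prove the bound for all \( n \); here the algebra does that, since \( 3n^2 \le 3n^2 + 2n \) always, and \( 3n^2 + 2n \le 4n^2 \) exactly when \( n \ge 2 \).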

The \( \BigOsymbol \)-notation

The \( \Theta \)-notation is useful when we can find a function \( g(n) \) that both upper bounds and lower bounds the function \( f(n) \) up to constant factors for all \( n \ge n_0 \).

Sometimes, we can only find an upper bound. In that case, there exist positive constants \( c \) and \( n_0 \) such that for all \( n \ge n_0 \),

$$ 0 \le f(n) \le c g(n) $$

We call \( g(n) \) an asymptotic upper bound for \( f(n) \). We describe the relationship as \( f(n) = \BigO{g(n)} \).
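For a concrete (hand-picked) example, \( f(n) = 5n + 3 \) is \( \BigO{n} \): the constants \( c = 6 \) and \( n_0 = 3 \) witness the bound, since \( 5n + 3 \le 6n \) exactly when \( n \ge 3 \). A minimal numerical check:

```python
# Numerically verify f(n) = O(g(n)) with hand-picked witness constants.
# f(n) = 5n + 3, g(n) = n, c = 6, n0 = 3: 5n + 3 <= 6n iff n >= 3.

def f(n):
    return 5 * n + 3

def g(n):
    return n

c, n0 = 6, 3

# Check 0 <= f(n) <= c*g(n) for a range of n >= n0.
for n in range(n0, 10_000):
    assert 0 <= f(n) <= c * g(n)

print("f(n) = O(n) holds for all sampled n >= 3")
```

Note that \( \BigOsymbol \) demands only the upper inequality; nothing in the check above requires \( f(n) \) to also be bounded below by a multiple of \( g(n) \).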

Note that if a function can be both upper and lower bounded by \( g(n) \), then it is still in \( \BigO{g(n)} \). This means

$$ f(n) = \Theta(g(n)) \Rightarrow f(n) = \BigO{g(n)} $$

\( \Theta \) is a symmetric relation

Suppose \( f(n) = \Theta(g(n)) \). This means there exist positive constants \( c_1 \) and \( c_2 \) such that for all \( n \ge n_0 \),

$$ 0 \le c_1 g(n) \le f(n) \le c_2 g(n) $$

Note that if \( c_1 g(n) \le f(n) \), then \( g(n) \le \frac{1}{c_1} f(n) \). Similarly, \( f(n) \le c_2 g(n) \) means that \( \frac{1}{c_2} f(n) \le g(n) \). These two inequalities can be combined as

$$ 0 \le \frac{1}{c_2} f(n) \le g(n) \le \frac{1}{c_1} f(n) $$

Thus, \( g(n) = \Theta(f(n)) \), with constants \( \frac{1}{c_2} \) and \( \frac{1}{c_1} \). The asymptotic relation \( \Theta \) is therefore symmetric. This means,

$$ f(n) = \Theta(g(n)) \text{ if and only if } g(n) = \Theta(f(n)) $$
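The reciprocal constants from this derivation can be checked numerically. Reusing the illustrative functions \( f(n) = 3n^2 + 2n \) and \( g(n) = n^2 \) with \( c_1 = 3 \), \( c_2 = 4 \), \( n_0 = 2 \) (all hand-picked), the flipped bound should hold with \( \frac{1}{c_2} = \frac{1}{4} \) and \( \frac{1}{c_1} = \frac{1}{3} \):

```python
# Symmetry of Theta: flip the witness constants to their reciprocals.
# f(n) = 3n^2 + 2n = Theta(n^2) with c1 = 3, c2 = 4, n0 = 2, so
# g(n) = n^2 should be Theta(f(n)) with constants 1/c2 and 1/c1.

def f(n):
    return 3 * n**2 + 2 * n

def g(n):
    return n**2

c1, c2, n0 = 3, 4, 2

for n in range(n0, 10_000):
    # Original direction: c1*g(n) <= f(n) <= c2*g(n)
    assert c1 * g(n) <= f(n) <= c2 * g(n)
    # Flipped direction: (1/c2)*f(n) <= g(n) <= (1/c1)*f(n)
    assert f(n) / c2 <= g(n) <= f(n) / c1

print("g(n) = Theta(f(n)) with reciprocal constants, as derived")
```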

Transitivity of asymptotic bounds

Suppose \( f(n) = \BigO{g(n)} \). This means there exists a positive constant \( c \) such that for all \( n \ge n_0 \),

$$ 0 \le f(n) \le c g(n) $$

Now suppose there is another function \( h(n) \) that upper bounds \( g(n) \), that is, \( g(n) = \BigO{h(n)} \). By definition, there is some positive constant \( d \) such that

$$ 0 \le g(n) \le d h(n) $$

Collectively, the two inequalities imply that

$$ 0 \le f(n) \le cd h(n) $$

Thus, it is the case that \( f(n) = \BigO{h(n)} \).

This means,

$$ f(n) = \BigO{g(n)} \text{ and } g(n) = \BigO{h(n)} \text{ imply } f(n) = \BigO{h(n)} $$
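The combined constant \( cd \) can also be checked numerically. In the sketch below, all functions and constants are chosen just for illustration: \( f(n) = 5n + 3 = \BigO{n} \) with \( c = 6 \) for \( n \ge 3 \), and \( g(n) = n = \BigO{n^2} \) with \( d = 1 \), so \( f(n) \le 6 \cdot 1 \cdot h(n) \) for \( n \ge 3 \).

```python
# Transitivity of O: chain two upper bounds into one.
# f(n) = 5n + 3 <= c*g(n) with g(n) = n,   c = 6 (for n >= 3), and
# g(n) = n     <= d*h(n) with h(n) = n^2,  d = 1 (for n >= 1).
# Then f(n) <= c*d*h(n) for n >= 3.

def f(n):
    return 5 * n + 3

def g(n):
    return n

def h(n):
    return n**2

c, d, n0 = 6, 1, 3

for n in range(n0, 10_000):
    assert f(n) <= c * g(n)          # f = O(g)
    assert g(n) <= d * h(n)          # g = O(h)
    assert f(n) <= c * d * h(n)      # therefore f = O(h)

print("f(n) = O(h(n)) with combined constant c*d")
```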

Similar transitivity properties can be derived for the other asymptotic bounds. We list them here.

$$ f(n) = \Omega(g(n)) \text{ and } g(n) = \Omega(h(n)) \text{ imply } f(n) = \Omega(h(n)) $$

$$ f(n) = \Theta(g(n)) \text{ and } g(n) = \Theta(h(n)) \text{ imply } f(n) = \Theta(h(n)) $$

$$ f(n) = \smallo{g(n)} \text{ and } g(n) = \smallo{h(n)} \text{ imply } f(n) = \smallo{h(n)} $$

$$ f(n) = \omega(g(n)) \text{ and } g(n) = \omega(h(n)) \text{ imply } f(n) = \omega(h(n)) $$
