Are all functions differentiable? Let's find out through interactive examples.

This learning module has many interactive demos. It is easier to work with them on a larger screen.
Bookmark and revisit if you are currently on a small screen device.

To understand differentiability and smoothness, we recommend familiarity with the concepts of limits, continuity, and the derivative.

Follow the links above to first get acquainted with these concepts.

To compute the derivative, we take the limit of the difference quotient as \( h \to 0 \).

$$ m_x = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} $$
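The limit above can be explored numerically by shrinking \( h \) and watching the difference quotient settle. A minimal sketch (not part of the module itself), using \( f(x) = x^2 \), whose true derivative at \( x = 3 \) is \( 6 \):

```python
def difference_quotient(f, x, h):
    """The slope of the secant line through (x, f(x)) and (x+h, f(x+h))."""
    return (f(x + h) - f(x)) / h

f = lambda x: x ** 2

# As h shrinks, the quotient approaches f'(3) = 2 * 3 = 6.
for h in [1e-1, 1e-3, 1e-5]:
    print(h, difference_quotient(f, 3.0, h))
```

For \( f(x) = x^2 \) the quotient simplifies to \( 2x + h \), so the printed values creep toward 6 exactly as the limit predicts.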

But remember that a limit does not always exist.

So, if that limit exists for a function, we can compute its derivative.
Functions for which the limit exists are known as **differentiable** functions.
The remaining functions are known as **non-differentiable**.

We already made the case that **discontinuous functions are not differentiable** at their point(s) of discontinuity.
But are all continuous functions differentiable? Find out next.

The absolute value function \( f(x) = |x| \) appears very commonly in machine learning. It is defined as follows

$$ f(x) = |x| = \begin{cases} x, \text{ if } x \ge 0 \\ -x, \text{ otherwise } \end{cases} $$

It is a continuous function. So, \( \lim_{x \to 0} f(x) \) exists. (Verify this. Do it. You will be happier.)

But for the derivative, the mere existence of \( \lim_{x \to 0} f(x) \) is not sufficient.

We want the limit \( \lim_{h \to 0} \frac{|x+h| - |x|}{h} \) to exist. Does that exist?

It exists everywhere, except at \( x = 0 \). (Verify this. And give yourself a pat on the back when you do.)
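One way to verify this for yourself numerically (a sketch of my own, not the module's code) is to compare the difference quotient of \( |x| \) at \( x = 0 \) from the right (\( h > 0 \)) and from the left (\( h < 0 \)):

```python
def difference_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

for h in [1e-2, 1e-4, 1e-6]:
    right = difference_quotient(abs, 0.0, h)   # approach from h -> 0+
    left = difference_quotient(abs, 0.0, -h)   # approach from h -> 0-
    print(h, right, left)
# right is always +1 and left is always -1, no matter how small h gets;
# the two one-sided limits disagree, so the two-sided limit does not exist.
```

Because the quotient is \( |h|/h \), it is locked at \( +1 \) on one side and \( -1 \) on the other, which is exactly why \( |x| \) is not differentiable at \( x = 0 \).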

However, in such cases, we can still differentiate on the portions where the function **is differentiable**.
These derivatives are

$$ f'(x) = \begin{cases} 1, \text{ if } x > 0 \\ -1, \text{ if } x < 0 \\ \text{Not differentiable if } x = 0 \end{cases}$$

Thus, the slope of the function stays constant on either side of \( x = 0 \). At \( x = 0 \), the function is not differentiable.

In optimization, such situations are handled by introducing the concept of the **subgradient**, also known as the **subderivative**; the set of all subgradients at a point is called the **subdifferential**.

A full discussion of subgradients is out of scope here, so we will make only a brief comment. Intuitively, if the function \( f(x) \) is not differentiable at the point \( x_0 \), a subderivative is the slope of any line through the point \( (x_0, f(x_0)) \) that either touches the graph of \( f(x) \) or stays entirely below it (for a convex function) or entirely above it (for a concave function).
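For the convex function \( f(x) = |x| \), this means any slope \( g \in [-1, 1] \) is a subgradient at \( x_0 = 0 \): the line \( y = g \cdot x \) passes through the origin and never rises above the graph. A rough numerical check (my own illustrative sketch, testing the subgradient inequality \( f(y) \ge f(0) + g \cdot y \) at random points):

```python
import random

def is_subgradient_at_zero(g, samples=1000):
    """Check f(y) >= f(0) + g*y for f(x) = |x| at many random points y."""
    return all(abs(y) >= g * y
               for y in (random.uniform(-10, 10) for _ in range(samples)))

print(is_subgradient_at_zero(0.5))   # slope 0.5 is in [-1, 1]: stays below |x|
print(is_subgradient_at_zero(1.5))   # slope 1.5 cuts above |x| for y > 0
```

This is why subgradients are useful in optimization: even where the derivative of \( |x| \) fails to exist, the subdifferential \( [-1, 1] \) still gives usable slope information.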

So, there are continuous functions that are not differentiable everywhere. What should we call them? Let's first try to understand these functions geometrically.

The entire idea of the derivative relies on being able to zoom into a function to arrive at a line segment. What if no matter how much you magnify, you cannot arrive at a line segment?

We saw that this happens at a gap: a discontinuity. It also happens when, no matter how much you zoom, the function looks like two straight line segments with different slopes meeting at a point. In such a scenario, the equation for the slope presented earlier no longer holds. If the slope cannot be calculated, then the derivative cannot be calculated either.

This is possible for a continuous function, as can be seen for our example of \( f(x) = |x| \).
In that sense, the function \( f(x) = |x| \) is *not smooth* at \( x = 0 \).

Thus, a function is **smooth at a point** if it is continuous and differentiable at that point.
A function is **smooth within an interval** if it is smooth at every point in that interval.

So, an easy way to remember these ideas is this.

- if the limit of a function at a point exists and equals the function's value there, then the function is continuous at that point
- if the derivative of a function exists at a point, then the function is smooth at that point

Thus, smoothness has a stronger requirement than continuity. If a function is smooth at a point, then it is definitely also continuous at that point. Continuity does not imply smoothness.

We explore smoothness further in our article on higher order derivatives.

Now that you are an expert in derivatives, explore the counterpart to derivatives — integrals.

Already a calculus expert? Check out comprehensive courses on multivariate calculus, machine learning, or deep learning.

Help us create more engaging and effective content and keep it free of paywalls and advertisements!

Stay up to date with new material **for free**.