What a Limit Isn’t

In my calculus class yesterday, my students and I talked about how we could compute the derivative of a non-polynomial function. With appropriate prodding, they came up with the idea of taking the limit of the slopes of secant lines as the distance between the points of intersection vanishes. That raises an important question, though: what is a limit?
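In symbols, that idea is the standard difference-quotient limit (recorded here for reference): the secant through $(a,f(a))$ and $(a+h,f(a+h))$ has slope $\frac{f(a+h)-f(a)}{h}$, and letting the horizontal distance $h$ between the two points vanish gives

$$f'(a)=\lim_{h\to 0}\frac{f(a+h)-f(a)}{h}.$$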

I asked each student to write down what they thought the definition of a limit should be based on their intuition and the discussion we had had about derivatives that motivated limits in the first place. I read their answers and classified them into three main groups by similarity. Then today we talked about why they were wrong.

Wrong Definitions of Limits

(1) A limit is the value of $f(x)$ when $|x-a|\approx 0$.

This wrong definition has some promise. It recognizes that a limit is a value and that it is taken of a function. It also makes nice use of the absolute value of a difference to measure distance. However, it has a glaring weakness: it is not well-defined. If $f(x)$ is non-constant, then that function likely takes on infinitely many values near $x=a$. Which one is the limit? We cannot define a limit as the value of a function if the function takes multiple values. The students quickly acknowledged the error.
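To make the issue concrete (my own illustration, not one of the students' write-ups): take $f(x)=x^2$ near $a=1$. Both $x=0.99$ and $x=1.01$ satisfy $|x-1|\approx 0$, yet

$$f(0.99)=0.9801 \qquad\text{and}\qquad f(1.01)=1.0201,$$

so “the value of $f(x)$ when $|x-1|\approx 0$” does not single out one number.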

(2) A limit is a value that the function can almost reach but is unable to.

As with the previous wrong definition, this one recognizes that a limit is a value and that it comes from a function. However, an easy example pokes a hole in it: what is $\lim_{x\to 1} 1$? My students agreed that the function $y=1$ is approaching the value $1$, so the limit ought to be $1$. But this wrong definition would exclude $1$ as the limit, since the function $y=1$ does, in fact, reach $1$. So, we conclude that the definition of a limit should allow the function to equal the value of the limit.

[Figure: a constant function]


(3) A limit is the value that a function approaches as the input approaches a desired number.

This wrong definition is perhaps the closest to being correct. It is certainly the most instructive to negate. First, a clarification is in order. The word “approaches” is synonymous with “getting closer to”. With that in mind, is it not true that $\lim_{x\to 0} x^2=-1$? After all, as $x$ gets closer to $0$, $x^2$ does get closer to $-1$. My students agreed that $\lim_{x\to 0} x^2$ is actually $0$ but conceded that $-1$ would be a correct limit given this working definition.

[Figure: parabola approaching a line]

Getting It Right

Fundamentally, the third working definition is flawed because it is centered on the idea of getting closer, but a limit needs to be about being close. The function $x^2$ does get closer to $-1$ as $x\to 0$, but it does not get close.
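A one-line computation pins this down (added here for emphasis): for every $x$,

$$\left|x^2-(-1)\right| = x^2+1 \ge 1,$$

so the distance from $x^2$ to $-1$ does shrink toward $1$ as $x\to 0$, but it never gets below $1$, let alone arbitrarily small.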

Now that we have the key idea down, we are not far from the true definition of a limit.

Limits are about being close. How close should we be? No single measure of closeness is enough: a function getting within $1$ or $0.1$ or $0.01$ or $0.000000001$ of $L$ does not make $L$ its limit. So, we need the function to get within any given positive distance of $L$.
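Here is an example of the failure (mine, in the spirit of the constant function above): the constant function $f(x)=0.001$ stays within $0.01$ of $0$ at every input, yet

$$\lim_{x\to a} 0.001 = 0.001 \neq 0 \quad\text{for every } a,$$

so meeting one fixed tolerance is not enough to make $0$ the limit.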

On the other hand, we cannot expect the entire function to output values within the given distance of $L$. What if the range of $f(x)$ is all real numbers? We need to restrict our attention to $x$-values near the target input $x=a$.
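To see why the restriction helps (another added illustration): take $f(x)=x$ and ask whether $\lim_{x\to 0} x=0$. The outputs of $f$ cover all real numbers, so $f$ is certainly not within, say, $0.1$ of $0$ at every input; but if we only look at inputs with $0<|x-0|<0.1$, then

$$|f(x)-0| = |x| < 0.1,$$

which is exactly the kind of restricted closeness we want.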

Once we put those pieces together, we get the familiar $\varepsilon$-$\delta$ definition of $\lim_{x\to a} f(x)=L$:

$$\forall\varepsilon>0\ \exists\delta>0:\quad 0<|x-a|<\delta\implies |f(x)-L|<\varepsilon$$

This definition of a limit certainly looks daunting, but it is much easier for students to swallow when they recognize that it does, in fact, agree with their intuition. The primary challenge is helping students figure out what they were thinking all along.
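To connect the definition back to the parabola from class, here is a short verification (a sketch I am adding, following the standard argument) that $\lim_{x\to 0} x^2=0$: given any $\varepsilon>0$, choose $\delta=\sqrt{\varepsilon}$. Then

$$0<|x-0|<\delta \implies |x^2-0| = x^2 < \delta^2 = \varepsilon,$$

so the outputs stay within the requested distance $\varepsilon$ of $0$ whenever the inputs are within $\delta$ of $0$, exactly as the definition demands.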