Limits

The concept of a limit or limiting process, essential to the understanding of calculus, has been around for thousands of years. In fact, early mathematicians used a limiting process to obtain better and better approximations of areas of circles. Yet, the formal definition of a limit—as we know and understand it today—did not appear until the late 19th century. We therefore begin our quest to understand limits, as our mathematical ancestors did, by using an intuitive approach. At the end of this chapter, armed with a conceptual understanding of limits, we examine the formal definition of a limit.

Definition of Limit

The limit of a function at a point is the value that the function approaches as the input approaches that point.

Suppose f(x) is defined when x is near the number a. (This means that f is defined on some open interval that contains a, except possibly at a itself.) Then we write

\lim_{x \to a} f(x) = L

and say "the limit of f(x), as x approaches a, equals L" if we can make the values of f(x) arbitrarily close to L (as close to L as we like) by restricting x to be sufficiently close to a (on either side of a) but not equal to a.
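
A minimal numerical sketch of this idea, using the assumed example f(x) = (x² − 1)/(x − 1) with a = 1 and L = 2 (this function is chosen here for illustration and does not appear in the text): the function is undefined at x = 1, yet its values get arbitrarily close to 2 as x gets close to 1 from either side.

def f(x):
    # Assumed example function; undefined at x = 1, limit 2 as x -> 1.
    return (x**2 - 1) / (x - 1)

a = 1.0
for h in [0.1, 0.01, 0.001, 0.0001]:
    # Approach a from both sides without ever evaluating f at a itself.
    print(f"x = {a - h:.4f} -> f(x) = {f(a - h):.6f}    "
          f"x = {a + h:.4f} -> f(x) = {f(a + h):.6f}")

Running this prints values such as 1.9 and 2.1, then 1.99 and 2.01, and so on, closing in on L = 2 from both sides.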

In this video, we learn how to compute the limit of a function involving a radical by multiplying by the conjugate of the radical expression. The function is undefined at x = 1, but if we make this statement and no other, we give a very incomplete picture of how the function behaves in the vicinity of x = 1. To describe the behavior of the graph near x = 1 more completely, we need to introduce the concept of a limit.
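
A sketch of the conjugate computation described in the video, using the assumed example f(x) = (√x − 1)/(x − 1), which is undefined at x = 1 (the source does not state the specific function):

\lim_{x \to 1} \frac{\sqrt{x} - 1}{x - 1}
  = \lim_{x \to 1} \frac{(\sqrt{x} - 1)(\sqrt{x} + 1)}{(x - 1)(\sqrt{x} + 1)}
  = \lim_{x \to 1} \frac{x - 1}{(x - 1)(\sqrt{x} + 1)}
  = \lim_{x \to 1} \frac{1}{\sqrt{x} + 1}
  = \frac{1}{2}

Multiplying by the conjugate removes the radical from the numerator, allowing the common factor x − 1 to cancel so the limit can be evaluated directly.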