Limits
The concept of a limit or limiting process, essential to the understanding of calculus, has been around for thousands of years. In fact, early mathematicians used a limiting process to obtain better and better approximations of areas of circles. Yet, the formal definition of a limit—as we know and understand it today—did not appear until the late 19th century. We therefore begin our quest to understand limits, as our mathematical ancestors did, by using an intuitive approach. At the end of this chapter, armed with a conceptual understanding of limits, we examine the formal definition of a limit.
Definition of Limit
The limit of a function at a point is the value that the function approaches as the input approaches that point.
Suppose f(x) is defined when x is near the number a. (This means that f is defined on some open interval that contains a, except possibly at a itself.) Then we write

\lim_{x \to a} f(x) = L

and say "the limit of f(x), as x approaches a, equals L" if we can make the values of f(x) arbitrarily close to L (as close to L as we like) by restricting x to be sufficiently close to a (on either side of a) but not equal to a.
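To see this intuitive approach in action, the short Python sketch below tabulates values of a sample function near a point where it is not even defined. The function f(x) = (x^2 - 1)/(x - 1), the point a = 1, and the sample inputs are our own choices for illustration, not part of the definition above.

# Numerically estimating lim_{x -> 1} f(x) for the illustrative
# function f(x) = (x**2 - 1)/(x - 1), which is undefined at x = 1.
def f(x):
    return (x**2 - 1) / (x - 1)

# Sample x-values ever closer to a = 1, from the left and the right.
for h in [0.1, 0.01, 0.001, 0.0001]:
    print(f"f({1 - h}) = {f(1 - h):.6f}    f({1 + h}) = {f(1 + h):.6f}")
# Both columns of output approach 2, suggesting that the limit as x
# approaches 1 is 2, even though f(1) itself is undefined.

Note that the table only suggests a value for the limit; confirming it rigorously requires the formal definition taken up at the end of this chapter.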