Understanding Derivatives: A Comprehensive Guide
Derivatives are fundamental in mathematics, enabling us to articulate the changing nature of the world. To comprehend natural phenomena and their governing laws, we require effective tools, and calculus, particularly differentiation and derivatives, serves this purpose.
In this article, we will explore the following topics:
- My Initial Encounter with Derivatives
- The Concept of Slope
- Examples and Properties Derived from First Principles
- f(x) = x^a
- Notation
- Linearity
- Rules of Differentiation
- The Product Rule
- The Quotient Rule
- The Chain Rule
- L'Hôpital's Rule
- Taylor Series
- Applications of Taylor Series
My Initial Encounter with Derivatives
I vividly recall my first experience with derivatives during my high school years. It was a summer weekend, and I was faced with some tedious homework. To tackle it, I brewed a cup of coffee and sat under the apple tree in my parents' garden, hoping for inspiration—though no apples fell to spark any ideas.
At that time, I had little appreciation for math, viewing it as mundane and lifeless. My understanding was quite limited.
As I sat there, I began reading about calculating slopes of function graphs and their applications in optimization problems. Although I struggled with the theory, I had a sophisticated calculator that handled most computations for me.
My father joined me and briefly glanced at a particularly frustrating problem I was grappling with. Within moments, he provided me with the answer. I was incredulous, as I had spent a considerable amount of time programming my calculator to solve it, while he solved it effortlessly.
Eventually, I arrived at the same solution, and I was astonished by his swift reasoning. This pivotal moment motivated me to excel in mathematics, leading me to devour every math book I could find, even those beyond my current level. I began with derivatives and calculus, and within a year, I had made significant progress.
I share this story to encourage anyone who feels they lack a solid foundation in math. It's possible to delve into this topic while simultaneously brushing up on the basics.
The Concept of Slope
For a linear polynomial, commonly known as a line, the slope remains constant, independent of the input variable. This characteristic defines a line.
Calculus begins with generalizing the slope concept to non-linear functions. Consider the function:
f(x) = x³ - x² + x.
Imagine drawing a line that intersects the graph at two points; this line is termed a secant.
The slope of such a line is Δf/Δx. If we wish to determine the slope of a secant line through the points (x, f(x)) and (x + Δx, f(x + Δx)), it can be computed as follows:

Δf/Δx = (f(x + Δx) - f(x)) / Δx
This expression gives the slope of the secant. The idea is that as we let the second point approach the first (by letting Δx approach 0), zooming in on the first intersection point reveals that the graph of f looks more and more like the secant.
In fact, as Δx approaches 0, the secant becomes a line that just touches the graph of f at that point. This line is known as the tangent to f at the point (x, f(x)). As we zoom in further, the tangent and the graph of f become indistinguishable.
Thus, we define the slope of f at that point as the limit of the slope of the secant as Δx approaches 0, which is the slope of the tangent. This limit gives us the derivative of f, and the process of calculating the derivative is termed differentiation.
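As a quick numerical illustration (not from the original article), we can watch the secant slope settle toward the tangent slope as Δx shrinks, using the cubic from above; the point x = 2 is an arbitrary choice:

```python
# f(x) = x^3 - x^2 + x, whose exact derivative is f'(x) = 3x^2 - 2x + 1.

def f(x):
    return x**3 - x**2 + x

def secant_slope(f, x, dx):
    """Slope of the secant through (x, f(x)) and (x + dx, f(x + dx))."""
    return (f(x + dx) - f(x)) / dx

x = 2.0
exact = 3 * x**2 - 2 * x + 1  # f'(2) = 9
for dx in (1.0, 0.1, 0.01, 0.001):
    print(dx, secant_slope(f, x, dx))  # slopes approach 9 as dx shrinks
```

Each halving of Δx brings the printed slope closer to the exact tangent slope, which is precisely the limit process described above.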
You may have heard the expression "to go off on a tangent." This phrase originates from mathematics, where moving along a tangent takes you further from the graph.
Unlike a line, a non-linear function does not have a constant slope; its derivative is a variable function itself. We denote the derivative of f at the point x by f'(x), and its formula is given by

f'(x) = lim(Δx → 0) (f(x + Δx) - f(x)) / Δx
This definition is valid only for specific functions known as differentiable functions.
Examples and Properties from First Principles
In this section, we will familiarize ourselves with differentiability through examples.
f(x) = x^a
Let's examine the function mentioned above. How would we compute its derivative using the definition?
We start with a simple case: f(x) = x². Plugging into the definition,

(f(x + Δx) - f(x)) / Δx = ((x + Δx)² - x²) / Δx = (2xΔx + (Δx)²) / Δx = 2x + Δx,

which tends to 2x as Δx approaches 0.
We have established that the derivative of x² is 2x. With a clever application of the binomial theorem and the definition, we can show that for any real number a, the derivative of x^a is

f'(x) = a·x^(a-1)
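As a sanity check (an illustration I'm adding, not part of the original article), we can compare a finite-difference estimate of the derivative with the power rule's prediction a·x^(a-1) for several exponents; the sample point x = 1.7 is arbitrary:

```python
def derivative(f, x, dx=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + dx) - f(x - dx)) / (2 * dx)

def power_rule(a, x):
    """Slope predicted by the power rule for f(x) = x**a."""
    return a * x ** (a - 1)

for a in (2.0, 3.0, 0.5, -1.0):
    x = 1.7
    est = derivative(lambda t: t ** a, x)
    print(a, est, power_rule(a, x))  # the two columns agree closely
```

Note that the numerical estimate agrees with the rule for fractional and negative exponents too, consistent with the claim that it holds for any real a.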
Notation
At this point, it is fitting to introduce another notation. Calculus was developed independently by Newton and Leibniz, who used distinctly different notations. Generally, Leibniz’s notation is more widely used, although Newton initially received most credit for the discovery. This historical dispute is fascinating, but let's focus on the task at hand.
An alternative way to write the result from the previous example is

d/dx x^a = a·x^(a-1)
Sometimes we omit the parentheses. The d/dx symbol functions as an operator on a function space, mapping functions to their derivatives.
We also have the following notation:

df/dx = f'(x)

This notation signifies that df/dx is the derivative of f, which is itself a function.
Notably, the df/dx symbol resembles a fraction. This resemblance is intentional, as the derivative is defined as the limit of Δf/Δx as Δx approaches 0. Although df and dx appear as infinitely small quantities (referred to as infinitesimals by Leibniz), their ratio represents the slope of a tangent line at a given point.
In cases involving the df/dx symbol, we might replace it with the limit definition, treating Δf and Δx as real quantities before taking the limit. This approach can be beneficial, allowing us to manipulate df/dx as a fraction and apply fraction rules. However, it's crucial to understand that this is more of a mnemonic device than a mathematically rigorous concept, which we will clarify later.
Linearity
In this section, we will prove an important property of the differential operator: linearity.
That is,

(f(x) + g(x))' = f'(x) + g'(x)

and

(c·f(x))' = c·f'(x)

Thus, the derivative of a sum equals the sum of the derivatives, and the derivative of a scalar multiple is the scalar times the derivative.
The proof is straightforward, but since linearity is crucial, I will provide a complete proof. Let h(x) = f(x) + g(x); then

h'(x) = lim(Δx → 0) (f(x + Δx) + g(x + Δx) - f(x) - g(x)) / Δx
      = lim(Δx → 0) (f(x + Δx) - f(x)) / Δx + lim(Δx → 0) (g(x + Δx) - g(x)) / Δx
      = f'(x) + g'(x)
Similarly, if h(x) = c·f(x), then

h'(x) = lim(Δx → 0) c·(f(x + Δx) - f(x)) / Δx = c·f'(x)
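Linearity is easy to observe numerically as well. Here is a small check I'm adding for illustration, with arbitrary choices f(x) = x², g(x) = sin(x), c = 3, and x = 0.8:

```python
import math

def derivative(f, x, dx=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + dx) - f(x - dx)) / (2 * dx)

f = lambda t: t ** 2
g = math.sin
x, c = 0.8, 3.0

# Sum rule: (f + g)' vs f' + g'
print(derivative(lambda t: f(t) + g(t), x), derivative(f, x) + derivative(g, x))

# Scalar rule: (c*f)' vs c * f'
print(derivative(lambda t: c * f(t), x), c * derivative(f, x))
```

Both pairs of printed values agree to many decimal places, mirroring the two identities proved above.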
Differentiation Rules
In this section, we will prove some essential rules that are vital to remember.
These rules are frequently utilized.
The Product Rule
When differentiating a product, a special rule applies:

(f(x)g(x))' = f'(x)g(x) + f(x)g'(x)
Proof.
Let h(x) = f(x)g(x); then

h'(x) = lim(Δx → 0) (f(x + Δx)g(x + Δx) - f(x)g(x)) / Δx
      = lim(Δx → 0) (f(x + Δx)g(x + Δx) - f(x)g(x + Δx) + f(x)g(x + Δx) - f(x)g(x)) / Δx
      = lim(Δx → 0) g(x + Δx)·(f(x + Δx) - f(x)) / Δx + lim(Δx → 0) f(x)·(g(x + Δx) - g(x)) / Δx
      = f'(x)g(x) + f(x)g'(x)

The trick is to add and subtract the term f(x)g(x + Δx); we also use the fact that g(x + Δx) → g(x), since a differentiable function is continuous.
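For a concrete check (an added illustration, with f = sin and g = exp chosen arbitrarily), we can compare both sides of the product rule numerically:

```python
import math

def derivative(f, x, dx=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + dx) - f(x - dx)) / (2 * dx)

f, g = math.sin, math.exp
x = 0.8

lhs = derivative(lambda t: f(t) * g(t), x)          # (f*g)'(x)
rhs = derivative(f, x) * g(x) + f(x) * derivative(g, x)  # f'g + fg'
print(lhs, rhs)  # the two values agree closely
```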
The Quotient Rule
Using a similar approach, we can establish another essential property, the quotient rule:

(f(x)/g(x))' = (f'(x)g(x) - f(x)g'(x)) / g(x)²

valid wherever g(x) ≠ 0.
Proof.
First, note that f(x)/g(x) = f(x)·(1/g(x)), and a direct computation from the definition shows that

(1/g(x))' = -g'(x) / g(x)²

Now we can apply the product rule:

(f(x)/g(x))' = f'(x)·(1/g(x)) - f(x)·g'(x)/g(x)² = (f'(x)g(x) - f(x)g'(x)) / g(x)²
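The quotient rule can be spot-checked the same way (an added illustration; f = sin and g = cos, so f/g is tan and the exact derivative is 1/cos²):

```python
import math

def derivative(f, x, dx=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + dx) - f(x - dx)) / (2 * dx)

f, g = math.sin, math.cos
x = 0.5  # cos(0.5) != 0, so the quotient is defined here

lhs = derivative(lambda t: f(t) / g(t), x)                     # (f/g)'(x)
rhs = (derivative(f, x) * g(x) - f(x) * derivative(g, x)) / g(x) ** 2
print(lhs, rhs, 1 / math.cos(x) ** 2)  # all three values agree closely
```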
The Chain Rule
This remarkable rule, alongside the previous two, enables us to differentiate any differentiable function built from known pieces. This capability is more significant than it may initially seem; as we will discover in later articles, the same does not hold for the reverse operation, integration.
The chain rule states:

(f(g(x)))' = f'(g(x))·g'(x)
I will provide a sketch of the proof using infinitesimals as numbers. While this explanation isn't entirely rigorous, it can be made so with the right arguments. The goal is to present a memorable rule that is easy to recall.
In the expression f(g(x)), we can view f as a function of g. Treating the infinitesimals as quantities whose common factor cancels, we have

df/dx = (df/dg)·(dg/dx) = f'(g(x))·g'(x)
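To see the chain rule in action numerically (an added illustration with the arbitrary composite sin(x²), whose exact derivative is cos(x²)·2x):

```python
import math

def derivative(f, x, dx=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + dx) - f(x - dx)) / (2 * dx)

x = 0.9
lhs = derivative(lambda t: math.sin(t ** 2), x)  # (sin(x^2))' estimated
rhs = math.cos(x ** 2) * 2 * x                   # chain rule: f'(g(x)) * g'(x)
print(lhs, rhs)  # the two values agree closely
```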
L'Hôpital's Rule
L'Hôpital's rule, named after French mathematician Guillaume de l'Hôpital, is a significant and powerful result in calculus, applicable in various situations to find limits.
Before stating and proving the rule, note that the derivative's definition can be expressed as

f'(b) = lim(a → b) (f(a) - f(b)) / (a - b)

This is equivalent to our earlier definition, with the substitutions a = x + Δx and b = x.
The rule can be loosely stated as follows: assume that f(x)/g(x) approaches an indeterminate form (0/0 or ∞/∞) as x approaches some number c, and that the limit of f'(x)/g'(x) exists; then

lim(x → c) f(x)/g(x) = lim(x → c) f'(x)/g'(x)
Proof.
We will prove a specific but useful case of the theorem.
Assume that f and g are continuously differentiable at a real number c, with f(c) = g(c) = 0 and g'(c) ≠ 0. Then

lim(x → c) f(x)/g(x) = lim(x → c) (f(x) - f(c)) / (g(x) - g(c))
                     = lim(x → c) ((f(x) - f(c))/(x - c)) / ((g(x) - g(c))/(x - c))
                     = f'(c)/g'(c)

The last equality holds because each inner quotient tends to the corresponding derivative, by the alternative form of the definition.
In the previous article about calculus, we encountered a challenging limit. Recall that we aimed to find:
Now we possess the tools to solve it. Let's proceed using L'Hôpital's rule.
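The specific limit from the previous article is not reproduced here, so as an illustrative stand-in (my example, not the article's), consider lim(x → 0) (e^x - 1)/x. Both numerator and denominator vanish at 0, and L'Hôpital's rule predicts the limit f'(0)/g'(0) = e⁰/1 = 1, which we can confirm numerically:

```python
import math

def ratio(x):
    """(e^x - 1) / x, a 0/0 indeterminate form at x = 0."""
    return (math.exp(x) - 1) / x

# L'Hopital's rule predicts the limit f'(0)/g'(0) = e^0 / 1 = 1.
for x in (0.1, 0.01, 0.001):
    print(x, ratio(x))  # the values approach 1
```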
Taylor Series
If you're unfamiliar with the concept of a series, don't worry; I will cover that in a future article. For now, feel free to sit back and enjoy the explanation, and if anything seems unclear, we will revisit it later.
One of the most potent tools in real and complex analysis is the fact that analytic functions can be represented by a power series.
If f is infinitely differentiable and locally expressed by a convergent power series at the point x = a, then in some neighborhood of a we have

f(x) = Σ(n = 0 to ∞) f^(n)(a)/n! · (x - a)^n
The notation f^(n) indicates that f has been differentiated n times.
I won't prove this fact here, but readers may be convinced of its validity by differentiating both sides multiple times and substituting x=a or by expressing f as a sum with a remainder term rather than a series, then considering the limit as the sum expands and the remainder term diminishes.
A rigorous proof of this requires careful consideration of boundedness conditions for the derivatives, convergence of the series, and other factors.
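Even without a rigorous proof, the convergence is easy to watch happen. Here is a small illustration I'm adding: partial sums of the Taylor series of e^x around a = 0, evaluated at x = 1, closing in on e:

```python
import math

def taylor_exp(x, n_terms):
    """Partial sum of the Taylor series of e^x around a = 0."""
    return sum(x ** n / math.factorial(n) for n in range(n_terms))

for n in (2, 5, 10, 15):
    print(n, taylor_exp(1.0, n))  # approaches e = 2.718281828...
```

Each additional term shrinks the remainder roughly by a factor of n, which is why the partial sums converge so quickly here.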
Applications of Taylor Series
This series is among the most frequently utilized tools in calculus. It is even more fundamental than initially thought because, in complex analysis, a (complex) function is either not differentiable or differentiable infinitely many times, thus being analytic and possessing a power series representation. Hence, all complex differentiable (or holomorphic) functions have a Taylor series.
Let's use Taylor series to derive one of the most celebrated identities in mathematics: Euler's formula.
First, consider the expression e^(ix), where i is the imaginary unit and x is a real number.
Recall from Taylor series theory that the exponential function can be expressed as a power series that converges everywhere:

e^x = Σ(n = 0 to ∞) x^n/n!
We also utilize Taylor series to define trigonometric functions of complex numbers.
When we expand the Taylor series for the functions e^(ix), sine, and cosine, we obtain:

e^(ix) = Σ(n = 0 to ∞) (ix)^n/n!
sin(x) = Σ(n = 0 to ∞) (-1)^n x^(2n+1)/(2n+1)!
cos(x) = Σ(n = 0 to ∞) (-1)^n x^(2n)/(2n)!
Now it is merely a matter of substituting and grouping the even and odd powers of ix to arrive at Euler's formula:

e^(ix) = cos(x) + i·sin(x)

This identity is frequently employed to transition between polar and Cartesian coordinates.
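The identity is trivial to verify with complex arithmetic; here is a short check I'm adding for illustration, using an arbitrary angle plus the famous special case x = π:

```python
import cmath
import math

x = math.pi / 3
lhs = cmath.exp(1j * x)                     # e^(ix)
rhs = complex(math.cos(x), math.sin(x))     # cos(x) + i*sin(x)
print(lhs, rhs)  # identical up to floating-point rounding

# The special case x = pi gives e^(i*pi) + 1 = 0:
print(cmath.exp(1j * math.pi) + 1)  # essentially zero
```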
This article is the third in a series discussing calculus. The previous articles can be accessed here: