Subject: Calculus

Differentiation Under The Integral Sign

When a function f is defined by an integral, differentiating f puts us in a "taking the derivative of an integral" scenario, which tempts most of us to argue that the answer comes for free because integration and differentiation are inverse processes. Indeed, when f is given by

f(x)=\int_a^x g(t)\, dt

the Fundamental Theorem of Calculus says that the derivative of f is

\frac{d}{dx}f(x)=g(x)
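This inverse relationship is easy to sanity-check numerically. Below is a minimal Python sketch (the helper names and the choice g(t) = cos t are mine): it approximates the integral with Simpson's rule and compares a central-difference derivative of f against g evaluated directly.

```python
import math

def simpson(fn, lo, hi, n=1000):
    """Composite Simpson's rule for the integral of fn over [lo, hi]."""
    if n % 2:
        n += 1
    step = (hi - lo) / n
    total = fn(lo) + fn(hi)
    for i in range(1, n):
        total += fn(lo + i * step) * (4 if i % 2 else 2)
    return total * step / 3

def g(t):
    return math.cos(t)

def f(x):
    # f(x) = integral of g(t) from 0 to x  (here, sin x in closed form)
    return simpson(g, 0.0, x)

x0, h = 1.2, 1e-5
derivative = (f(x0 + h) - f(x0 - h)) / (2 * h)  # numerical d/dx f(x)
print(derivative, g(x0))  # the two values should agree closely
```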

Well, that argument is valid only because x enters the integral solely through its upper limit. Suppose now that the variables of integration and differentiation are not the same: the integrand depends on x as a parameter, while the integration runs over a separate variable t. This time, we have a new function f such that

f(x)=\int_a^b g(x, t) dt

and suppose further that the upper and lower limits are no longer constants. Say a = a(x) and b = b(x), so that

f(x)=\int_{a(x)}^{b(x)} g(x, t) dt

then, by differentiation under the integral sign (the Leibniz integral rule), the derivative of f(x) follows from the multivariable chain rule, treating f as a function of x, a(x), and b(x):

\frac{d}{dx}f(x) = \frac{\partial f}{\partial b}\frac{db}{dx} + \frac{\partial f}{\partial a}\frac{da}{dx} + \int_{a(x)}^{b(x)} \frac{\partial }{\partial x}g(x, t) dt

Since \frac{\partial f}{\partial b} = g(x, b(x)) and \frac{\partial f}{\partial a} = -g(x, a(x)) by the Fundamental Theorem of Calculus, this is equivalent to

g(x, b(x))\cdot b'(x) - g(x, a(x))\cdot a'(x) + \int_{a(x)}^{b(x)} \frac{\partial }{\partial x}g(x, t) dt

Therefore, the derivative of f with respect to x is

f'(x) = g(x, b(x))\cdot b'(x) - g(x, a(x))\cdot a'(x) + \int_{a(x)}^{b(x)} \frac{\partial }{\partial x}g(x, t) dt
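The full rule can likewise be checked numerically. Here is a minimal Python sketch (the helper names and the test choices g(x, t) = sin(xt), a(x) = x, b(x) = x² are my own): it compares a central-difference derivative of f against the three-term Leibniz formula.

```python
import math

def simpson(fn, lo, hi, n=1000):
    """Composite Simpson's rule for the integral of fn over [lo, hi]."""
    if n % 2:
        n += 1
    step = (hi - lo) / n
    total = fn(lo) + fn(hi)
    for i in range(1, n):
        total += fn(lo + i * step) * (4 if i % 2 else 2)
    return total * step / 3

def g(x, t):
    return math.sin(x * t)

def dg_dx(x, t):
    return t * math.cos(x * t)  # partial derivative of g with respect to x

def a(x):
    return x        # lower limit, so a'(x) = 1

def b(x):
    return x * x    # upper limit, so b'(x) = 2x

def f(x):
    # f(x) = integral of g(x, t) dt from a(x) to b(x)
    return simpson(lambda t: g(x, t), a(x), b(x))

x0, h = 1.5, 1e-5
numeric = (f(x0 + h) - f(x0 - h)) / (2 * h)     # numerical d/dx f(x)
leibniz = (g(x0, b(x0)) * (2 * x0)              # g(x, b(x)) * b'(x)
           - g(x0, a(x0)) * 1.0                 # g(x, a(x)) * a'(x)
           + simpson(lambda t: dg_dx(x0, t), a(x0), b(x0)))
print(numeric, leibniz)  # the two values should agree closely
```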

Note that when a and b are constants, their derivatives are zero, so the first two terms of the expression above vanish and the rule reduces to the constant-limit case

f'(x) = \int_a^b \frac{\partial }{\partial x}g(x, t) dt

In other words, the derivative passes directly through the integral sign: the two operators commute, such that

I_tD_x = D_x I_t

where I_t is the integral operator with respect to the variable t and D_x is the differential operator with respect to x.
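For instance, with g(x, t) = xt and constant limits a and b, both orderings give the same result:

```latex
D_x I_t\,(xt) = \frac{d}{dx}\int_a^b xt\, dt
              = \frac{d}{dx}\left[x\,\frac{b^2 - a^2}{2}\right]
              = \frac{b^2 - a^2}{2}

I_t D_x\,(xt) = \int_a^b \frac{\partial}{\partial x}(xt)\, dt
              = \int_a^b t\, dt
              = \frac{b^2 - a^2}{2}
```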

NEXT TOPIC: Trigonometric substitution