Existence of Partial Derivatives Implies Continuity

Derivative of a function with multiple variables

In mathematics, a partial derivative of a function of several variables is its derivative with respect to one of those variables, with the others held constant (as opposed to the total derivative, in which all variables are allowed to vary). Partial derivatives are used in vector calculus and differential geometry.

The partial derivative of a function $f(x, y, \dots)$ with respect to the variable $x$ is variously denoted by

$f_x$, $f'_x$, $\partial_x f$, $D_x f$, $D_1 f$, $\frac{\partial}{\partial x} f$, or $\frac{\partial f}{\partial x}$.

It can be thought of as the rate of change of the function in the $x$-direction.

Sometimes, for $z = f(x, y, \ldots)$, the partial derivative of $z$ with respect to $x$ is denoted as $\tfrac{\partial z}{\partial x}$. Since a partial derivative generally has the same arguments as the original function, its functional dependence is sometimes explicitly signified by the notation, such as in:

$$f'_x(x, y, \ldots), \quad \frac{\partial f}{\partial x}(x, y, \ldots).$$

The symbol used to denote partial derivatives is ∂. One of the first known uses of this symbol in mathematics is by the Marquis de Condorcet in 1770, who used it for partial differences. The modern partial derivative notation was created by Adrien-Marie Legendre (1786), although he later abandoned it; Carl Gustav Jacob Jacobi reintroduced the symbol in 1841.[1]

Definition

Like ordinary derivatives, the partial derivative is defined as a limit. Let $U$ be an open subset of $\mathbb{R}^n$ and $f : U \to \mathbb{R}$ a function. The partial derivative of $f$ at the point $\mathbf{a} = (a_1, \ldots, a_n) \in U$ with respect to the $i$-th variable $x_i$ is defined as

$$\frac{\partial}{\partial x_i} f(\mathbf{a}) = \lim_{h \to 0} \frac{f(a_1, \ldots, a_{i-1}, a_i + h, a_{i+1}, \ldots, a_n) - f(a_1, \ldots, a_i, \ldots, a_n)}{h} = \lim_{h \to 0} \frac{f(\mathbf{a} + h\mathbf{e}_i) - f(\mathbf{a})}{h}$$
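
As a small numerical sketch (not part of the original article), the limit above can be approximated by a difference quotient with a small step $h$; the test function below is an arbitrary choice.

```python
import numpy as np

def partial_derivative(f, a, i, h=1e-6):
    """Approximate the i-th partial derivative of f at the point a
    using the difference quotient (f(a + h*e_i) - f(a)) / h."""
    a = np.asarray(a, dtype=float)
    e_i = np.zeros_like(a)
    e_i[i] = 1.0
    return (f(a + h * e_i) - f(a)) / h

# Illustrative test: f(x, y) = x^2 + x*y + y^2, whose x-partial is 2x + y
f = lambda p: p[0]**2 + p[0] * p[1] + p[1]**2
print(partial_derivative(f, (1.0, 1.0), i=0))   # close to 3.0
```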

Even if all partial derivatives $\partial f/\partial x_i(\mathbf{a})$ exist at a given point $\mathbf{a}$, the function need not be continuous there. However, if all partial derivatives exist in a neighborhood of $\mathbf{a}$ and are continuous there, then $f$ is totally differentiable in that neighborhood and the total derivative is continuous. In this case, $f$ is said to be a $C^1$ function. This can be generalized to vector-valued functions $f : U \to \mathbb{R}^m$ by carefully using a componentwise argument.
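
A standard counterexample (well known, though not given in the text above) shows why mere existence of the partial derivatives is not enough: the function

$$f(x, y) = \begin{cases} \dfrac{xy}{x^2 + y^2} & (x, y) \neq (0, 0) \\ 0 & (x, y) = (0, 0) \end{cases}$$

has $f_x(0, 0) = f_y(0, 0) = 0$, since $f$ vanishes identically on both coordinate axes, yet $f(t, t) = \tfrac{1}{2}$ for every $t \neq 0$, so $f$ is not continuous at the origin.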

The partial derivative $\frac{\partial f}{\partial x}$ can be seen as another function defined on $U$ and can again be partially differentiated. If all mixed second-order partial derivatives are continuous at a point (or on a set), $f$ is termed a $C^2$ function at that point (or on that set); in this case, the partial derivatives can be exchanged by Clairaut's theorem:

$$\frac{\partial^2 f}{\partial x_i \, \partial x_j} = \frac{\partial^2 f}{\partial x_j \, \partial x_i}.$$
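
This symmetry is easy to check symbolically; the following SymPy sketch (the test function is an arbitrary choice, not from the article) compares the two orders of differentiation.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x * y) + sp.sin(x) * y**3   # arbitrary smooth test function

f_xy = sp.diff(f, x, y)   # differentiate with respect to x, then y
f_yx = sp.diff(f, y, x)   # differentiate with respect to y, then x
print(sp.simplify(f_xy - f_yx))   # 0, as Clairaut's theorem predicts
```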

Notation

For the following examples, let $f$ be a function of $x$, $y$, and $z$.

First-order partial derivatives:

$$\frac{\partial f}{\partial x} = f'_x = \partial_x f.$$

Second-order partial derivatives:

$$\frac{\partial^2 f}{\partial x^2} = f''_{xx} = \partial_{xx} f = \partial_x^2 f.$$

Second-order mixed derivatives:

$$\frac{\partial^2 f}{\partial y \, \partial x} = \frac{\partial}{\partial y}\left(\frac{\partial f}{\partial x}\right) = (f'_x)'_y = f''_{xy} = \partial_{yx} f = \partial_y \partial_x f.$$

Higher-order partial and mixed derivatives:

$$\frac{\partial^{i+j+k} f}{\partial x^i \, \partial y^j \, \partial z^k} = f^{(i, j, k)} = \partial_x^i \partial_y^j \partial_z^k f.$$

When dealing with functions of multiple variables, some of these variables may be related to each other, so it may be necessary to specify explicitly which variables are being held constant to avoid ambiguity. In fields such as statistical mechanics, the partial derivative of $f$ with respect to $x$, holding $y$ and $z$ constant, is often expressed as

$$\left(\frac{\partial f}{\partial x}\right)_{y, z}.$$

Conventionally, for clarity and simplicity of notation, the partial derivative function and the value of the function at a specific point are conflated by including the function arguments when the partial derivative symbol (Leibniz notation) is used. Thus, an expression like

$$\frac{\partial f(x, y, z)}{\partial x}$$

is used for the function, while

$$\frac{\partial f(u, v, w)}{\partial u}$$

might be used for the value of the function at the point $(x, y, z) = (u, v, w)$. However, this convention breaks down when we want to evaluate the partial derivative at a point like $(x, y, z) = (17, u + v, v^2)$. In such a case, evaluation of the function must be expressed in an unwieldy manner as

$$\frac{\partial f(x, y, z)}{\partial x}(17, u + v, v^2)$$

or

$$\left. \frac{\partial f(x, y, z)}{\partial x} \right|_{(x, y, z) = (17, u + v, v^2)}$$

in order to use the Leibniz notation. Thus, in these cases, it may be preferable to use the Euler differential operator notation with $D_i$ as the partial derivative symbol with respect to the $i$-th variable. For instance, one would write $D_1 f(17, u + v, v^2)$ for the example described above, while the expression $D_1 f$ represents the partial derivative function with respect to the first variable.[2]

For higher-order partial derivatives, the partial derivative (function) of $D_i f$ with respect to the $j$-th variable is denoted $D_j(D_i f) = D_{i,j} f$. That is, $D_j \circ D_i = D_{i,j}$, so that the variables are listed in the order in which the derivatives are taken, and thus in reverse order of how the composition of operators is usually notated. Of course, Clairaut's theorem implies that $D_{i,j} = D_{j,i}$ as long as comparatively mild regularity conditions on $f$ are satisfied.

Gradient

An important example of a function of several variables is the case of a scalar-valued function $f(x_1, \ldots, x_n)$ on a domain in Euclidean space $\mathbb{R}^n$ (e.g., on $\mathbb{R}^2$ or $\mathbb{R}^3$). In this case $f$ has a partial derivative $\partial f/\partial x_j$ with respect to each variable $x_j$. At the point $\mathbf{a}$, these partial derivatives define the vector

$$\nabla f(\mathbf{a}) = \left(\frac{\partial f}{\partial x_1}(\mathbf{a}), \ldots, \frac{\partial f}{\partial x_n}(\mathbf{a})\right).$$

This vector is called the gradient of f at a. If f is differentiable at every point in some domain, then the gradient is a vector-valued function ∇f which takes the point a to the vector ∇f(a). Consequently, the gradient produces a vector field.

A common abuse of notation is to define the del operator ($\nabla$) as follows in three-dimensional Euclidean space $\mathbb{R}^3$ with unit vectors $\hat{\mathbf{i}}, \hat{\mathbf{j}}, \hat{\mathbf{k}}$:

$$\nabla = \left[\frac{\partial}{\partial x}\right] \hat{\mathbf{i}} + \left[\frac{\partial}{\partial y}\right] \hat{\mathbf{j}} + \left[\frac{\partial}{\partial z}\right] \hat{\mathbf{k}}$$

Or, more generally, for $n$-dimensional Euclidean space $\mathbb{R}^n$ with coordinates $x_1, \ldots, x_n$ and unit vectors $\hat{\mathbf{e}}_1, \ldots, \hat{\mathbf{e}}_n$:

$$\nabla = \sum_{j=1}^{n} \left[\frac{\partial}{\partial x_j}\right] \hat{\mathbf{e}}_j = \left[\frac{\partial}{\partial x_1}\right] \hat{\mathbf{e}}_1 + \left[\frac{\partial}{\partial x_2}\right] \hat{\mathbf{e}}_2 + \dots + \left[\frac{\partial}{\partial x_n}\right] \hat{\mathbf{e}}_n$$
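
As an illustrative sketch (the scalar field is an arbitrary choice, not from the article), the gradient can be assembled symbolically, one partial derivative per coordinate:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**2 * y + sp.cos(z)                                 # arbitrary scalar field

grad_f = sp.Matrix([sp.diff(f, v) for v in (x, y, z)])   # (df/dx, df/dy, df/dz)
print(grad_f.T)   # Matrix([[2*x*y, x**2, -sin(z)]])
```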

Directional derivative

[Figure: A contour plot of $f(x, y) = x^2 + y^2$, showing the gradient vector in black and the unit vector $\mathbf{u}$ scaled by the directional derivative in the direction of $\mathbf{u}$ in orange. The gradient vector is longer because the gradient points in the direction of greatest rate of increase of the function.]

The directional derivative of a scalar function

$$f(\mathbf{x}) = f(x_1, x_2, \ldots, x_n)$$

along a vector

$$\mathbf{v} = (v_1, \ldots, v_n)$$

is the function $\nabla_{\mathbf{v}} f$ defined by the limit[3]

$$\nabla_{\mathbf{v}} f(\mathbf{x}) = \lim_{h \to 0} \frac{f(\mathbf{x} + h\mathbf{v}) - f(\mathbf{x})}{h}.$$

This definition is valid in a broad range of contexts, for example where the norm of a vector (and hence a unit vector) is undefined.[4]
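
The limit above can also be approximated numerically; for a differentiable function it agrees with the dot product $\nabla f \cdot \mathbf{v}$. The following sketch uses an arbitrary function and vector (illustrative assumptions, not from the article).

```python
import numpy as np

def directional_derivative(f, x, v, h=1e-6):
    """Approximate the directional derivative of f at x along v
    with the difference quotient (f(x + h*v) - f(x)) / h."""
    x, v = np.asarray(x, dtype=float), np.asarray(v, dtype=float)
    return (f(x + h * v) - f(x)) / h

f = lambda p: p[0]**2 + p[1]**2               # f(x, y) = x^2 + y^2
print(directional_derivative(f, (1.0, 2.0), (3.0, 4.0)))
# close to 22.0, which matches grad f . v = (2, 4) . (3, 4) = 22
```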

Example

Suppose that f is a function of more than one variable. For instance,

$$z = f(x, y) = x^2 + xy + y^2.$$

[Figure: A graph of $z = x^2 + xy + y^2$. For the partial derivative at $(1, 1)$ that leaves $y$ constant, the corresponding tangent line is parallel to the $xz$-plane.]

[Figure: A slice of the graph above showing the function in the $xz$-plane at $y = 1$. Note that the two axes are shown here with different scales. The slope of the tangent line is 3.]

The graph of this function defines a surface in Euclidean space. To every point on this surface, there are an infinite number of tangent lines. Partial differentiation is the act of choosing one of these lines and finding its slope. Usually, the lines of most interest are those that are parallel to the $xz$-plane, and those that are parallel to the $yz$-plane (which result from holding either $y$ or $x$ constant, respectively).

To find the slope of the line tangent to the function at $P(1, 1)$ and parallel to the $xz$-plane, we treat $y$ as a constant. The graph and this plane are shown on the right. Below, we see how the function looks on the plane $y = 1$. By finding the derivative of the equation while assuming that $y$ is a constant, we find that the slope of $f$ at the point $(x, y)$ is:

$$\frac{\partial z}{\partial x} = 2x + y.$$

So at $(1, 1)$, by substitution, the slope is 3. Therefore,

$$\frac{\partial z}{\partial x} = 3$$

at the point $(1, 1)$. That is, the partial derivative of $z$ with respect to $x$ at $(1, 1)$ is 3, as shown in the graph.
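
The same computation can be checked symbolically; this short SymPy sketch (one convenient tool among many) reproduces the slope found above.

```python
import sympy as sp

x, y = sp.symbols('x y')
z = x**2 + x*y + y**2

dz_dx = sp.diff(z, x)              # y is treated as a constant
print(dz_dx)                       # 2*x + y
print(dz_dx.subs({x: 1, y: 1}))    # 3, the slope at (1, 1)
```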

The function f can be reinterpreted as a family of functions of one variable indexed by the other variables:

$$f(x, y) = f_y(x) = x^2 + xy + y^2.$$

In other words, every value of $y$ defines a function, denoted $f_y$, which is a function of one variable $x$.[note 1] That is,

$$f_y(x) = x^2 + xy + y^2.$$

In this section the subscript notation $f_y$ denotes a function contingent on a fixed value of $y$, and not a partial derivative.

Once a value of $y$ is chosen, say $a$, then $f(x, y)$ determines a function $f_a$ which traces a curve $x^2 + ax + a^2$ on the $xz$-plane:

$$f_a(x) = x^2 + ax + a^2.$$

In this expression, $a$ is a constant, not a variable, so $f_a$ is a function of only one real variable, that being $x$. Consequently, the definition of the derivative for a function of one variable applies:

$$f_a'(x) = 2x + a.$$

The above procedure can be performed for any choice of a. Assembling the derivatives together into a function gives a function which describes the variation of f in the x direction:

$$\frac{\partial f}{\partial x}(x, y) = 2x + y.$$

This is the partial derivative of $f$ with respect to $x$. Here $\partial$ is a rounded d called the partial derivative symbol; to distinguish it from the letter $d$, $\partial$ is sometimes pronounced "partial".

Higher order partial derivatives

Second- and higher-order partial derivatives are defined analogously to the higher-order derivatives of univariate functions. For the function $f(x, y, \ldots)$ the "own" second partial derivative with respect to $x$ is simply the partial derivative of the partial derivative (both with respect to $x$):[5]: 316–318

$$\frac{\partial^2 f}{\partial x^2} \equiv \frac{\partial (\partial f/\partial x)}{\partial x} \equiv \frac{\partial f_x}{\partial x} \equiv f_{xx}.$$

The cross partial derivative with respect to x and y is obtained by taking the partial derivative of f with respect to x, and then taking the partial derivative of the result with respect to y, to obtain

$$\frac{\partial^2 f}{\partial y \, \partial x} \equiv \frac{\partial (\partial f/\partial x)}{\partial y} \equiv \frac{\partial f_x}{\partial y} \equiv f_{xy}.$$

Schwarz's theorem states that if the second derivatives are continuous, the expression for the cross partial derivative is unaffected by which variable the partial derivative is taken with respect to first and which is taken second. That is,

$$\frac{\partial^2 f}{\partial x \, \partial y} = \frac{\partial^2 f}{\partial y \, \partial x}$$

or equivalently $f_{yx} = f_{xy}$.

Own and cross partial derivatives appear in the Hessian matrix, which is used in the second-order conditions in optimization problems. Higher-order partial derivatives can be obtained by successive differentiation.
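
For instance, the Hessian of the example function used earlier collects its own and cross second partials; a brief SymPy sketch (illustrative, not part of the original text):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + x*y + y**2

H = sp.hessian(f, (x, y))   # matrix of all second-order partial derivatives
print(H)                    # Matrix([[2, 1], [1, 2]])
```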

Antiderivative analogue

There is a concept for partial derivatives that is analogous to antiderivatives for regular derivatives. Given a partial derivative, it allows for the partial recovery of the original function.

Consider the example of

$$\frac{\partial z}{\partial x} = 2x + y.$$

The "partial" integral can be taken with respect to x (treating y as constant, in a similar manner to partial differentiation):

$$z = \int \frac{\partial z}{\partial x} \, dx = x^2 + xy + g(y).$$

Here, the "constant" of integration is no longer a constant, but instead a function of all the variables of the original function except x. The reason for this is that all the other variables are treated as constant when taking the partial derivative, so any function which does not involve x {\displaystyle x} will disappear when taking the partial derivative, and we have to account for this when we take the antiderivative. The most general way to represent this is to have the "constant" represent an unknown function of all the other variables.

Thus the set of functions $x^2 + xy + g(y)$, where $g$ is any one-argument function, represents the entire set of functions in variables $x, y$ that could have produced the $x$-partial derivative $2x + y$.

If all the partial derivatives of a function are known (for example, with the gradient), then the antiderivatives can be matched via the above process to reconstruct the original function up to a constant. Unlike in the single-variable case, however, not every set of functions can be the set of all (first) partial derivatives of a single function. In other words, not every vector field is conservative.
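
Both points can be sketched with SymPy (the test fields are illustrative assumptions): partial integration recovers the function only up to a term $g(y)$, and the mixed-partial test $\partial P/\partial y = \partial Q/\partial x$ is a necessary condition for a planar field $(P, Q)$ to be a gradient.

```python
import sympy as sp

x, y = sp.symbols('x y')

# Partial integration of dz/dx = 2x + y with respect to x:
print(sp.integrate(2*x + y, x))          # x**2 + x*y  (plus an arbitrary g(y))

# Mixed-partial test for a gradient field (P, Q):
P, Q = 2*x + y, x + 2*y                  # the gradient of x**2 + x*y + y**2
print(sp.diff(P, y) == sp.diff(Q, x))    # True: this field can be a gradient

P2, Q2 = -y, x                           # a rotating field
print(sp.diff(P2, y) == sp.diff(Q2, x))  # False: not a gradient (not conservative)
```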

Applications

Geometry

[Figure: The volume of a cone depends on its height and radius.]

The volume V of a cone depends on the cone's height h and its radius r according to the formula

$$V(r, h) = \frac{\pi r^2 h}{3}.$$

The partial derivative of V with respect to r is

$$\frac{\partial V}{\partial r} = \frac{2\pi r h}{3},$$

which represents the rate with which a cone's volume changes if its radius is varied and its height is kept constant. The partial derivative with respect to $h$ equals $\frac{\pi r^2}{3}$, which represents the rate with which the volume changes if its height is varied and its radius is kept constant.

By contrast, the total derivatives of V with respect to r and h are respectively

$$\frac{dV}{dr} = \overbrace{\frac{2\pi r h}{3}}^{\frac{\partial V}{\partial r}} + \overbrace{\frac{\pi r^2}{3}}^{\frac{\partial V}{\partial h}} \frac{dh}{dr}$$

and

$$\frac{dV}{dh} = \overbrace{\frac{\pi r^2}{3}}^{\frac{\partial V}{\partial h}} + \overbrace{\frac{2\pi r h}{3}}^{\frac{\partial V}{\partial r}} \frac{dr}{dh}$$

The difference between the total and partial derivative is the elimination of indirect dependencies between variables in partial derivatives.

If (for some arbitrary reason) the cone's proportions have to stay the same, and the height and radius are in a fixed ratio k,

$$k = \frac{h}{r} = \frac{dh}{dr}.$$

This gives the total derivative with respect to r:

$$\frac{dV}{dr} = \frac{2\pi r h}{3} + \frac{\pi r^2}{3} k$$

which simplifies to:

$$\frac{dV}{dr} = k\pi r^2$$

Similarly, the total derivative with respect to h is:

$$\frac{dV}{dh} = \pi r^2$$

The total derivative with respect to both r and h of the volume, viewed as a scalar function of these two variables, is given by the gradient vector

$$\nabla V = \left(\frac{\partial V}{\partial r}, \frac{\partial V}{\partial h}\right) = \left(\frac{2}{3}\pi r h, \frac{1}{3}\pi r^2\right).$$
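
The cone computations above can be reproduced symbolically; this sketch (SymPy, with the fixed-ratio substitution $h = kr$ from the text) is purely illustrative.

```python
import sympy as sp

r, h, k = sp.symbols('r h k', positive=True)
V = sp.pi * r**2 * h / 3

print(sp.diff(V, r))   # dV/dr = 2*pi*r*h/3, with h held constant
print(sp.diff(V, h))   # dV/dh = pi*r**2/3, with r held constant

# Total derivative along the constraint h = k*r (fixed proportions):
V_fixed_ratio = V.subs(h, k * r)
print(sp.simplify(sp.diff(V_fixed_ratio, r)))   # pi*k*r**2
```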

Optimization

Partial derivatives appear in any calculus-based optimization problem with more than one choice variable. For example, in economics a firm may wish to maximize profit $\pi(x, y)$ with respect to the choice of the quantities $x$ and $y$ of two different types of output. The first-order conditions for this optimization are $\pi_x = 0 = \pi_y$. Since both partial derivatives $\pi_x$ and $\pi_y$ will generally themselves be functions of both arguments $x$ and $y$, these two first-order conditions form a system of two equations in two unknowns.
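
A minimal sketch of such first-order conditions, using a made-up quadratic profit function (the coefficients are illustrative assumptions, not from the article):

```python
import sympy as sp

x, y = sp.symbols('x y')
profit = 10*x + 8*y - x**2 - x*y - y**2    # hypothetical profit function

foc_x = sp.Eq(sp.diff(profit, x), 0)       # pi_x = 0
foc_y = sp.Eq(sp.diff(profit, y), 0)       # pi_y = 0
print(sp.solve((foc_x, foc_y), (x, y)))    # {x: 4, y: 2}
```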

Thermodynamics, quantum mechanics and mathematical physics

Partial derivatives appear in thermodynamic equations like the Gibbs–Duhem equation, in quantum mechanics as the Schrödinger wave equation, as well as in other equations from mathematical physics. Here the variables being held constant in partial derivatives can be ratios of simple variables, like the mole fractions $x_i$ in the following example involving the Gibbs energies in a ternary mixture system:

$$\bar{G}_2 = G + (1 - x_2) \left(\frac{\partial G}{\partial x_2}\right)_{\frac{x_1}{x_3}}$$

Mole fractions of a component can be expressed as functions of other components' mole fractions and binary mole ratios:

$$x_1 = \frac{1 - x_2}{1 + \frac{x_3}{x_1}}$$
$$x_3 = \frac{1 - x_2}{1 + \frac{x_1}{x_3}}$$

Differential quotients can be formed at constant ratios like those above:

$$\left(\frac{\partial x_1}{\partial x_2}\right)_{\frac{x_1}{x_3}} = -\frac{x_1}{1 - x_2}$$
$$\left(\frac{\partial x_3}{\partial x_2}\right)_{\frac{x_1}{x_3}} = -\frac{x_3}{1 - x_2}$$

Ratios X, Y, Z of mole fractions can be written for ternary and multicomponent systems:

$$X = \frac{x_3}{x_1 + x_3}$$
$$Y = \frac{x_3}{x_2 + x_3}$$
$$Z = \frac{x_2}{x_1 + x_2}$$

which can be used for solving partial differential equations like:

$$\left(\frac{\partial \mu_2}{\partial n_1}\right)_{n_2, n_3} = \left(\frac{\partial \mu_1}{\partial n_2}\right)_{n_1, n_3}$$

This equality can be rearranged to have differential quotient of mole fractions on one side.

Image resizing

Partial derivatives are key to target-aware image resizing algorithms. Widely known as seam carving, these algorithms require each pixel in an image to be assigned a numerical 'energy' describing its dissimilarity to the orthogonally adjacent pixels. The algorithm then progressively removes rows or columns with the lowest energy. The formula used to determine a pixel's energy (the magnitude of the gradient at that pixel) depends heavily on partial derivatives.
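
A minimal sketch of such an energy map, assuming a grayscale image stored as a 2-D NumPy array and using finite differences for the partial derivatives (an illustration of the idea, not any particular seam-carving implementation):

```python
import numpy as np

def energy_map(img):
    """Energy of each pixel = magnitude of the image gradient, with the
    partial derivatives approximated by finite differences."""
    img = img.astype(float)
    dy, dx = np.gradient(img)        # partials along rows (y) and columns (x)
    return np.sqrt(dx**2 + dy**2)

# Toy example: a small horizontal ramp with one bright pixel
img = np.tile(np.arange(6.0), (4, 1))
img[2, 3] += 10.0
print(energy_map(img).round(2))
```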

Economics

Partial derivatives play a prominent role in economics, in which most functions describing economic behaviour posit that the behaviour depends on more than one variable. For example, a societal consumption function may describe the amount spent on consumer goods as depending on both income and wealth; the marginal propensity to consume is then the partial derivative of the consumption function with respect to income.

See also

  • d'Alembertian operator
  • Chain rule
  • Curl (mathematics)
  • Divergence
  • Exterior derivative
  • Iterated integral
  • Jacobian matrix and determinant
  • Laplacian
  • Multivariable calculus
  • Symmetry of second derivatives
  • Triple product rule, also known as the cyclic chain rule.

Notes

  1. ^ This can also be expressed as the adjointness between the product space and function space constructions.

References

  1. ^ Miller, Jeff (2009-06-14). "Earliest Uses of Symbols of Calculus". Earliest Uses of Various Mathematical Symbols. Retrieved 2009-02-20.
  2. ^ Spivak, M. (1965). Calculus on Manifolds. New York: W. A. Benjamin, Inc. p. 44. ISBN 9780805390216.
  3. ^ Wrede, R.; Spiegel, M. R. (2010). Advanced Calculus (3rd ed.). Schaum's Outline Series. ISBN 978-0-07-162366-7.
  4. ^ The applicability extends to functions over spaces without a metric and to differentiable manifolds, such as in general relativity.
  5. ^ Chiang, Alpha C. (1984). Fundamental Methods of Mathematical Economics (3rd ed.). McGraw-Hill.

External links

  • "Partial derivative", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
  • Partial Derivatives at MathWorld


Source: https://en.wikipedia.org/wiki/Partial_derivative
