

FUNCTIONS OF SEVERAL VARIABLES


FABIZ I, Fall 2014

Luiza Bădin
Department of Applied Mathematics,
Bucharest University of Economic Studies


Limits. Continuity
So far, we have studied functions of one variable, typically written as $y = f(x)$, which represent the variation that occurs in some (dependent) variable $y$ as another (independent) variable $x$ changes.
In the real world, however, it is unusual to deal with functions that depend on a single variable; instead of $y = f(x)$, we often work with $y = f(x_1, x_2)$, $y = f(x_1, x_2, x_3)$, or the general multivariate case $y = f(x_1, x_2, \dots, x_n)$.
Economic models are usually functions of more than one variable, assuming for
instance that output, Q = f (L, K), is a function of two inputs, labor and capital.
In order to understand the concepts of limit and continuity in the general, multivariate case, we have to start with the idea of "closeness" in the $n$-dimensional space.
In the univariate case, we measure the closeness of two arbitrary points by the length of the segment joining them.
In the $n$-dimensional space, the distance we use is called the Euclidean distance.

Consider the set
$$\mathbb{R}^n = \{x = (x_1, x_2, \dots, x_n) \mid x_i \in \mathbb{R},\ i = 1, \dots, n\} = \underbrace{\mathbb{R} \times \mathbb{R} \times \dots \times \mathbb{R}}_{n \text{ times}}.$$
Definition 1. A function $f : A \subseteq \mathbb{R}^n \to \mathbb{R}$, $f(x) = f(x_1, x_2, \dots, x_n)$, is called a real function of $n$ variables.
Definition 2. A function $d : \mathbb{R}^n \times \mathbb{R}^n \to [0, \infty)$ is said to be a distance if the following properties hold:
1. $d(x,y) \ge 0$, $\forall x, y \in \mathbb{R}^n$, and $d(x,y) = 0 \iff x = y$;
2. $d(x,y) = d(y,x)$, $\forall x, y \in \mathbb{R}^n$;
3. $d(x,z) \le d(x,y) + d(y,z)$, $\forall x, y, z \in \mathbb{R}^n$.


Example 1. The function $d : \mathbb{R}^n \times \mathbb{R}^n \to [0, \infty)$ defined by
$$d(x,y) = \sqrt{\sum_{i=1}^n (x_i - y_i)^2}$$
for $x = (x_1, x_2, \dots, x_n)$ and $y = (y_1, y_2, \dots, y_n)$ is a distance, called the Euclidean distance.
For $n = 1$ we have $d(x,y) = |x_1 - y_1|$, where $x = x_1$ and $y = y_1$.
For $n = 2$ we have $x = (x_1, x_2)$, $y = (y_1, y_2)$ and $d(x,y) = \sqrt{(x_1 - y_1)^2 + (x_2 - y_2)^2}$.
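The Euclidean distance above can be sketched in a few lines of Python (a minimal illustration; the function name is ours):

```python
import math

def euclidean_distance(x, y):
    """Euclidean distance between two points of R^n, given as equal-length sequences."""
    if len(x) != len(y):
        raise ValueError("points must have the same dimension")
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

# n = 1 reduces to the absolute difference, n = 2 to the planar formula:
print(euclidean_distance([1.0], [4.0]))            # 3.0
print(euclidean_distance([0.0, 0.0], [3.0, 4.0]))  # 5.0
```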


Definition 3. Consider $x_0 \in \mathbb{R}^n$ and $r > 0$. The set $S_r(x_0) = \{x \in \mathbb{R}^n \mid d(x_0, x) < r\}$ is the open sphere centered at $x_0$ with radius $r$.
The point $x_0 \in \mathbb{R}^n$ is an interior point of the set $A \subseteq \mathbb{R}^n$ if and only if there exists $r > 0$ such that $S_r(x_0) \subseteq A$.
An $n$-dimensional interval is $I_1 \times I_2 \times \dots \times I_n = \{(x_1, x_2, \dots, x_n) \mid x_k \in I_k,\ k = 1, \dots, n\}$, where $I_k = (a_k, b_k)$, $k = 1, \dots, n$.
Any open sphere centered at $x_0$ contains an $n$-dimensional interval which includes $x_0$, and conversely.
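As a quick numerical sketch of the last remark in $\mathbb{R}^2$: the open sphere $S_r(0)$ lies inside the box $(-r, r) \times (-r, r)$, and the box $(-r/\sqrt{2}, r/\sqrt{2})^2$ lies inside $S_r(0)$. A random-sampling check (an illustration, not a proof):

```python
import math, random

def in_open_ball(x, x0, r):
    # Euclidean distance test: x lies in S_r(x0) iff d(x0, x) < r.
    return math.dist(x0, x) < r

random.seed(0)
r = 1.0
for _ in range(1000):
    p = (random.uniform(-r, r), random.uniform(-r, r))
    # ball => box: every point of S_r(0) has both coordinates in (-r, r)
    if in_open_ball(p, (0.0, 0.0), r):
        assert abs(p[0]) < r and abs(p[1]) < r
    # box => ball: the smaller box (-r/sqrt(2), r/sqrt(2))^2 sits inside S_r(0)
    if max(abs(p[0]), abs(p[1])) < r / math.sqrt(2):
        assert in_open_ball(p, (0.0, 0.0), r)
print("ball/box inclusions verified on 1000 samples")
```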


For simplicity, all the results are presented for $n = 2$.
Consider $A \subseteq \mathbb{R}^2$, $f : A \to \mathbb{R}$ and $(a,b)$ an interior point of $A$.
Definition 4. (Limit) $\lim_{(x,y) \to (a,b)} f(x,y) = l \in \mathbb{R}$ if for every sequence $(x_n, y_n)_{n \ge 1} \subseteq A$ with $(x_n, y_n) \to (a,b)$ and $(x_n, y_n) \ne (a,b)$, $\forall n \ge 1$, we have $\lim_{n \to \infty} f(x_n, y_n) = l$.
Equivalently, if $l \in \mathbb{R}$, $\lim_{(x,y) \to (a,b)} f(x,y) = l$ if for every $\varepsilon > 0$ there exists $\delta = \delta(\varepsilon) > 0$ such that for every $(x,y) \in A$ with $|x - a| < \delta$, $|y - b| < \delta$, we have $|f(x,y) - l| < \varepsilon$.
Definition 5. (Continuity) A function $f$ is continuous at $(a,b)$ if the limit $\lim_{(x,y) \to (a,b)} f(x,y)$ exists and is equal to $f(a,b)$:
$$\lim_{(x,y) \to (a,b)} f(x,y) = f(a,b).$$


Examples
1. Show that the function
$$f(x,y) = \begin{cases} \dfrac{2xy}{x^2 + y^2}, & (x,y) \ne (0,0) \\[4pt] 0, & (x,y) = (0,0) \end{cases} \qquad (1)$$
is continuous at the origin.
2. Show that the function
$$f(x,y) = \begin{cases} \dfrac{x^2 - y^2}{x^2 + y^2}, & (x,y) \ne (0,0) \\[4pt] 0, & (x,y) = (0,0) \end{cases} \qquad (2)$$
is not continuous at the origin.
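For function (2), a quick numerical illustration (not a proof) shows why the limit at the origin cannot exist: sequences approaching $(0,0)$ along the two coordinate axes produce different limits.

```python
def f2(x, y):
    # Function (2): (x^2 - y^2) / (x^2 + y^2) for (x, y) != (0, 0), and 0 at the origin.
    if (x, y) == (0.0, 0.0):
        return 0.0
    return (x * x - y * y) / (x * x + y * y)

# Approach the origin along the x-axis (y = 0) and along the y-axis (x = 0):
along_x = [f2(10.0 ** -k, 0.0) for k in range(1, 6)]  # constant  1.0
along_y = [f2(0.0, 10.0 ** -k) for k in range(1, 6)]  # constant -1.0
print(along_x[0], along_y[0], f2(0.0, 0.0))
```

Since the two sequential limits differ ($1$ versus $-1$), no common limit $l$ exists, so $f$ cannot be continuous at $(0,0)$.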


Partial Derivatives
Consider a set $A \subseteq \mathbb{R}^2$, $f : A \to \mathbb{R}$ and $(a,b)$ an interior point of $A$.
Definition 6. If $\lim_{x \to a} \dfrac{f(x,b) - f(a,b)}{x - a}$ exists and is finite, we say that $f$ admits a partial derivative with respect to $x$ at the point $(a,b)$, and we write
$$f'_x(a,b) = \frac{\partial f}{\partial x}(a,b) = \lim_{x \to a} \frac{f(x,b) - f(a,b)}{x - a}.$$
Definition 7. If $\lim_{y \to b} \dfrac{f(a,y) - f(a,b)}{y - b}$ exists and is finite, we say that $f$ admits a partial derivative with respect to $y$ at the point $(a,b)$, and we write
$$f'_y(a,b) = \frac{\partial f}{\partial y}(a,b) = \lim_{y \to b} \frac{f(a,y) - f(a,b)}{y - b}.$$
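The difference quotients in Definitions 6 and 7 can be approximated numerically; the sketch below uses a hypothetical step size $h$ and a made-up test function:

```python
def partial_x(f, a, b, h=1e-6):
    # One-sided difference quotient from Definition 6: [f(a + h, b) - f(a, b)] / h.
    return (f(a + h, b) - f(a, b)) / h

def partial_y(f, a, b, h=1e-6):
    # One-sided difference quotient from Definition 7: [f(a, b + h) - f(a, b)] / h.
    return (f(a, b + h) - f(a, b)) / h

f = lambda x, y: x ** 2 * y + 3 * y   # f'_x = 2xy, f'_y = x^2 + 3
print(partial_x(f, 2.0, 1.0))  # close to 4
print(partial_y(f, 2.0, 1.0))  # close to 7
```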


Multivariate case ($n \ge 2$)
If $A \subseteq \mathbb{R}^n$ and $a = (a_1, a_2, \dots, a_n)$ is an interior point of $A$, then for every $i = 1, \dots, n$
$$f'_{x_i}(a_1, \dots, a_n) = \frac{\partial f}{\partial x_i}(a_1, a_2, \dots, a_n) = \lim_{x_i \to a_i} \frac{f(a_1, \dots, x_i, \dots, a_n) - f(a_1, \dots, a_i, \dots, a_n)}{x_i - a_i}.$$
If $f'_{x_i}$, as an $n$-variable function of $(x_1, x_2, \dots, x_n)$, admits a first-order partial derivative with respect to $x_j$ at some point $(a_1, a_2, \dots, a_n) \in \mathbb{R}^n$, then
$$f''_{x_i x_j}(a_1, a_2, \dots, a_n) = (f'_{x_i})'_{x_j}(a_1, a_2, \dots, a_n) = \frac{\partial^2 f}{\partial x_j \partial x_i}(a_1, a_2, \dots, a_n)$$
is the second-order partial derivative of the function $f$ calculated at $(a_1, a_2, \dots, a_n)$.
For a two-variable function $f : A \to \mathbb{R}$ with $A \subseteq \mathbb{R}^2$, if the functions $f'_x, f'_y : A \to \mathbb{R}$ are defined at any point of $A$ and they also admit partial derivatives with respect to $x$ and $y$, then their partial derivatives are the second-order partial derivatives, and the following notations apply: $f''_{x^2} = (f'_x)'_x$, $f''_{xy} = (f'_x)'_y$, $f''_{yx} = (f'_y)'_x$, $f''_{y^2} = (f'_y)'_y$.

Examples
Example 2. Find $f''_{x^2}$, $f''_{y^2}$, $f''_{xy}$, $f''_{yx}$ for the following two-variable functions:
1. $f(x,y) = x^3 + 2xy^2 - \dfrac{x}{y}$, $y \ne 0$;
2. $f(x,y) = \ln(1 + x^2 + 2y^2)$.
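For item 2, the equality of the mixed partials $f''_{xy} = f''_{yx}$ (guaranteed by the Schwarz criterion below, since all second-order partials here are continuous) can be checked numerically with nested central differences. A sketch with ad hoc step sizes, not a proof; the analytic value at $(1,1)$ is $f''_{xy} = -8xy/(1 + x^2 + 2y^2)^2 = -0.5$:

```python
import math

def f(x, y):
    return math.log(1 + x * x + 2 * y * y)

def d(g, i, p, h=1e-5):
    # Central difference of g in coordinate i at point p.
    q_plus, q_minus = list(p), list(p)
    q_plus[i] += h
    q_minus[i] -= h
    return (g(*q_plus) - g(*q_minus)) / (2 * h)

fxy = d(lambda x, y: d(f, 0, (x, y)), 1, (1.0, 1.0))  # (f'_x)'_y at (1, 1)
fyx = d(lambda x, y: d(f, 1, (x, y)), 0, (1.0, 1.0))  # (f'_y)'_x at (1, 1)
print(fxy, fyx)  # both close to -0.5
```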


Differentiability
A sufficient condition for the mixed second-order partial derivatives of a two-variable function to be equal is provided by the next theorem.
Theorem 1. (Schwarz Criterion) If $f : A \subseteq \mathbb{R}^2 \to \mathbb{R}$ has second-order partial derivatives at any point of an open sphere centered at $(a,b)$, $S_r(a,b) \subseteq A$, and these are continuous at $(a,b)$, then the mixed second-order partial derivatives are equal:
$$f''_{xy}(a,b) = f''_{yx}(a,b).$$
Definition 8. (Differentiability) Consider $A \subseteq \mathbb{R}^2$, $f : A \to \mathbb{R}$, and $(a,b)$ an interior point of $A$. The function $f$ is differentiable at $(a,b)$ if there exist constants $\lambda, \mu \in \mathbb{R}$ and a function $\omega : A \to \mathbb{R}$, continuous at $(a,b)$ with $\omega(a,b) = 0$, such that
$$f(x,y) - f(a,b) = \lambda(x - a) + \mu(y - b) + \omega(x,y)\,\rho(x,y), \quad \forall (x,y) \in A,$$
where $\rho(x,y) = \sqrt{(x - a)^2 + (y - b)^2}$.

Differentiability, partial derivatives and continuity


The next results establish the connection between differentiability, partial derivatives, and continuity.
Theorem 2. If $A \subseteq \mathbb{R}^2$ and $f : A \to \mathbb{R}$ is differentiable at $(a,b) \in A$, then $f$ admits first-order partial derivatives with respect to $x$ and $y$ at $(a,b)$, and $f'_x(a,b) = \lambda$, $f'_y(a,b) = \mu$.
Proof. Consider $y = b$, $x \ne a$ such that $(x,b) \in A$. As $f$ is differentiable at $(a,b)$ we have
$$f(x,b) - f(a,b) = \lambda(x - a) + \omega(x,b)\,\rho(x,b),$$
so
$$\frac{f(x,b) - f(a,b)}{x - a} = \lambda + \omega(x,b)\,\frac{|x - a|}{x - a}.$$
Since $\lim_{x \to a,\, y \to b} \omega(x,y) = 0$, then
$$\lim_{x \to a} \frac{f(x,b) - f(a,b)}{x - a} = \lambda + \lim_{x \to a} \omega(x,b)\,\frac{|x - a|}{x - a} = \lambda \implies f'_x(a,b) = \lambda.$$
Similarly, $f'_y(a,b) = \mu$.



Therefore, if the two-variable function $f$ is differentiable at $(a,b)$, then
$$f(x,y) - f(a,b) = f'_x(a,b)(x - a) + f'_y(a,b)(y - b) + \omega(x,y)\,\rho(x,y).$$


Theorem 3. If $f : A \subseteq \mathbb{R}^2 \to \mathbb{R}$ is differentiable at $(a,b) \in A$, then $f$ is continuous at $(a,b)$.
Proof. If $f$ is differentiable at $(a,b)$, then there exists a two-variable function $\omega : A \to \mathbb{R}$, continuous at $(a,b)$ with $\omega(a,b) = 0$, such that
$$f(x,y) - f(a,b) = f'_x(a,b)(x - a) + f'_y(a,b)(y - b) + \omega(x,y)\,\rho(x,y).$$
Since $\omega : A \to \mathbb{R}$ is continuous at $(a,b)$ with $\omega(a,b) = 0$, we have $\lim_{x \to a,\, y \to b} \omega(x,y) = 0$. Moreover, $\lim_{x \to a,\, y \to b} \rho(x,y) = \lim_{x \to a,\, y \to b} \sqrt{(x - a)^2 + (y - b)^2} = 0$, and so
$$\lim_{x \to a,\, y \to b} [f(x,y) - f(a,b)] = \lim_{x \to a,\, y \to b} \big[ f'_x(a,b)(x - a) + f'_y(a,b)(y - b) + \omega(x,y)\,\rho(x,y) \big] = 0,$$
that is, $\lim_{x \to a,\, y \to b} f(x,y) = f(a,b)$, which is exactly the continuity of the function $f$ at the point $(a,b)$.
The next theorem is stated without proof.

Theorem 4. If the first-order partial derivatives $f'_x, f'_y$ of $f : A \subseteq \mathbb{R}^2 \to \mathbb{R}$ are defined at any point of an open sphere centered at $(a,b)$, $S_r(a,b) \subseteq A$, and they are continuous at $(a,b)$, then $f$ is differentiable at $(a,b)$.


The total differential


Definition 9. Consider $f : A \subseteq \mathbb{R}^2 \to \mathbb{R}$ and $(a,b)$ an interior point of $A$ such that $f$ is differentiable at $(a,b)$. Then the total differential of the function $f$ at the point $(a,b)$, denoted by $df(x,y;a,b)$ or $df_{(a,b)}(x,y)$, is the two-variable function defined by
$$df_{(a,b)}(x,y) = f'_x(a,b)(x - a) + f'_y(a,b)(y - b).$$
Remark 1. Consider the two-variable functions $\varphi, \psi : \mathbb{R}^2 \to \mathbb{R}$, $\varphi(x,y) = x$, $\psi(x,y) = y$, which are differentiable on $\mathbb{R}^2$ with $\varphi'_x(x,y) = 1$, $\psi'_y(x,y) = 1$, $\varphi'_y(x,y) = 0$, $\psi'_x(x,y) = 0$, and so
$$d\varphi_{(a,b)}(x,y) = x - a \overset{\text{notation}}{=} dx \quad \text{and} \quad d\psi_{(a,b)}(x,y) = y - b \overset{\text{notation}}{=} dy.$$
Therefore, if $f$ is an arbitrary differentiable function,
$$df_{(a,b)}(x,y) = f'_x(a,b)\,dx + f'_y(a,b)\,dy, \quad \text{or} \quad df_{(a,b)} = f'_x(a,b)\,dx + f'_y(a,b)\,dy.$$
The total differential of the function $f$ approximates the variation of $f$ around $(a,b)$.
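A small numerical sketch of this last remark, using the hand-computed partials of an illustrative function $f(x,y) = x^2 y$ at $(a,b) = (1,2)$:

```python
def f(x, y):
    return x * x * y

# Hand-computed partials of f(x, y) = x^2 y at (a, b) = (1, 2):
a, b = 1.0, 2.0
fx, fy = 2 * a * b, a * a          # f'_x = 2xy = 4, f'_y = x^2 = 1

x, y = 1.01, 2.02
# f(a, b) + df_(a,b)(x, y) approximates f(x, y) near (a, b):
approx = f(a, b) + fx * (x - a) + fy * (y - b)
print(approx, f(x, y))   # about 2.06 versus the exact 2.060602
```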

The second-order total differential of the function $f$ at the point $(a,b)$ is
$$d^2 f_{(a,b)}(x,y) = d(df)_{(a,b)}(x,y) = f''_{x^2}(a,b)\,dx^2 + 2f''_{xy}(a,b)\,dx\,dy + f''_{y^2}(a,b)\,dy^2.$$


Examples
1. Find the first- and second-order partial derivatives of the following functions:
(a) $f : A = \{(x,y) \in \mathbb{R}^2 \mid y \ne 0\} \to \mathbb{R}$, $f(x,y) = xy + \dfrac{x}{y}$;
(b) $f : A = \mathbb{R}^2 \setminus \{(0,0)\} \to \mathbb{R}$, $f(x,y) = \dfrac{x}{x^2 + y^2}$;
(c) $f(x,y) = x^3 + y^3 + 3xy$;
(d) $f(x,y) = x^3 + 3xy^2 - 12y - 15x$;
(e) $f(x,y) = xy + \dfrac{50}{x} + \dfrac{20}{y} - 3$, $x \ne 0$, $y \ne 0$;
(f) $f(x,y,z) = 2x^2 + 2y^2 + 2(xy + yz + x + y + 3z)$;
(g) $f(x,y,z) = x^2 + y^2 + z^2 - xy + x - 2z$.
2. Prove that $f''_{xy}(0,0) \ne f''_{yx}(0,0)$ for the function $f : \mathbb{R}^2 \to \mathbb{R}$,
$$f(x,y) = \begin{cases} xy\,\dfrac{x^2 - y^2}{x^2 + y^2}, & (x,y) \ne (0,0) \\[4pt] 0, & (x,y) = (0,0) \end{cases} \qquad (3)$$
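A numerical illustration (not a proof) of problem 2: the two mixed second-order partial derivatives of function (3) at the origin can be approximated by nested difference quotients, with the inner step much smaller than the outer one (the step sizes below are ad hoc):

```python
def f(x, y):
    # Function (3): xy (x^2 - y^2)/(x^2 + y^2) for (x, y) != (0, 0), and 0 at the origin.
    if (x, y) == (0.0, 0.0):
        return 0.0
    return x * y * (x * x - y * y) / (x * x + y * y)

h, k = 1e-7, 1e-3   # inner step h much smaller than outer step k

def fx(x, y):       # central-difference approximation of f'_x
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def fy(x, y):       # central-difference approximation of f'_y
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

fxy = (fx(0.0, k) - fx(0.0, -k)) / (2 * k)   # (f'_x)'_y (0, 0): about -1
fyx = (fy(k, 0.0) - fy(-k, 0.0)) / (2 * k)   # (f'_y)'_x (0, 0): about +1
print(fxy, fyx)
```

The mixed partials genuinely differ here because the second-order partials of (3) are not continuous at the origin, so the Schwarz criterion does not apply.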

3. Using the definition, prove that the function $f : \mathbb{R}^2 \to \mathbb{R}$, $f(x,y) = 3x + y^2$, is differentiable at $(2,1)$.
4. Is the function $f : \mathbb{R}^2 \to \mathbb{R}$
(a) $f(x,y) = x^3 + xy + y^3$ differentiable at $(1,1)$?
(b) $f(x,y) = \sqrt{x^2 + y^2}$ differentiable at $(0,0)$?
(c) $f(x,y) = \begin{cases} xy\,\dfrac{x^2 - y^2}{x^2 + y^2}, & (x,y) \ne (0,0) \\[4pt] 0, & (x,y) = (0,0) \end{cases}$ differentiable at $(0,0)$?
5. Consider the function $f : \mathbb{R} \times \mathbb{R} \to \mathbb{R}$, $f(x,y) = (x-2)^{46}(y-3)^{44}$. Then
$$\frac{\partial^{82} f}{\partial x^{42}\,\partial y^{40}}(26, 27)$$
is equal to: a) 1650; b) 1560; c) 1272; d) 1722; e) 1982; f) 1892; g) 2700; h) 2070; i) 2256; j) 2526; k) 2450; l) 2540; m) none of the previous.

Optimization: Finding maxima and minima


Consider $f : A \subseteq \mathbb{R}^2 \to \mathbb{R}$ and $(a,b)$ an interior point of $A$. Assume that $f$ is $n$-times differentiable at $(a,b)$ and that the mixed partial derivatives are equal.
Definition 10. Consider $f : A \subseteq \mathbb{R}^2 \to \mathbb{R}$ and $(a,b) \in A$. The point $(a,b)$ is a local maximum (minimum) for $f$ if there exists $r > 0$ such that $S_r(a,b) \subseteq A$ and for every $(x,y) \in S_r(a,b)$ we have $f(a,b) \ge f(x,y)$ (respectively $f(a,b) \le f(x,y)$).
If $(a,b)$ is a local maximum or a local minimum, then $(a,b)$ is a local extreme point.
In other words, a point is a local maximum if there are no nearby points at which $f$ takes a larger value. If we want to emphasize that a point $(a,b)$ is a maximum of $f$ on the whole domain $A$, not just a local maximum, we call $(a,b)$ a global (or absolute) maximum of $f$ on $A$.


Extreme points and partial derivatives


Proposition 1. If $(a,b) \in A \subseteq \mathbb{R}^2$ is a local extreme point for the function $f : A \to \mathbb{R}$, and if there exists $r > 0$ such that the partial derivatives $f'_x, f'_y$ exist and are defined at any $(x,y) \in S_r(a,b) \subseteq A$, then $f'_x(a,b) = 0$ and $f'_y(a,b) = 0$.
Proof. Let $(x,b) \in S_r(a,b)$ and consider the function $\varphi(x) = f(x,b)$. As $(a,b)$ is a local extreme point of $f$, it follows that $x = a$ is a local extreme point for $\varphi$. Because $\varphi'(a)$ exists, by Fermat's theorem we have $\varphi'(a) = 0$, and so
$$f'_x(a,b) = \lim_{x \to a} \frac{f(x,b) - f(a,b)}{x - a} = \varphi'(a) = 0.$$
Similarly, $f'_y(a,b) = 0$.


Stationary points and saddle points


Definition 11. An interior point $(a,b)$ of $A$ is called a stationary point of $f$ if $f'_x(a,b) = 0$ and $f'_y(a,b) = 0$.
Any local extreme point $(a,b)$ in the interior of $A$ is a stationary point of $f(x,y)$. The converse is not true: there are stationary points that are not extreme points.
Definition 12. Stationary points that are not extreme points are called saddle points.


Finding Extremes
Theorem 5. Consider a subset $A \subseteq \mathbb{R}^2$, $f : A \to \mathbb{R}$ and $(a,b)$ a stationary point of the function $f$. Assume that there exists $r > 0$ such that the second-order derivatives $f''_{x^2}, f''_{y^2}, f''_{xy}, f''_{yx}$ are continuous on $S_r(a,b)$. Let $H(a,b) = (f''_{x_i x_j}(a,b))_{i,j=1,2}$ be the Hessian matrix, and let $\Delta_1(a,b) = f''_{x^2}(a,b)$, $\Delta_2(a,b) = \det H(a,b)$. Then:
if $\Delta_2(a,b) > 0$, $(a,b)$ is a local extreme point:
if $\Delta_1(a,b) > 0$, then $(a,b)$ is a local minimum;
if $\Delta_1(a,b) < 0$, then $(a,b)$ is a local maximum;
if $\Delta_2(a,b) < 0$, then $(a,b)$ is not an extreme point; it is a saddle point.


1. If $\Delta_2(a,b) = 0$ we can conclude nothing, and the investigation has to be continued some other way; for instance, we might check the sign of $f(x,y) - f(a,b)$ on $S_r(a,b)$.
2. We note that when $\Delta_2(a,b) > 0$, the second-order partial derivatives $f''_{x^2}(a,b)$ and $f''_{y^2}(a,b)$ have the same sign, since $f''_{x^2}(a,b)\,f''_{y^2}(a,b) > 0$, so we could just as well check whether $f''_{y^2}$ is positive or negative if that were easier.
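The classification in Theorem 5 is easy to sketch in code. The example below uses hand-computed derivatives of $f(x,y) = x^3 + y^3 + 3xy$ (one of the later exercises): solving $f'_x = 3x^2 + 3y = 0$ and $f'_y = 3y^2 + 3x = 0$ gives the stationary points $(0,0)$ and $(-1,-1)$.

```python
def classify(a, b):
    # Hand-computed second-order partials of f(x, y) = x^3 + y^3 + 3xy:
    # f''_xx = 6x, f''_yy = 6y, f''_xy = 3.
    fxx, fyy, fxy = 6 * a, 6 * b, 3
    d1 = fxx                      # Delta_1
    d2 = fxx * fyy - fxy * fxy    # Delta_2 = det H(a, b)
    if d2 < 0:
        return "saddle point"
    if d2 > 0:
        return "local minimum" if d1 > 0 else "local maximum"
    return "inconclusive"

print(classify(0, 0))     # saddle point  (Delta_2 = -9 < 0)
print(classify(-1, -1))   # local maximum (Delta_2 = 27 > 0, Delta_1 = -6 < 0)
```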


Multivariate case ($n \ge 2$)
Consider $A \subseteq \mathbb{R}^n$, $f : A \to \mathbb{R}$, and $a = (a_1, a_2, \dots, a_n) \in A$ a stationary point of the function $f$ such that its second-order partial derivatives are continuous on an open sphere $S_r(a)$. Then the Hessian matrix associated to $f$ at $a \in A$ is
$$H(a) = (f''_{x_i x_j}(a))_{i,j=1,\dots,n}.$$
Consider the following determinants:
$$\Delta_1(a) = f''_{x_1^2}(a), \qquad \Delta_2(a) = \begin{vmatrix} f''_{x_1^2}(a) & f''_{x_1 x_2}(a) \\ f''_{x_2 x_1}(a) & f''_{x_2^2}(a) \end{vmatrix} = f''_{x_1^2}(a)\,f''_{x_2^2}(a) - f''_{x_1 x_2}(a)\,f''_{x_2 x_1}(a),$$
$$\dots, \qquad \Delta_n(a) = \det H(a),$$
and assume $\Delta_i(a) \ne 0$, $i = 1, \dots, n$.

The matrix $H(a)$ is called positive definite if $\Delta_1(a) > 0, \Delta_2(a) > 0, \dots, \Delta_n(a) > 0$, and negative definite if $\Delta_1(a) < 0, \Delta_2(a) > 0, \dots, (-1)^n \Delta_n(a) > 0$.
Then, if $H(a)$ is
positive definite, then $x = a$ is a local minimum;
negative definite, then $x = a$ is a local maximum;
indefinite (neither positive nor negative definite), then $x = a$ is a saddle point.


Examples
Find the local extreme points of the following functions:
Example 3. $f : \mathbb{R}^2 \to \mathbb{R}$, $f(x,y) = x^2 + y^2$

Figure 1: $f(x,y) = x^2 + y^2$ (3D surface plot, $z = f(x,y)$, omitted)

Example 4. $f : \mathbb{R}^2 \to \mathbb{R}$, $f(x,y) = x^2 - y^2$

Figure 2: $f(x,y) = x^2 - y^2$ (3D surface plot, $z = f(x,y)$, omitted)

Example 5. $f : \mathbb{R}^2 \to \mathbb{R}$, $f(x,y) = x\,e^{-(x^2 + y^2)}$

Figure 3: $f(x,y) = x\,e^{-(x^2 + y^2)}$ (3D surface plot, $z = f(x,y)$, omitted)



Example 6. $f : \mathbb{R}^2 \to \mathbb{R}$, $f(x,y) = xy\,e^{-(x^2 + y^2)}$

Figure 4: $f(x,y) = xy\,e^{-(x^2 + y^2)}$ (3D surface plot, $z = f(x,y)$, omitted)



Examples
Find the local extreme points of the following functions:
1. $f : \mathbb{R}^2 \to \mathbb{R}$, $f(x,y) = x^3 + y^3 + 3xy$;
2. $f : \mathbb{R}^2 \to \mathbb{R}$, $f(x,y) = x^3 - y^2 - 4x$;
3. $f : \mathbb{R}^2 \to \mathbb{R}$, $f(x,y) = x^3 + 3xy^2 - 12y - 15x$;
4. $f(x,y) = xy + \dfrac{50}{x} + \dfrac{20}{y} - 3$, $x \ne 0$, $y \ne 0$;
5. $f : \mathbb{R}^3 \to \mathbb{R}$, $f(x,y,z) = 2x^2 + 2y^2 + 2(xy + yz + x + y + 3z)$;
6. $f : \mathbb{R}^3 \to \mathbb{R}$, $f(x,y,z) = x^2 + y^2 + z^2 - xy + x - 2z$.


Least Squares Method


The Least Squares Method (LSM) was first described by Gauss around 1794; its most important application is data fitting. The best fit in the least-squares sense minimizes the sum of squared residuals, a residual being the difference between an observed value and the value fitted by a given model.
Least squares problems fall into two categories, linear (ordinary) least squares and non-linear least squares, depending on whether or not the residuals are linear in all unknowns.
Researchers studying experimental data are often interested in discovering whether the variables under study are linearly related, or in finding the linear approximation which best fits the data points according to some specific criterion. This may help detect any possible underlying patterns and also predict future values.

Suppose the data points are $(x_1, y_1), \dots, (x_n, y_n)$, $n \ge 3$. For any given line $y = ax + b$, we can measure the vertical distance from each of these points to the line, with fitted value $y_i' = ax_i + b$, by
$$d_i^2 = (y_i - y_i')^2 = (y_i - (ax_i + b))^2.$$
The line which minimizes the sum of squared residuals
$$S(a,b) = \sum_{i=1}^n [y_i - (ax_i + b)]^2$$
is called the least squares line.


The corresponding method is called the least squares method or ordinary least squares (OLS), and it occurs in linear regression analysis.
The values of $a$ and $b$ that minimize $S(a,b)$ are usually called least squares approximations (estimators) and, under specific assumptions on the data generating process, they have important statistical properties.

Setting the partial derivatives of $S$ to zero,
$$S'_a(a,b) = -2\sum_{i=1}^n [y_i - (ax_i + b)]\,x_i = 0, \qquad S'_b(a,b) = -2\sum_{i=1}^n [y_i - (ax_i + b)] = 0, \qquad (4)$$
leads to the system
$$a\sum_{i=1}^n x_i^2 + b\sum_{i=1}^n x_i = \sum_{i=1}^n x_i y_i, \qquad a\sum_{i=1}^n x_i + nb = \sum_{i=1}^n y_i. \qquad (5)$$
The equation system (5) is called the Gauss normal equations system, and it can be proved that it has a unique solution, which is the global minimum point of the sum of squared residuals $S(a,b)$.

Solving (5) by Cramer's rule, the relevant determinants are
$$\Delta = \begin{vmatrix} \sum_{i=1}^n x_i^2 & \sum_{i=1}^n x_i \\ \sum_{i=1}^n x_i & n \end{vmatrix} = n\sum_{i=1}^n x_i^2 - \Big(\sum_{i=1}^n x_i\Big)^2 \ne 0,$$
$$\Delta_a = \begin{vmatrix} \sum_{i=1}^n x_i y_i & \sum_{i=1}^n x_i \\ \sum_{i=1}^n y_i & n \end{vmatrix} = n\sum_{i=1}^n x_i y_i - \sum_{i=1}^n x_i \sum_{i=1}^n y_i,$$
$$\Delta_b = \begin{vmatrix} \sum_{i=1}^n x_i^2 & \sum_{i=1}^n x_i y_i \\ \sum_{i=1}^n x_i & \sum_{i=1}^n y_i \end{vmatrix} = \sum_{i=1}^n x_i^2 \sum_{i=1}^n y_i - \sum_{i=1}^n x_i \sum_{i=1}^n x_i y_i,$$
so that
$$a = \frac{\Delta_a}{\Delta} = \frac{n\sum_{i=1}^n x_i y_i - \sum_{i=1}^n x_i \sum_{i=1}^n y_i}{n\sum_{i=1}^n x_i^2 - \big(\sum_{i=1}^n x_i\big)^2}, \qquad (6)$$
$$b = \frac{\Delta_b}{\Delta} = \frac{\sum_{i=1}^n x_i^2 \sum_{i=1}^n y_i - \sum_{i=1}^n x_i \sum_{i=1}^n x_i y_i}{n\sum_{i=1}^n x_i^2 - \big(\sum_{i=1}^n x_i\big)^2}. \qquad (7)$$


Example 7. Find the line which best fits the data points: $(0,4)$, $(3,3)$, $(4,2)$, $(3,1)$, $(5,0)$.
Answer: $5x + 7y = 29$.
Example 8. Consider the following time series corresponding to the monthly sales of some company:

t    | 1  | 2  | 3  | 4  | 5  | 6  | 7  | 8  | 9  | 10
y(t) | 10 | 12 | 12 | 12 | 14 | 15 | 15 | 15 | 17 | 18

i) Using the Least Squares Method, find the parameters $a$ and $b$ such that the equation $y(t) = a + bt$ provides the best linear fit for the given data.
ii) Using the result of (i), predict the sales for November ($t = 11$) and December ($t = 12$).
Answer: $a = 9.6$ and $b = 0.8$.
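Formulas (6)-(7) are straightforward to implement. The sketch below applies them to the data of Example 8, assuming the time index runs $t = 1, \dots, 10$; note that in Example 8 the intercept is $a$ and the slope is $b$, matching $y(t) = a + bt$.

```python
# Ordinary least squares via the normal equations (6)-(7) for y(t) = a + b t.
t = list(range(1, 11))
y = [10, 12, 12, 12, 14, 15, 15, 15, 17, 18]
n = len(t)

st, sy = sum(t), sum(y)
stt = sum(ti * ti for ti in t)
sty = sum(ti * yi for ti, yi in zip(t, y))

b_slope = (n * sty - st * sy) / (n * stt - st ** 2)  # slope, formula (6)
a_intercept = (sy - b_slope * st) / n                # intercept, from (5)

print(a_intercept, b_slope)        # about 9.6 and 0.8, matching the stated answer
print(a_intercept + b_slope * 11)  # November forecast: about 18.4
print(a_intercept + b_slope * 12)  # December forecast: about 19.2
```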