
October 5, 2009

These lecture notes are unlikely to contain: material spoken in class but not written on the board, examples presented in class, remarks that motivate the mathematical material, and explanations of standard mathematical notation. The ordering of material might differ slightly from that presented in class. On the other hand, they contain material that, being either elementary or tedious, was not presented in class. Note that such material might be used later in class and you are expected to know it. These lecture notes might be modified as the course progresses. Please bring any mistakes to my attention.

Contents

1 Introduction to Rⁿ
  1.1 Rⁿ as an inner product space
  1.2 Functions Rⁿ → Rᵐ

2 Differential calculus in Rⁿ
  2.1 Differential operators
  2.2 Parametrised curves and surfaces

3 Integral calculus in Rⁿ
  3.1 On one-dimensional domains
  3.2 On two-dimensional domains
  3.3 On three-dimensional domains


1 Introduction to Rⁿ

1.1 Rⁿ as an inner product space

1.1.1 Rⁿ as a vector space

Definition 1.1.1 (Rⁿ). Rⁿ := {(x₁, x₂, …, xₙ) | x₁, x₂, …, xₙ ∈ R}.

Definition 1.1.2 (Projections and components). For i = 1, 2, …, n, the (n) projections are πᵢ : Rⁿ ∋ (x₁, x₂, …, xₙ) → xᵢ ∈ R. For x ∈ Rⁿ we also denote πᵢ(x) by xᵢ. This is the i-th component of x.

Remark 1.1.3 (Embeddings). For m > n the function defined by Rⁿ ∋ x → (x₁, x₂, …, xₙ, 0, …, 0) ∈ Rᵐ (where the last m − n components are zero) is an embedding of Rⁿ into Rᵐ.

Definition 1.1.4 (Addition on Rⁿ). Addition is defined component-wise: ∀x, y ∈ Rⁿ, x + y ∈ Rⁿ is defined by ∀i = 1, 2, …, n, (x + y)ᵢ = xᵢ + yᵢ.

Definition 1.1.5 (Scalar multiplication on Rⁿ). Scalar multiplication is defined component-wise: for α ∈ R and x ∈ Rⁿ, αx ∈ Rⁿ is defined by ∀i = 1, 2, …, n, (αx)ᵢ = αxᵢ.

Remark 1.1.6 (Properties of addition and scalar multiplication on Rⁿ).
1. Addition on Rⁿ is commutative, associative and possesses an additive identity, namely 0 ∈ Rⁿ. Moreover, every x ∈ Rⁿ possesses an additive inverse denoted −x.
2. Scalar multiplication on Rⁿ is associative (i.e. ∀α, β ∈ R, ∀x ∈ Rⁿ, (αβ)x = α(βx)) and possesses an identity, namely 1 (thus ∀x ∈ Rⁿ, 1x = x).
3. Addition and scalar multiplication distribute over each other: ∀α, β ∈ R, ∀x, y ∈ Rⁿ, α(x + y) = αx + αy and (α + β)x = αx + βx.
These properties follow from the corresponding properties of addition and multiplication on R. Observe that the additive inverse of x ∈ Rⁿ is (−1)x.

Definition 1.1.7 (Basis vectors). For i = 1, 2, …, n, eᵢ ∈ Rⁿ is defined by πᵢ(eᵢ) = 1 and πⱼ(eᵢ) = 0 ∀i ≠ j = 1, 2, …, n. In R² and R³, e₁ and e₂ are also denoted by î and ĵ respectively. In R³, e₃ is also denoted by k̂.

Remark 1.1.8. Observe that for x ∈ Rⁿ,
x = (x₁, x₂, …, xₙ) = (x₁, 0, …, 0) + (0, x₂, 0, …, 0) + ··· + (0, …, 0, xₙ)
  = x₁(1, 0, …, 0) + x₂(0, 1, 0, …, 0) + ··· + xₙ(0, …, 0, 1)
  = x₁e₁ + x₂e₂ + ··· + xₙeₙ.

1.1.2 Rⁿ as an inner product space

Definition 1.1.9 (Euclidean inner product). For x, y ∈ Rⁿ, ⟨x, y⟩ := Σᵢ₌₁ⁿ xᵢyᵢ ∈ R.

Remark 1.1.10 (Properties of the inner product). The inner product is bi-linear, symmetric (i.e. ∀x, y ∈ Rⁿ, ⟨x, y⟩ = ⟨y, x⟩), positive (i.e. ∀x ∈ Rⁿ, ⟨x, x⟩ ≥ 0) and definite (i.e. ⟨x, x⟩ = 0 ⟹ x = 0).

Remark 1.1.11 (Inner product and projections). Observe that for i = 1, 2, …, n, πᵢ(·) = ⟨eᵢ, ·⟩. Thus for x ∈ Rⁿ, xᵢ = ⟨eᵢ, x⟩.

Definition 1.1.12 (Orthogonality). x, y ∈ Rⁿ are orthogonal, x ⊥ y, if ⟨x, y⟩ = 0. The orthogonal complement of x ∈ Rⁿ is {x}⊥ := {y ∈ Rⁿ | ⟨x, y⟩ = 0}.

Definition 1.1.13 (Orthogonal vector). For x ∈ R², x⊥ := (−x₂, x₁).
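The definitions above can be sketched numerically. The following Python snippet (an illustration, not part of the notes; the sample vectors are arbitrary) computes the Euclidean inner product and checks that x⊥ is indeed orthogonal to x.

```python
# Illustration of Definitions 1.1.9, 1.1.12 and 1.1.13 (sample vectors are
# arbitrary choices, not from the notes).

def inner(x, y):
    """Euclidean inner product <x, y> = sum_i x_i * y_i."""
    return sum(xi * yi for xi, yi in zip(x, y))

def perp(x):
    """For x = (x1, x2) in R^2, the orthogonal vector x^perp := (-x2, x1)."""
    return (-x[1], x[0])

x = (3.0, 4.0)
y = (1.0, -2.0)
# The inner product is symmetric: <x, y> = <y, x>.
assert inner(x, y) == inner(y, x)
# x^perp is orthogonal to x: <x, x^perp> = 3*(-4) + 4*3 = 0.
assert inner(x, perp(x)) == 0.0
```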

1.1.3 Rⁿ as a normed space

Definition 1.1.14 (Euclidean norm). ‖·‖ : Rⁿ → R₊ is defined by ∀x ∈ Rⁿ, ‖x‖ := √⟨x, x⟩. Thus for x = (x₁, x₂, …, xₙ),
‖x‖ = √(x₁² + x₂² + ··· + xₙ²) = √(Σᵢ₌₁ⁿ xᵢ²).

Remark 1.1.15. For i = 1, 2, …, n, ‖eᵢ‖ = 1. x ∈ Rⁿ is a unit vector or normalised if ‖x‖ = 1.

Remark 1.1.16 (Properties of the norm). The norm is positive (i.e. ∀x ∈ Rⁿ, ‖x‖ ≥ 0), definite (i.e. ‖x‖ = 0 ⟹ x = 0), positively homogeneous (i.e. ∀α ∈ R, ∀x ∈ Rⁿ, ‖αx‖ = |α| ‖x‖) and satisfies, ∀x, y ∈ Rⁿ,
‖x + y‖ ≤ ‖x‖ + ‖y‖   (Triangle inequality)
|⟨x, y⟩| ≤ ‖x‖ ‖y‖   (Cauchy-Schwarz inequality)

Definition 1.1.17 (Bounded sets). S ⊂ Rⁿ is bounded if there exists B ≥ 0 such that ∀x ∈ S, ‖x‖ ≤ B.

1.1.4 The cross product in R² and R³

Definition 1.1.18 (Cross product in R² and R³).
1. For x, y ∈ R², x × y := x₁y₂ − x₂y₁ ∈ R is the cross product or vector product of x and y.
2. For x, y ∈ R³, x × y := (x₂y₃ − x₃y₂, x₃y₁ − x₁y₃, x₁y₂ − x₂y₁) ∈ R³ is the cross product or vector product of x and y.

Remark 1.1.19. The cross product in R² results in a scalar (i.e. a real number) while the cross product in R³ results in another vector in R³. These definitions are related: (x₁, x₂, 0) × (y₁, y₂, 0) = (0, 0, x₁y₂ − x₂y₁).

1.2 Functions Rⁿ → Rᵐ

1.2.1 Operations on functions Rⁿ → Rᵐ

Definition 1.2.2 (Bounded functions). f : Rⁿ → Rᵐ is bounded if there exists C ∈ R such that ∀x ∈ Rⁿ, ‖f(x)‖ ≤ C.

Definition 1.2.3 (Addition of functions Rⁿ → Rᵐ). Given f : Rⁿ → Rᵐ and g : Rⁿ → Rᵐ we define f + g : Rⁿ → Rᵐ by ∀x ∈ Rⁿ, (f + g)(x) = f(x) + g(x).

Definition 1.2.4 (Scalar multiplication of functions Rⁿ → Rᵐ). Given f : Rⁿ → Rᵐ and α ∈ R we define αf : Rⁿ → Rᵐ by ∀x ∈ Rⁿ, (αf)(x) = α f(x).

Definition 1.2.5 (Composition of functions). Let f : Rⁿ → Rᵐ and g : Rᵐ → Rˡ. Then g ∘ f : Rⁿ → Rˡ is defined by ∀x ∈ Rⁿ, (g ∘ f)(x) := g(f(x)).

Definition 1.2.6 (Components of a function). Let f : Rⁿ → Rᵐ. We define the components of f to be the functions fⱼ := πⱼ ∘ f : Rⁿ → R, for j = 1, 2, …, m. Thus ∀x ∈ Rⁿ, f(x) = ((f(x))₁, (f(x))₂, …, (f(x))ₘ) = (f₁(x), f₂(x), …, fₘ(x)). For f : Rⁿ → R² we also define f⊥ : Rⁿ → R² by f⊥(x) := (f(x))⊥.

Definition 1.2.7 (Sections of a function). Let f : Rⁿ → Rᵐ and x, v ∈ Rⁿ with ‖v‖ = 1. The section of f at x along v is the function R ∋ t → f(x + tv) ∈ Rᵐ.
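Remark 1.1.19 can be checked concretely. The following snippet (an illustration with arbitrarily chosen vectors, not from the notes) verifies that the R² cross product reappears as the third component of the R³ cross product of the embedded vectors.

```python
# Check of Definition 1.1.18 and Remark 1.1.19 on a sample pair of vectors.

def cross2(x, y):
    """Cross product in R^2: a scalar."""
    return x[0] * y[1] - x[1] * y[0]

def cross3(x, y):
    """Cross product in R^3: another vector in R^3."""
    return (x[1] * y[2] - x[2] * y[1],
            x[2] * y[0] - x[0] * y[2],
            x[0] * y[1] - x[1] * y[0])

x2, y2 = (1.0, 2.0), (3.0, 4.0)
# Embed R^2 into R^3 by appending a zero component (Remark 1.1.3).
x3, y3 = (*x2, 0.0), (*y2, 0.0)
assert cross3(x3, y3) == (0.0, 0.0, cross2(x2, y2))  # = (0, 0, -2)
```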

1.2.2 Linear functions

Definition 1.2.8 (Linear functions). f : Rⁿ → Rᵐ is linear if ∀α ∈ R, ∀x, y ∈ Rⁿ,
f(x + y) = f(x) + f(y)   (additivity)
f(αx) = αf(x)   (1-homogeneity).

Example 1.2.9. Projections are linear.

Remark 1.2.10. If f : Rⁿ → Rᵐ is linear then f(0) = 0, but the converse is false.

We have a complete characterisation of linear functions:

Theorem 1.2.11 (Representation of linear functions). f : Rⁿ → Rᵐ is linear iff ∃! M_f ∈ Rᵐˣⁿ such that ∀x ∈ Rⁿ, f(x) = M_f x. We say that M_f is the matrix associated with f.

Linearity is stable under addition, scalar multiplication, composition and projection:

Lemma 1.2.12 (Linearity is preserved by addition). Let f, g : Rⁿ → Rᵐ be linear. Then f + g : Rⁿ → Rᵐ is also linear. Moreover M_{f+g} = M_f + M_g.

Lemma 1.2.13 (Linearity is preserved by scalar multiplication). Let f : Rⁿ → Rᵐ be linear and α ∈ R. Then αf : Rⁿ → Rᵐ is also linear. Moreover M_{αf} = αM_f.

Lemma 1.2.14 (Linearity is preserved by composition). Let f : Rⁿ → Rᵐ and g : Rᵐ → Rˡ be linear. Then g ∘ f : Rⁿ → Rˡ is also linear. Moreover M_{g∘f} = M_g M_f.

Corollary 1.2.15 (Linearity implies linearity of component functions). Let f : Rⁿ → Rᵐ be linear. Then fⱼ : Rⁿ → R, j = 1, 2, …, m are also linear. Moreover M_{fⱼ} is the j-th row of M_f.

Converse 1.2.16 (Linearity of component functions implies linearity). Let fⱼ : Rⁿ → R, j = 1, 2, …, m be linear. Then f : Rⁿ → Rᵐ is also linear. Moreover M_{fⱼ} is the j-th row of M_f.

1.2.3 Affine functions

Definition 1.2.17 (Affine functions). f : Rⁿ → Rᵐ is affine if ∃ f̃ : Rⁿ → Rᵐ linear and c ∈ Rᵐ such that ∀x ∈ Rⁿ, f(x) = f̃(x) + c.

Example 1.2.18. Constant functions and linear functions are affine.

We have a complete characterisation of affine functions:

Theorem 1.2.19 (Representation of affine functions). f : Rⁿ → Rᵐ is affine iff ∃! M_f ∈ Rᵐˣⁿ, c_f ∈ Rᵐ such that ∀x ∈ Rⁿ, f(x) = M_f x + c_f. We say that M_f is the matrix associated with f.

Affineness is stable under addition, scalar multiplication, composition and projection:

Lemma 1.2.20 (Affineness is preserved by addition). Let f, g : Rⁿ → Rᵐ be affine. Then f + g : Rⁿ → Rᵐ is also affine. Moreover M_{f+g} = M_f + M_g and c_{f+g} = c_f + c_g.

Lemma 1.2.21 (Affineness is preserved by scalar multiplication). Let f : Rⁿ → Rᵐ be affine and α ∈ R. Then αf : Rⁿ → Rᵐ is also affine. Moreover M_{αf} = αM_f and c_{αf} = αc_f.

Lemma 1.2.22 (Affineness is preserved by composition). Let f : Rⁿ → Rᵐ and g : Rᵐ → Rˡ be affine. Then g ∘ f : Rⁿ → Rˡ is also affine. Moreover M_{g∘f} = M_g M_f and c_{g∘f} = M_g c_f + c_g.

Corollary 1.2.23 (Affineness implies affineness of component functions). Let f : Rⁿ → Rᵐ be affine. Then fⱼ : Rⁿ → R, j = 1, 2, …, m are also affine. Moreover M_{fⱼ} is the j-th row of M_f and c_{fⱼ} = (c_f)ⱼ.

Converse 1.2.24 (Affineness of component functions implies affineness). Let fⱼ : Rⁿ → R, j = 1, 2, …, m be affine. Then f : Rⁿ → Rᵐ is also affine. Moreover M_{fⱼ} is the j-th row of M_f and c_{fⱼ} = (c_f)ⱼ.
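The composition formula M_{g∘f} = M_g M_f can be tried on a small example. The snippet below (an illustration; the particular matrices are arbitrary choices, not from the notes) checks that applying g after f agrees with applying the product matrix.

```python
# Illustration of Theorem 1.2.11 and Lemma 1.2.14: a linear f is x -> M_f x,
# and the matrix of g o f is the matrix product M_g M_f.

def matvec(M, x):
    """Apply a matrix (given as a list of rows) to a vector."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def matmul(A, B):
    """Matrix product: matvec(matmul(A, B), x) == matvec(A, matvec(B, x))."""
    cols = list(zip(*B))
    return [[sum(a * b for a, b in zip(row, col)) for col in cols] for row in A]

Mf = [[1, 2], [0, 1], [3, 0]]   # a linear f : R^2 -> R^3 (arbitrary example)
Mg = [[1, 0, 1], [0, 1, 0]]     # a linear g : R^3 -> R^2 (arbitrary example)
x = [2, 5]
# (g o f)(x) computed two ways agrees:
assert matvec(Mg, matvec(Mf, x)) == matvec(matmul(Mg, Mf), x)
```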

1.2.4 Continuous functions

Definition 1.2.25 (Continuous functions). f : Rⁿ → Rᵐ is
1. continuous at x₀ ∈ Rⁿ if ∀ε > 0 ∃δ > 0 ∀x ∈ Rⁿ, ‖x − x₀‖ < δ ⟹ ‖f(x) − f(x₀)‖ < ε;
2. continuous if it is continuous at all x₀ ∈ Rⁿ;
3. discontinuous at x₀ ∈ Rⁿ if it is not continuous at x₀;
4. discontinuous if ∃x₀ ∈ Rⁿ at which it is discontinuous.

Example 1.2.26. Affine functions are continuous.

Continuity is stable under addition, scalar multiplication, composition and projection:

Lemma 1.2.27 (Continuity is preserved by addition). Let f, g : Rⁿ → Rᵐ be continuous. Then f + g : Rⁿ → Rᵐ is also continuous.

Lemma 1.2.28 (Continuity is preserved by scalar multiplication). Let f : Rⁿ → Rᵐ be continuous and α ∈ R. Then αf : Rⁿ → Rᵐ is also continuous.

Lemma 1.2.29 (Continuity is preserved by composition). Let f : Rⁿ → Rᵐ and g : Rᵐ → Rˡ be continuous. Then g ∘ f : Rⁿ → Rˡ is also continuous.

Corollary 1.2.30 (Continuity implies continuity of component functions). Let f : Rⁿ → Rᵐ be continuous. Then fⱼ : Rⁿ → R, j = 1, 2, …, m are also continuous.

Converse 1.2.31 (Continuity of component functions implies continuity). Let fⱼ : Rⁿ → R, j = 1, 2, …, m be continuous. Then f : Rⁿ → Rᵐ is also continuous.

Notation 1.2.32. We denote the set of all continuous functions from Rⁿ to Rᵐ by C(Rⁿ, Rᵐ).

2 Differential calculus in Rⁿ

2.1 Differential operators

2.1.1 The derivative

Definition 2.1.1 (Differentiable functions and derivatives).
1. f : Rⁿ → Rᵐ is differentiable at x₀ ∈ Rⁿ if ∃ Df(x₀) ∈ Rᵐˣⁿ such that ∀ε > 0 ∃δ > 0 ∀x ∈ Rⁿ, ‖x − x₀‖ < δ ⟹ ‖f(x) − (Df(x₀)(x − x₀) + f(x₀))‖ < ε‖x − x₀‖.
2. f : Rⁿ → Rᵐ is differentiable if it is differentiable at all x₀ ∈ Rⁿ.
3. In this case, Df(x₀) is the derivative of f at x₀ and Df : Rⁿ → Rᵐˣⁿ is the derivative of f.
4. If f : Rⁿ → Rᵐ is differentiable at x₀ ∈ Rⁿ, the affine approximation to f at x₀ is Rⁿ ∋ x → Df(x₀)(x − x₀) + f(x₀) ∈ Rᵐ.

Remark 2.1.2. Affine functions are differentiable; indeed, they are their own affine approximation.

Differentiability is stable under addition, scalar multiplication, composition and projection:

Lemma 2.1.3 (Differentiability is preserved by addition). Let f, g : Rⁿ → Rᵐ be differentiable at x ∈ Rⁿ. Then f + g : Rⁿ → Rᵐ is also differentiable at x ∈ Rⁿ. Moreover D(f + g)(x) = Df(x) + Dg(x).

Lemma 2.1.4 (Differentiability is preserved by scalar multiplication). Let f : Rⁿ → Rᵐ be differentiable at x ∈ Rⁿ and α ∈ R. Then αf : Rⁿ → Rᵐ is also differentiable at x ∈ Rⁿ. Moreover D(αf)(x) = αDf(x).
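The defining inequality says that the error of the affine approximation shrinks faster than ‖x − x₀‖. The following snippet (an illustration; the function f(x₁, x₂) = (x₁², x₁x₂) and the point x₀ = (1, 2) are arbitrary choices, not from the notes) observes this numerically.

```python
# The affine approximation x -> Df(x0)(x - x0) + f(x0) to
# f(x1, x2) = (x1^2, x1*x2) at x0 = (1, 2); the error is o(||x - x0||).
import math

def f(x):
    return (x[0] ** 2, x[0] * x[1])

def affine_approx(x):
    # Df(x0) = [[2*x01, 0], [x02, x01]] = [[2, 0], [2, 1]] at x0 = (1, 2).
    x0, fx0 = (1.0, 2.0), f((1.0, 2.0))
    h = (x[0] - x0[0], x[1] - x0[1])
    return (fx0[0] + 2.0 * h[0], fx0[1] + 2.0 * h[0] + 1.0 * h[1])

for t in (1e-1, 1e-2, 1e-3):
    x = (1.0 + t, 2.0 + t)
    err = math.dist(f(x), affine_approx(x))
    step = math.dist(x, (1.0, 2.0))
    # The ratio err/step equals t here, so it tends to 0 with the step size.
    assert err / step < 2 * t
```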

Theorem 2.1.5 (Chain rule). Let f : Rⁿ → Rᵐ and g : Rᵐ → Rˡ be differentiable at x ∈ Rⁿ and f(x) respectively. Then g ∘ f : Rⁿ → Rˡ is also differentiable at x and D(g ∘ f)(x) = Dg(f(x)) Df(x).

Corollary 2.1.6. Let f : Rⁿ → Rᵐ be differentiable at x ∈ Rⁿ. Then fⱼ : Rⁿ → R, j = 1, 2, …, m are also differentiable at x. Moreover Dfⱼ(x) is the j-th row of Df(x).

Converse 2.1.7. Let fⱼ : Rⁿ → R, j = 1, 2, …, m be differentiable at x ∈ Rⁿ. Then f : Rⁿ → Rᵐ is also differentiable at x. Moreover Dfⱼ(x) is the j-th row of Df(x).

Definition 2.1.8 (Directional and partial derivatives). Let f : Rⁿ → Rᵐ and x, v ∈ Rⁿ with ‖v‖ = 1. Whenever it exists, the directional derivative of f in the direction v at x is the derivative at 0 of the section of f at x along v:
Dᵥf(x) := d/dt f(x + tv)|_{t=0}.
For i = 1, 2, …, n, Dᵢf(x) := D_{eᵢ}f(x) is the i-th partial derivative of f.

Theorem 2.1.9. Let f : Rⁿ → Rᵐ be differentiable at x ∈ Rⁿ. Then Dᵥf(x) exists ∀v ∈ Rⁿ with ‖v‖ = 1. Moreover Dᵥf(x) = Df(x)v. In particular, for i = 1, 2, …, n, Dᵢf(x) exists and is the i-th column of Df(x).

Remark 2.1.10. Thus, if f : Rⁿ → Rᵐ is differentiable at x ∈ Rⁿ then (Df)ᵢ,ⱼ = Dⱼfᵢ(x), i = 1, 2, …, m, j = 1, 2, …, n. (By Theorem 2.1.9.)

Remark 2.1.11. Existence of all directional derivatives of a function at a point does not imply that the function is differentiable at that point. This is illustrated by the function
R² ∋ x → 1 if x₁x₂ = 0, 0 else,
both of whose partial derivatives exist at 0, even though the function is not differentiable there.

Theorem 2.1.12 (Differentiability implies continuity). Let f : Rⁿ → Rᵐ be differentiable at x ∈ Rⁿ. Then f is continuous at x.

Theorem 2.1.13 (Sufficient condition for differentiability). Let all partial derivatives of f : Rⁿ → Rᵐ exist and be continuous at x ∈ Rⁿ. Then f is differentiable at x.

Notation 2.1.14. C¹(Rⁿ, Rᵐ) is the set of all functions from Rⁿ to Rᵐ all of whose partial derivatives exist and are continuous.

2.1.2 The curl

Definition 2.1.15 (Curl).
1. For f : R² → R², whenever it exists, the curl of f, curl f : R² → R, is defined by curl f := D₁f₂ − D₂f₁.
2. For f : R³ → R³, whenever it exists, the curl of f, curl f : R³ → R³, is defined by curl f := (D₂f₃ − D₃f₂, D₃f₁ − D₁f₃, D₁f₂ − D₂f₁).

Remark 2.1.16. Compare Definition 2.1.15 with Definition 1.1.18. The curl is defined only for functions R² → R² and R³ → R³. The curl of an R²-valued function on R² is an R-valued function on R², but the curl of an R³-valued function on R³ is another R³-valued function on R³.

Example 2.1.17. Find the curl of R³ ∋ x → (x₁x₂, x₂x₃, x₃x₁) ∈ R³.
Solution. Let f : R³ → R³, f(x₁, x₂, x₃) = (x₁x₂, x₂x₃, x₃x₁). We see that f₁(x₁, x₂, x₃) = x₁x₂, f₂(x₁, x₂, x₃) = x₂x₃, f₃(x₁, x₂, x₃) = x₃x₁. From Corollary 2.1.6 and Theorem 2.1.13 these functions are differentiable. Then
D₂f₃(x₁, x₂, x₃) = 0, D₃f₂(x₁, x₂, x₃) = x₂,
D₃f₁(x₁, x₂, x₃) = 0, D₁f₃(x₁, x₂, x₃) = x₃,
D₁f₂(x₁, x₂, x₃) = 0, D₂f₁(x₁, x₂, x₃) = x₁.
Thus curl f(x) = (D₂f₃ − D₃f₂, D₃f₁ − D₁f₃, D₁f₂ − D₂f₁) = (0 − x₂, 0 − x₃, 0 − x₁) = −(x₂, x₃, x₁).
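The computation in Example 2.1.17 can be checked with central finite differences. The snippet below (an illustration, not from the notes; the evaluation point is arbitrary) approximates the partial derivatives numerically and recovers −(x₂, x₃, x₁).

```python
# Finite-difference check of Example 2.1.17:
# curl of f(x) = (x1*x2, x2*x3, x3*x1) is -(x2, x3, x1).

def f(x):
    return (x[0] * x[1], x[1] * x[2], x[2] * x[0])

def partial(g, i, x, h=1e-6):
    """Central-difference approximation to D_i g at x (indices 0-based)."""
    xp, xm = list(x), list(x)
    xp[i] += h
    xm[i] -= h
    return [(a - b) / (2 * h) for a, b in zip(g(xp), g(xm))]

def curl(g, x):
    D = [partial(g, i, x) for i in range(3)]  # D[i][j] ~ D_{i+1} g_{j+1}
    return (D[1][2] - D[2][1], D[2][0] - D[0][2], D[0][1] - D[1][0])

x = (1.0, 2.0, 3.0)
expected = (-2.0, -3.0, -1.0)   # -(x2, x3, x1) at this point
assert all(abs(a - b) < 1e-6 for a, b in zip(curl(f, x), expected))
```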

2.1.3 The divergence

Definition 2.1.18 (Divergence). For f : Rⁿ → Rⁿ, wherever it exists, the divergence of f, div f : Rⁿ → R, is defined by div f := D₁f₁ + D₂f₂ + ··· + Dₙfₙ.

Remark 2.1.19. Note that the divergence is defined only for functions Rⁿ → Rⁿ and the divergence of a function is an R-valued function on Rⁿ. If f is differentiable then div f = trace(Df). (The trace of a square matrix is the sum of its diagonal elements.)

Example 2.1.20. Find the divergence of the identity function on R³, i.e. the function R³ ∋ x → x ∈ R³.
Solution. Writing the function as R³ ∋ (x₁, x₂, x₃) → (x₁, x₂, x₃) ∈ R³, we see that id₁(x₁, x₂, x₃) = x₁, id₂(x₁, x₂, x₃) = x₂, id₃(x₁, x₂, x₃) = x₃. Then D₁id₁(x₁, x₂, x₃) = 1, D₂id₂(x₁, x₂, x₃) = 1, D₃id₃(x₁, x₂, x₃) = 1. Thus div id(x) = D₁id₁(x) + D₂id₂(x) + D₃id₃(x) = 1 + 1 + 1 = 3.

2.1.4 The derivative, curl and divergence as differential operators

Definition 2.1.21 (Differential operators).
1. D : C¹(Rⁿ, Rᵐ) → C(Rⁿ, Rᵐˣⁿ) is the function that maps a function in C¹(Rⁿ, Rᵐ) to its derivative. In particular, for m = 1, D : C¹(Rⁿ, R) → C(Rⁿ, R¹ˣⁿ).
2. curl : C¹(R², R²) → C(R², R) is the function that maps a function in C¹(R², R²) to its curl.
3. curl : C¹(R³, R³) → C(R³, R³) is the function that maps a function in C¹(R³, R³) to its curl.
4. div : C¹(Rⁿ, Rⁿ) → C(Rⁿ, R) is the function that maps a function in C¹(Rⁿ, Rⁿ) to its divergence.

Remark 2.1.22. All these operators are linear. E.g., for f, g ∈ C¹(Rⁿ, Rᵐ) and α ∈ R, D(f + g) = Df + Dg and D(αf) = αDf.

Definition 2.1.23 (Second derivative). C²(Rⁿ, R) is the set of all functions from Rⁿ to R whose derivative exists and is in C¹(Rⁿ, R¹ˣⁿ). D² : C²(Rⁿ, R) → C(Rⁿ, Rⁿˣⁿ) is defined by D²f := D(Df).

Remark 2.1.24. For f ∈ C²(Rⁿ, R), D²f is symmetric, i.e. (D²f)ᵢ,ⱼ = (D²f)ⱼ,ᵢ, i, j = 1, 2, …, n. This is because DᵢDⱼf = DⱼDᵢf.

Remark 2.1.25.
1. curl ∘ D = 0 on C²(R², R) and C²(R³, R).
2. div ∘ curl = 0 on C²(R³, R³).

Definition 2.1.26 (Laplacian and harmonic functions). The Laplacian ∆ : C²(Rⁿ, R) → C(Rⁿ, R) is defined by ∆ := div ∘ D ≡ Σᵢ₌₁ⁿ Dᵢ². f ∈ C²(Rⁿ, R) is harmonic if ∆f = 0.

Remark 2.1.27. The differential operators in Definitions 2.1.21 and 2.1.26 are linear.
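The identity div ∘ curl = 0 of Remark 2.1.25 can also be observed numerically. The snippet below (an illustration; the test field and evaluation point are arbitrary choices, not from the notes) composes finite-difference approximations of curl and div and checks that the result is close to zero.

```python
# Finite-difference illustration of Remark 2.1.25: div(curl f) = 0
# for a smooth f : R^3 -> R^3.

def f(x):
    # A smooth test field (an arbitrary choice for the demo).
    return (x[0] * x[1] ** 2, x[1] * x[2], x[0] + x[2] ** 3)

def curl(g):
    """Return a function approximating curl g by central differences."""
    def cg(x, h=1e-5):
        def D(i, j):  # D_i g_j at x
            xp, xm = list(x), list(x)
            xp[i] += h
            xm[i] -= h
            return (g(xp)[j] - g(xm)[j]) / (2 * h)
        return (D(1, 2) - D(2, 1), D(2, 0) - D(0, 2), D(0, 1) - D(1, 0))
    return cg

def div(g, x, h=1e-4):
    """Central-difference approximation to div g at x."""
    total = 0.0
    for i in range(3):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        total += (g(xp)[i] - g(xm)[i]) / (2 * h)
    return total

x = (0.7, -1.3, 0.4)
assert abs(div(curl(f), x)) < 1e-4   # div of a curl vanishes
```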

2.2 Parametrised curves and surfaces

2.2.1 Differentiable functions R → Rⁿ: Parametrised curves in Rⁿ

Definition 2.2.1 (Paths, curves and parametrisations).
1. A path in Rⁿ is a C¹ function [a, b] → Rⁿ.
2. A curve in Rⁿ is the image of a path in Rⁿ.
3. A path parametrises its curve.

Definition 2.2.2 (Simple, self-intersecting and closed curves). A curve in Rⁿ is
1. Simple if it can be parametrised by a path p ∈ C¹([a, b], Rⁿ) such that p is injective on [a, b]; else it is self-intersecting.
2. Closed if it can be parametrised by p ∈ C¹([a, b], Rⁿ) such that p(a) = p(b); else p(a) and p(b) are the end points of the curve.

Remark 2.2.3 (Interiors of simple closed curves). Let C ⊂ R² be a simple closed curve in R². Jordan's theorem provides a definition of int(C), the interior of C, which coincides with our intuitive understanding and proves that all simple closed curves in R² have interiors.

Remark 2.2.4. Simple closed curves in R² can be oriented either clockwise or anti-clockwise.

2.2.2 Differentiable functions R² → R³: Parametrised surfaces in R³

Remark 2.2.5 (Surfaces that are graphs of functions). Let g ∈ C¹(Ω ⊂ R², R). Then the graph of g is the surface in R³ parametrised by Ω ∋ x → (x, g(x)) ∈ R³. The normal to this surface at (x, g(x)) is ±(D₁g, D₂g, −1). The normal that points upwards is (−D₁g, −D₂g, 1) and the normal that points downwards is (D₁g, D₂g, −1).

3 Integral calculus in Rⁿ

Remark 3.0.1. Below we define integral operators on (one-dimensional) curves in Rⁿ, (two-dimensional) subsets of R² and surfaces in R³, and (three-dimensional) subsets of R³.

Remark 3.0.2 (Linearity of integration). All these operators are linear.

3.1 Integration on one-dimensional domains

3.1.1 Integration on curves in Rⁿ

Definition 3.1.1. Let C ⊂ Rⁿ be a curve parametrised by p ∈ C¹([a, b], Rⁿ) and f ∈ C(C, Rⁿ). We define
∫_C f · ds := ∫_a^b ⟨f(p(t)), Dp(t)⟩ dt.
This integral is independent of the parametrisation of C up to sign.

Theorem 3.1.2 (Fundamental theorem of line integrals). Let C ⊂ Rⁿ be a curve parametrised by p ∈ C¹([a, b], Rⁿ) and f ∈ C¹(Rⁿ, R). Then
∫_C Df · ds = f(p(b)) − f(p(a)).   (1)

Remark 3.1.3. Note that the right hand side of (1) depends only on f evaluated at the end points of C. Thus ∫_C Df · ds depends on C only through its end points.

Corollary 3.1.4 (to Theorem 3.1.2). Let C ⊂ Rⁿ be a simple closed curve and f ∈ C¹(Rⁿ, R). Then ∫_C Df · ds = 0.
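The fundamental theorem of line integrals can be verified numerically on a concrete curve. The snippet below (an illustration; the function f(x₁, x₂) = x₁² + x₂² and the path p(t) = (cos t, t) are arbitrary choices, not from the notes) approximates ∫_C Df · ds by a Riemann sum and compares it with f(p(b)) − f(p(a)).

```python
# Numerical check of the fundamental theorem of line integrals for
# f(x1, x2) = x1^2 + x2^2 along the path p(t) = (cos t, t), t in [0, 1].
import math

def f(x):
    return x[0] ** 2 + x[1] ** 2

def Df(x):
    return (2 * x[0], 2 * x[1])   # the derivative (gradient row) of f

def p(t):
    return (math.cos(t), t)

def Dp(t):
    return (-math.sin(t), 1.0)

# Riemann-sum approximation of the line integral of Df along the curve.
N = 200_000
integral = sum(
    sum(a * b for a, b in zip(Df(p(t)), Dp(t))) * (1.0 / N)
    for t in (k / N for k in range(N))
)
assert abs(integral - (f(p(1.0)) - f(p(0.0)))) < 1e-4
```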

3. from your feet to your head). 3.2. ∂S .2. R). Remark 3. Let D ⊂ R2 be elementary and f ∈ C 1 (R2 .. Then ZZZ ZZ V div(f ) dV = ∂V f · dS where the normal to ∂V is taken to point outwards.2.2. R3 ). We deﬁne ZZ ZZ S f · dS := D f ◦ p. Let D ⊂ R2 be elementary and S ⊂ R3 be the graph of a function g ∈ C 1 (D.2.2 3. R3 ). R3 ). Deﬁnition 3. 9 .3 Integration on three-dimensional domains Theorem 3. R3 ).2.1 (The divergence theorem or Gauss’ theorem).2.5 (Stokes’ theorem). the boundary of S .e.2.5).6.1 (Green’s theorem). Let V ⊂ R3 be elementary and f ∈ C 1 (R3 . Let D ⊂ R2 be elementary and S ⊂ R3 be parametrised by p ∈ C 1 (D.1 Integration on two-dimensional domains Integration on R2 ZZ Z Theorem 3.3.3. Let D ⊂ R2 be elementary and S ⊂ R3 be parametrised by p ∈ C 1 (D. Let f ∈ C(R3 . Then the (direction of the) normal to S should be picked so as to point upwards relative to you (i. R3 ).2 Integration on surfaces in R3 Deﬁnition 3.4 (Boundary of a surface). Then ZZ S curl(f ) · dS = Z ∂S f · ds (2) where S and ∂S are consistently oriented: Imagine walking along ∂S so that S is to your left. Remark 3. is deﬁned by ∂S := {(x. Then D curl(f ) dA = ∂D f · ds where ∂D is oriented anti-clockwise. Note that the right hand side of (2) depends only on f integrated over ∂S . g(x)) | x ∈ ∂D}. 3. Theorem 3. Let f ∈ C(S. Green’s theorem is the two-dimensional version of Stokes’ theorem (Theorem 3. Thus RR S curl(f ) · dS depends on S only through its boundary.2. R2 ).2. n ◦ p dA (this integral is independent of the parametrisation of S upto sign).

