Ramin Shamshiri MAP5345 Project2
Appendix A: Vector calculus reference

Gradient: for a scalar field f, the gradient is the vector of partial derivatives:

\nabla f = ( \partial f / \partial x_1, \dots, \partial f / \partial x_n )    (A.1)

Product rule of gradient: the gradient of the product of two scalar fields f and g follows the same form as the product rule in single-variable calculus:

\nabla (fg) = f \nabla g + g \nabla f    (A.2)

Divergence: for a vector field F = (F_1, \dots, F_n),

\operatorname{div} F = \nabla \cdot F = \partial F_1 / \partial x_1 + \dots + \partial F_n / \partial x_n    (A.3)

Product of a scalar and a vector:

\nabla \cdot (f F) = f ( \nabla \cdot F ) + \nabla f \cdot F    (A.4)

Laplace operator, defined as the divergence of the gradient (dot product of \nabla with itself):

\nabla^2 f = \nabla \cdot \nabla f    (A.5)

Fundamental Theorem of Calculus:

\int_a^b f'(x) \, dx = f(b) - f(a)    (A.6)

Divergence Theorem: let F : \Omega \to \mathbb{R}^n be a vector field over a bounded domain \Omega \subset \mathbb{R}^n with piecewise smooth boundary. Then

\int_\Omega \nabla \cdot F \, dx = \int_{\partial \Omega} F \cdot n \, dS    (A.7)

where \partial \Omega is the boundary of \Omega and n is the unit vector that is outward normal to the surface at each point.

Integration by parts: if \Omega is an open bounded subset of \mathbb{R}^n with piecewise smooth boundary \partial \Omega, and if u and v are continuously differentiable on the closure of \Omega, then the formula for integration by parts in the x_i direction is:

\int_\Omega u \, \partial v / \partial x_i \, dx = \int_{\partial \Omega} u v \, n_i \, dS - \int_\Omega v \, \partial u / \partial x_i \, dx    (A.8)

Summing such identities componentwise gives the vector form: for a scalar field u and vector field F,

\int_\Omega u \, ( \nabla \cdot F ) \, dx = \int_{\partial \Omega} u \, F \cdot n \, dS - \int_\Omega \nabla u \cdot F \, dx    (A.9)
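The product rules above are easy to sanity-check numerically. The sketch below verifies identity (A.4), \nabla \cdot (f F) = f ( \nabla \cdot F ) + \nabla f \cdot F, at a sample point using central finite differences; the particular fields f and F and the sample point are arbitrary choices for illustration, not from the text.

```python
# Numerical sanity check of identity (A.4): div(fF) = f div F + grad f . F,
# using central finite differences at one sample point.
import math

h = 1e-5  # finite-difference step

def f(p):
    x, y, z = p
    return x * x * math.sin(y) + z

def F(p):
    x, y, z = p
    return (y * z, x * z, x * y)

def partial(fun, p, i):
    """Central-difference partial derivative of a scalar function at p."""
    lo, hi = list(p), list(p)
    lo[i] -= h
    hi[i] += h
    return (fun(hi) - fun(lo)) / (2 * h)

def div(vec_fun, p):
    return sum(partial(lambda q, j=j: vec_fun(q)[j], p, j) for j in range(3))

def grad(fun, p):
    return [partial(fun, p, i) for i in range(3)]

p = (0.7, -0.3, 1.2)
lhs = div(lambda q: tuple(f(q) * c for c in F(q)), p)           # div(fF)
rhs = f(p) * div(F, p) + sum(a * b for a, b in zip(grad(f, p), F(p)))
assert abs(lhs - rhs) < 1e-6  # equal up to finite-difference error
```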
1. Integral Theorems of Vector Calculus. Suppose that \Omega is a bounded domain in \mathbb{R}^n with sufficiently regular boundary \partial \Omega and outward unit normal vector field n. Given a vector field V in \Omega, we have the so-called divergence theorem:

\int_\Omega \nabla \cdot V \, dx = \int_{\partial \Omega} V \cdot n \, dS    (a.1.1)

a. Use the divergence theorem to prove the integration-by-parts formula: for functions u and v continuously differentiable on the closure of \Omega,

\int_\Omega v \, \partial u / \partial x_i \, dx = \int_{\partial \Omega} u v \, n_i \, dS - \int_\Omega u \, \partial v / \partial x_i \, dx    (a.1.2)

Solution: let e_i denote the i-th standard unit vector and apply the divergence theorem (a.1.1) to the vector field V = u v \, e_i. Its divergence is

\nabla \cdot ( u v \, e_i ) = \partial (uv) / \partial x_i    (a.1.3)

so the divergence theorem gives

\int_\Omega \partial (uv) / \partial x_i \, dx = \int_{\partial \Omega} u v \, e_i \cdot n \, dS = \int_{\partial \Omega} u v \, n_i \, dS    (a.1.4)

By the single-variable product rule,

\partial (uv) / \partial x_i = u \, \partial v / \partial x_i + v \, \partial u / \partial x_i    (a.1.5)

Plugging (a.1.5) into (a.1.4) and re-arranging yields (a.1.6), which is equivalent to (a.1.2):

\int_\Omega v \, \partial u / \partial x_i \, dx = \int_{\partial \Omega} u v \, n_i \, dS - \int_\Omega u \, \partial v / \partial x_i \, dx    (a.1.6)
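In one dimension the integration-by-parts formula reduces to the familiar \int_a^b v u' \, dx = [uv]_a^b - \int_a^b u v' \, dx, which the sketch below checks with the trapezoid rule; the functions u and v are arbitrary smooth examples chosen for illustration.

```python
# Numerical check of integration by parts in 1-D with the trapezoid rule:
# integral of v*u' over [a,b]  ==  u(b)v(b) - u(a)v(a) - integral of u*v'.
import math

a, b, N = 0.0, 1.0, 20000
xs = [a + (b - a) * k / N for k in range(N + 1)]

u = lambda x: math.sin(3 * x)
du = lambda x: 3 * math.cos(3 * x)    # u'
v = lambda x: math.exp(-x)
dv = lambda x: -math.exp(-x)          # v'

def trapz(f):
    s = 0.5 * (f(xs[0]) + f(xs[-1])) + sum(f(x) for x in xs[1:-1])
    return s * (b - a) / N

lhs = trapz(lambda x: v(x) * du(x))
rhs = (u(b) * v(b) - u(a) * v(a)) - trapz(lambda x: u(x) * dv(x))
assert abs(lhs - rhs) < 1e-6  # agree up to quadrature error
```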
b. Prove Green's first identity (b.1.1) and Green's second identity (b.2.1):

\int_\Omega ( v \nabla^2 u + \nabla u \cdot \nabla v ) \, dx = \int_{\partial \Omega} v \, \partial u / \partial n \, dS    (b.1.1)

Solution (b.1.1, Green's first identity): According to (A.4) with f = v and F = \nabla u, we have:

\nabla \cdot ( v \nabla u ) = v \nabla^2 u + \nabla v \cdot \nabla u    (b.1.2)

Integrating both sides over \Omega and applying the divergence theorem (A.7) to the left-hand side yields:

\int_\Omega ( v \nabla^2 u + \nabla v \cdot \nabla u ) \, dx = \int_{\partial \Omega} v \, \nabla u \cdot n \, dS    (b.1.3)

Since \nabla u \cdot n is the directional derivative of u in the direction of the outward pointing normal to the surface element dS, we have \nabla u \cdot n = \partial u / \partial n. Thus we can re-write equation (b.1.3) as below, which is equivalent to (b.1.1). This is called Green's first identity:

\int_\Omega ( v \nabla^2 u + \nabla u \cdot \nabla v ) \, dx = \int_{\partial \Omega} v \, \partial u / \partial n \, dS    (b.1.4)

As a special case, setting v = 1 in Green's first identity makes \nabla v vanish, so

\int_\Omega \nabla^2 u \, dx = \int_{\partial \Omega} \partial u / \partial n \, dS    (b.2.3)
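Green's first identity can be checked numerically on the unit square, where the boundary integral splits over the four sides with outward normals (-1,0), (1,0), (0,-1), (0,1). The sketch below uses a midpoint rule; the smooth fields u and v are arbitrary illustrative choices with hand-computed derivatives.

```python
# Midpoint-rule check of Green's first identity (b.1.1) on the unit square:
# volume integral of (v*lap(u) + grad u . grad v)  ==  boundary integral
# of v * du/dn.
import math

u = lambda x, y: math.sin(x) * math.cos(y)
ux = lambda x, y: math.cos(x) * math.cos(y)        # du/dx
uy = lambda x, y: -math.sin(x) * math.sin(y)       # du/dy
lap = lambda x, y: -2 * math.sin(x) * math.cos(y)  # laplacian of u
v = lambda x, y: math.exp(x + 0.5 * y)
vx = lambda x, y: math.exp(x + 0.5 * y)
vy = lambda x, y: 0.5 * math.exp(x + 0.5 * y)

N = 400
h = 1.0 / N
mid = [(k + 0.5) * h for k in range(N)]

volume = sum(
    (v(x, y) * lap(x, y) + ux(x, y) * vx(x, y) + uy(x, y) * vy(x, y)) * h * h
    for x in mid for y in mid
)

# Four sides; du/dn flips sign on the x=0 and y=0 sides (inward x,y axes).
boundary = sum(
    (v(1, t) * ux(1, t) - v(0, t) * ux(0, t)
     + v(t, 1) * uy(t, 1) - v(t, 0) * uy(t, 0)) * h
    for t in mid
)

assert abs(volume - boundary) < 1e-3  # agree up to quadrature error
```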
Solution (b.2.1, Green's second identity): Let L stand for the linear differential operator L = \nabla^2, and let u and v be two continuously differentiable functions on the closure of \Omega. Green's second identity states:

\int_\Omega ( u \nabla^2 v - v \nabla^2 u ) \, dx = \int_{\partial \Omega} ( u \, \partial v / \partial n - v \, \partial u / \partial n ) \, dS    (b.2.2)

Applying the product rule (A.4) twice, once with f = u, F = \nabla v and once with f = v, F = \nabla u:

\nabla \cdot ( u \nabla v ) = u \nabla^2 v + \nabla u \cdot \nabla v    (b.2.5)

\nabla \cdot ( v \nabla u ) = v \nabla^2 u + \nabla v \cdot \nabla u    (b.2.6)

Subtracting (b.2.6) from (b.2.5), the \nabla u \cdot \nabla v terms cancel. Therefore:

\nabla \cdot ( u \nabla v - v \nabla u ) = u \nabla^2 v - v \nabla^2 u    (b.2.7)

Integrating over \Omega and applying the divergence theorem to the left-hand side:

\int_\Omega ( u \nabla^2 v - v \nabla^2 u ) \, dx = \int_{\partial \Omega} ( u \nabla v - v \nabla u ) \cdot n \, dS    (b.2.8)

Since \nabla v \cdot n = \partial v / \partial n and \nabla u \cdot n = \partial u / \partial n,    (b.2.9)

\int_\Omega ( u \nabla^2 v - v \nabla^2 u ) \, dx = \int_{\partial \Omega} ( u \, \partial v / \partial n - v \, \partial u / \partial n ) \, dS    (b.2.10)

which is Green's second identity (b.2.2).
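In one dimension Green's second identity reads \int_a^b ( u v'' - v u'' ) \, dx = [ u v' - v u' ]_a^b, which the sketch below verifies with the trapezoid rule; the functions u and v are arbitrary smooth examples.

```python
# Trapezoid-rule check of Green's second identity in 1-D:
# integral of (u*v'' - v*u'') over [a,b]  ==  [u*v' - v*u'] at b minus at a.
import math

a, b, N = 0.0, 2.0, 20000
h = (b - a) / N
xs = [a + k * h for k in range(N + 1)]

u, du, ddu = (lambda x: math.sin(x)), (lambda x: math.cos(x)), (lambda x: -math.sin(x))
v, dv, ddv = (lambda x: x**3), (lambda x: 3 * x**2), (lambda x: 6 * x)

def trapz(f):
    return (0.5 * (f(xs[0]) + f(xs[-1])) + sum(f(x) for x in xs[1:-1])) * h

lhs = trapz(lambda x: u(x) * ddv(x) - v(x) * ddu(x))
rhs = (u(b) * dv(b) - v(b) * du(b)) - (u(a) * dv(a) - v(a) * du(a))
assert abs(lhs - rhs) < 1e-5  # agree up to quadrature error
```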
2. Let \Omega be a rectangular domain in \mathbb{R}^2 with sides of length L and H. Find a complete system of eigenfunctions for the BVP

\nabla^2 u + \lambda u = 0 in \Omega,  u = 0 on \partial \Omega    (2.1)

Solution: Equation (2.1) with u = 0 along the entire boundary can be written as follows:

\partial^2 u / \partial x^2 + \partial^2 u / \partial y^2 + \lambda u = 0,  0 < x < L,  0 < y < H    (2.2)

u(0, y) = 0,  u(L, y) = 0,  u(x, 0) = 0,  u(x, H) = 0    (2.3)

This is a two-dimensional eigenvalue problem: a linear homogeneous partial differential equation in two independent variables with homogeneous boundary conditions. We can solve (2.2)-(2.3) using the method of separation of variables in Cartesian coordinates. Substituting u(x, y) = X(x) Y(y) into (2.2) gives:

X''(x) Y(y) + X(x) Y''(y) + \lambda X(x) Y(y) = 0    (2.4)

Dividing (2.4) by X(x) Y(y) separates x and y:

X'' / X = - \lambda - Y'' / Y = - \mu    (2.5)

The left side depends only on x and the right side only on y, so both must equal a separation constant, written - \mu. Now we have two ordinary differential equations resulting from the separation of variables of a partial differential equation with two independent variables:

X'' + \mu X = 0,  X(0) = 0,  X(L) = 0    (2.6)

Y'' + ( \lambda - \mu ) Y = 0,  Y(0) = 0,  Y(H) = 0    (2.7)

Equation (2.6) is a Sturm-Liouville eigenvalue problem in the x-variable, where \mu is the eigenvalue and X is the eigenfunction. The eigenvalues for (2.6) are:

\mu_n = ( n \pi / L )^2,  n = 1, 2, 3, \dots

and the corresponding eigenfunctions are X_n(x) = \sin( n \pi x / L ).

For each value of \mu_n, (2.7) is still an eigenvalue problem. There are an infinite number of eigenvalues \lambda for each n, so \lambda should be double-subscripted, \lambda_{nm}. The eigenvalues for (2.7) satisfy:

\lambda_{nm} - \mu_n = ( m \pi / H )^2,  m = 1, 2, 3, \dots for each n,

with eigenfunctions Y_m(y) = \sin( m \pi y / H ).

Therefore, the complete system of eigenfunctions for the BVP in (2.1) is:

u_{nm}(x, y) = \sin( n \pi x / L ) \sin( m \pi y / H ),  \lambda_{nm} = ( n \pi / L )^2 + ( m \pi / H )^2,  n = 1, 2, 3, \dots and m = 1, 2, 3, \dots    (2.8)
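The eigenvalue formula in (2.8) can be cross-checked numerically: the smallest eigenvalue of a 5-point finite-difference Dirichlet Laplacian on an L-by-H rectangle should approach ( \pi / L )^2 + ( \pi / H )^2 as the grid is refined. The sketch below assumes numpy is available; the rectangle dimensions and grid sizes are arbitrary choices.

```python
# Finite-difference cross-check of the smallest Dirichlet eigenvalue of
# -laplacian on an L x H rectangle against (pi/L)^2 + (pi/H)^2.
import math
import numpy as np

L, H = 2.0, 1.0
nx, ny = 40, 20                       # interior grid points per direction
hx, hy = L / (nx + 1), H / (ny + 1)

def lap1d(n, h):
    """Dense 1-D second-difference matrix for -d2/dx2 with Dirichlet BCs."""
    T = np.zeros((n, n))
    for i in range(n):
        T[i, i] = 2.0 / h**2
        if i > 0:
            T[i, i - 1] = -1.0 / h**2
        if i < n - 1:
            T[i, i + 1] = -1.0 / h**2
    return T

# 2-D operator as a Kronecker sum of the two 1-D operators.
A = np.kron(lap1d(nx, hx), np.eye(ny)) + np.kron(np.eye(nx), lap1d(ny, hy))
lam_min = np.linalg.eigvalsh(A)[0]    # eigvalsh returns ascending order
exact = (math.pi / L)**2 + (math.pi / H)**2
assert abs(lam_min - exact) / exact < 0.01  # O(h^2) discretization error
```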
3. Consider the following eigenvalue problem, known as the Helmholtz equation, for a function \phi on a domain \Omega with \phi = 0 on the boundary \partial \Omega:

\nabla^2 \phi + \lambda \phi = 0    (2.1)

a. Prove that all eigenvalues \lambda are real.

b. Prove that eigenfunctions belonging to different eigenvalues are orthogonal with weight 1 over the entire domain \Omega.

c. Derive an expression for the eigenvalues \lambda in terms of the eigenfunctions (the Rayleigh quotient).
Solution (a): We can use the orthogonality of eigenfunctions to prove that the eigenvalues are real. To show that the eigenfunctions are orthogonal, we assume that there are an infinite number of eigenvalues for (2.1) and that the resulting set of eigenfunctions is complete. Let L = \nabla^2, in which case the notation for the multidimensional eigenvalue problem in (2.1) becomes:

L \phi + \lambda \phi = 0    (2.4)

Comparing (2.4) with the standard Sturm-Liouville differential equation shows that the weight function \sigma for this multidimensional problem is expected to be 1. Let \lambda_1 and \lambda_2 be eigenvalues with corresponding eigenfunctions \phi_1 and \phi_2. Then equation (2.1) can be written as:

\nabla^2 \phi_1 + \lambda_1 \phi_1 = 0  or  \nabla^2 \phi_1 = - \lambda_1 \phi_1    (2.5)

\nabla^2 \phi_2 + \lambda_2 \phi_2 = 0  or  \nabla^2 \phi_2 = - \lambda_2 \phi_2    (2.6)

Multiplying (2.5) by \phi_2, multiplying (2.6) by \phi_1, subtracting, and integrating over \Omega, Green's second identity converts the volume integral of \phi_2 \nabla^2 \phi_1 - \phi_1 \nabla^2 \phi_2 into a boundary term:

( \lambda_2 - \lambda_1 ) \int_\Omega \phi_1 \phi_2 \, dx = \int_{\partial \Omega} ( \phi_2 \, \partial \phi_1 / \partial n - \phi_1 \, \partial \phi_2 / \partial n ) \, dS    (2.7)

For many different kinds of boundary conditions (i.e., regular Sturm-Liouville types, the periodic case, and the singular case), the boundary terms vanish if \phi_1 and \phi_2 both satisfy the same set of homogeneous boundary conditions. Since \phi_1 and \phi_2 are eigenfunctions, they satisfy this condition, and thus (2.7) implies that:

( \lambda_2 - \lambda_1 ) \int_\Omega \phi_1 \phi_2 \, dx = 0    (2.8)

If \lambda_1 \neq \lambda_2, then

\int_\Omega \phi_1 \phi_2 \, dx = 0    (2.9)

which means that eigenfunctions belonging to different eigenvalues are orthogonal (in the multidimensional sense, with weight \sigma = 1).

To prove that the eigenvalues are real, suppose \lambda is a complex eigenvalue with corresponding eigenfunction \phi satisfying (2.1). The complex conjugate of (2.1) is also valid:

\overline{ \nabla^2 \phi + \lambda \phi } = 0    (2.10)

The complex conjugate of \nabla^2 \phi is exactly \nabla^2 operating on the complex conjugate of \phi, since the coefficients of the linear differential operator are real. Thus:

\nabla^2 \bar{\phi} + \bar{\lambda} \bar{\phi} = 0    (2.11)

If \phi satisfies boundary conditions with real coefficients, then \bar{\phi} satisfies the same boundary conditions. Equation (2.11) and the boundary conditions show that \bar{\phi} satisfies the Helmholtz equation with eigenvalue \bar{\lambda}. Therefore, if \lambda is a complex eigenvalue with corresponding eigenfunction \phi, then \bar{\lambda} is also an eigenvalue with corresponding eigenfunction \bar{\phi}. However, we can show that \lambda cannot be complex: if \lambda \neq \bar{\lambda}, then according to the fundamental orthogonality result proved in (2.8)-(2.9), the corresponding eigenfunctions \phi and \bar{\phi} must be orthogonal. Thus:

\int_\Omega \phi \bar{\phi} \, dx = \int_\Omega | \phi |^2 \, dx = 0    (2.12)

Since | \phi |^2 \geq 0, the integral in (2.12) can equal zero only if \phi \equiv 0, which is not an eigenfunction. Thus (2.12) implies \lambda = \bar{\lambda}, and hence \lambda is real.
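The orthogonality result (2.9) can be illustrated with the explicit modes from problem 2 on the unit square: sin-sin modes with different eigenvalues integrate against each other to zero. A minimal midpoint-rule sketch (the mode indices are arbitrary examples):

```python
# Orthogonality of Helmholtz eigenfunctions u_nm = sin(n*pi*x)*sin(m*pi*y)
# on the unit square, checked with a midpoint rule.
import math

def mode(n, m):
    return lambda x, y: math.sin(n * math.pi * x) * math.sin(m * math.pi * y)

def inner(f, g, N=200):
    """Discrete approximation of the weight-1 inner product over the square."""
    h = 1.0 / N
    return sum(f((i + 0.5) * h, (j + 0.5) * h) * g((i + 0.5) * h, (j + 0.5) * h)
               for i in range(N) for j in range(N)) * h * h

u11, u21, u12 = mode(1, 1), mode(2, 1), mode(1, 2)
assert abs(inner(u11, u21)) < 1e-6          # different eigenvalues: orthogonal
assert abs(inner(u11, u12)) < 1e-6
assert abs(inner(u11, u11) - 0.25) < 1e-3   # same mode: nonzero norm
```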
Solution (b): If two or more eigenfunctions correspond to the same eigenvalue, they can be made orthogonal to each other by the Gram-Schmidt method. Suppose \phi_1, \phi_2, \dots, \phi_n are independent eigenfunctions corresponding to the same eigenvalue. We will form a set of n independent eigenfunctions, denoted \psi_1, \psi_2, \dots, \psi_n, which are mutually orthogonal, even if \phi_1, \phi_2, \dots, \phi_n are not.

Let \psi_1 = \phi_1 be any one eigenfunction. Any linear combination of the eigenfunctions is also an eigenfunction (since they satisfy the same linear homogeneous differential equation and boundary conditions). Thus \psi_2 = \phi_2 + c \psi_1 is also an eigenfunction (automatically independent of \psi_1), where c is an arbitrary constant. We choose c so that \psi_2 is orthogonal to \psi_1:

\int_\Omega \psi_2 \psi_1 \, dx = \int_\Omega ( \phi_2 + c \psi_1 ) \psi_1 \, dx = 0

so c is uniquely determined:

c = - \int_\Omega \phi_2 \psi_1 \, dx / \int_\Omega \psi_1^2 \, dx

Since there may be more than two eigenfunctions corresponding to the same eigenvalue, we continue this process. A third eigenfunction is \psi_3 = \phi_3 + c_1 \psi_1 + c_2 \psi_2, where we choose c_1 and c_2 so that \psi_3 is orthogonal to the previous two:

\int_\Omega \psi_3 \psi_1 \, dx = 0  and  \int_\Omega \psi_3 \psi_2 \, dx = 0

However, \psi_2 is already orthogonal to \psi_1, so the two conditions decouple:

c_1 = - \int_\Omega \phi_3 \psi_1 \, dx / \int_\Omega \psi_1^2 \, dx,  c_2 = - \int_\Omega \phi_3 \psi_2 \, dx / \int_\Omega \psi_2^2 \, dx

easily determining the two constants. This process can be used to determine n mutually orthogonal eigenfunctions. In general,

\psi_k = \phi_k - \sum_{j < k} ( \int_\Omega \phi_k \psi_j \, dx / \int_\Omega \psi_j^2 \, dx ) \psi_j
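The Gram-Schmidt process described above can be sketched on functions sampled on a grid, with a discrete sum standing in for \int_\Omega \phi \psi \, dx; the sample functions 1, x, x^2 are arbitrary non-orthogonal examples.

```python
# Gram-Schmidt orthogonalization of sampled functions, using a discrete
# inner product in place of the integral over the domain.
N = 1000
xs = [(i + 0.5) / N for i in range(N)]

def inner(f, g):
    return sum(a * b for a, b in zip(f, g)) / N

def gram_schmidt(funcs):
    ortho = []
    for phi in funcs:
        psi = list(phi)
        for prev in ortho:
            # c = -<phi, psi_j> / <psi_j, psi_j>, as derived above
            c = -inner(phi, prev) / inner(prev, prev)
            psi = [p + c * q for p, q in zip(psi, prev)]
        ortho.append(psi)
    return ortho

phis = [[1.0 for x in xs], [x for x in xs], [x * x for x in xs]]
psis = gram_schmidt(phis)

# All pairwise inner products of the output vanish (to machine precision).
assert abs(inner(psis[0], psis[1])) < 1e-9
assert abs(inner(psis[0], psis[2])) < 1e-9
assert abs(inner(psis[1], psis[2])) < 1e-9
```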
Solution (c): Multiplying (2.1) by \phi, integrating over the entire region \Omega, and solving for \lambda yields:

\int_\Omega \phi \nabla^2 \phi \, dx + \lambda \int_\Omega \phi^2 \, dx = 0,  so  \lambda = - \int_\Omega \phi \nabla^2 \phi \, dx / \int_\Omega \phi^2 \, dx    (2.13)

Integration by parts is based on the product rule for the derivative. Instead of using the derivative, we use a product rule for the divergence as in (A.4). Letting f = \phi and F = \nabla \phi in (A.4), it follows that:

\nabla \cdot ( \phi \nabla \phi ) = \phi \nabla^2 \phi + \nabla \phi \cdot \nabla \phi    (2.14)

Since \nabla \phi \cdot \nabla \phi = | \nabla \phi |^2, we can write the above as:

\phi \nabla^2 \phi = \nabla \cdot ( \phi \nabla \phi ) - | \nabla \phi |^2    (2.15)

Using (2.15), the quotient in (2.13) yields an alternative expression for the eigenvalue:

\lambda = ( - \int_\Omega \nabla \cdot ( \phi \nabla \phi ) \, dx + \int_\Omega | \nabla \phi |^2 \, dx ) / \int_\Omega \phi^2 \, dx    (2.16)

Using the divergence theorem to evaluate the first integral in the numerator of (2.16), it follows that:

\lambda = ( - \int_{\partial \Omega} \phi \, \nabla \phi \cdot n \, dS + \int_\Omega | \nabla \phi |^2 \, dx ) / \int_\Omega \phi^2 \, dx    (2.17)
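For a mode with \phi = 0 on the boundary, the surface term in (2.17) drops and the Rayleigh quotient reduces to \int | \nabla \phi |^2 / \int \phi^2. The sketch below evaluates it for the mode \phi = \sin( \pi x ) \sin( \pi y ) on the unit square, where the eigenvalue is known to be 2 \pi^2; the quadrature rule and grid size are arbitrary choices.

```python
# Rayleigh quotient (2.17) for phi = sin(pi x) sin(pi y) on the unit square
# (phi vanishes on the boundary, so the surface term is zero); the exact
# eigenvalue is 2*pi^2.
import math

pi = math.pi
N = 400
h = 1.0 / N
mid = [(k + 0.5) * h for k in range(N)]

# numerator: integral of |grad phi|^2 over the square (midpoint rule)
num = sum(
    ((pi * math.cos(pi * x) * math.sin(pi * y)) ** 2
     + (pi * math.sin(pi * x) * math.cos(pi * y)) ** 2) * h * h
    for x in mid for y in mid
)
# denominator: integral of phi^2
den = sum((math.sin(pi * x) * math.sin(pi * y)) ** 2 * h * h
          for x in mid for y in mid)

lam = num / den
assert abs(lam - 2 * pi**2) < 1e-3  # matches the exact eigenvalue
```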