
Solving Linear Equations using SVD

• Solving a linear equation
• Gauss elimination and SVD
• HowTo
• Some tricks for SVD


Jochen Schieck
MPI Munich
Problems in Linear Equations

• Generally, two error contributions are possible when solving Ax = b:
  1. intrinsic errors in A and b
  2. numerical errors due to rounding
• A mathematical description and handling of both is available.
General Error Analysis

• a matrix norm is necessary
• e.g. the spectral norm ||A||_2 = sqrt(lambda_max(A^T A))
• How does the final result x (and its error δx) depend on the
  input values A and b?

  ||δx||/||x|| ≤ cond(A) · ( ||δA||/||A|| + ||δb||/||b|| )

  with the condition number cond(A) = ||A|| ||A^-1||

• for the spectral norm, cond(A) = sqrt(lambda_max/lambda_min),
  with lambda the eigenvalues of the matrix A^T A
Example for Condition of a Matrix

• exact solution: (2, -2)^T,  cond(A) = 1.513×10^8
• change input values:
  0.8642 → 0.86419999
  0.1440 → 0.14400001
• approximate solution: (0.9911, -0.4870)^T
• NOT ACCEPTABLE!
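The slide's numbers can be reproduced with numpy. The matrix A below is an assumption: it is the classic ill-conditioned 2×2 example whose right-hand side and exact solution match the values quoted on the slide.

```python
import numpy as np

# Assumed matrix: the classic ill-conditioned 2x2 example; its
# right-hand side and exact solution (2, -2)^T match the slide.
A = np.array([[1.2969, 0.8648],
              [0.2161, 0.1441]])
b = np.array([0.8642, 0.1440])               # exact solution x = (2, -2)^T
b_pert = np.array([0.86419999, 0.14400001])  # tiny change in the input

print(np.linalg.cond(A))           # ~1.5e8: a badly conditioned matrix
print(np.linalg.solve(A, b))       # close to [ 2, -2]
print(np.linalg.solve(A, b_pert))  # close to [ 0.9911, -0.4870]
```

A relative input change of order 10^-8, amplified by cond(A) ≈ 10^8, changes the solution completely.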
Gauss Elimination

• error due to rounding: for n = m = 30 the worst-case bound
  reaches ~10^13, with eps = machine accuracy (double: ~10^-16)
• however, this is the worst-case scenario!
Singular Value Decomposition

• here for the (n×n) case; also valid for (n×m)
• numerical solution of linear equations is difficult for matrices
  with bad condition:
  → matrices that are regular can be singular in the numerical
    approximation
  → SVD helps in finding and dealing with the singular values
How does SVD work

Definition of the singular value decomposition:

  A = U Σ V^T,  with U and V orthogonal matrices

  Σ = diag(σ_1, σ_2, ..., σ_n)   (the singular values)

• the σ_i^2 are the eigenvalues of A^T A


Solution
Solution of the equation Ax = b:

  x = V Σ^-1 U^T b = Σ_i (u_i^T b / σ_i) v_i

• the numerical behaviour of the SVD solution can be determined
  from the size of the singular values: small σ_i enter as 1/σ_i
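The solution step x = V Σ^-1 U^T b can be sketched with numpy (a minimal illustration on a hypothetical well-conditioned matrix, not the author's code):

```python
import numpy as np

# solve A x = b via the SVD: x = V diag(1/sigma) U^T b
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
b = np.array([9.0, 11.0])

U, s, Vt = np.linalg.svd(A)    # A = U @ diag(s) @ Vt
x = Vt.T @ ((U.T @ b) / s)     # divide each coefficient u_i^T b by sigma_i

print(x)                       # close to [1.6, 2.6], same as np.linalg.solve
```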


SVD after Golub and Reinsch

Householder matrices P, Q = 1 - 2ww^T (numerically OK)

      ( x x x x )                 ( * * * * )
      ( x x x x )                 ( 0 * * * )
  A = ( x x x x )  →  PA = A'  =  ( 0 * * * )
      ( x x x x )                 ( 0 * * * )

       ( * * * * )                 ( x * 0 0 )
       ( 0 * * * )                 ( 0 * * * )
  A' = ( 0 * * * )  →  A'Q = A'' = ( 0 * * * )
       ( 0 * * * )                 ( 0 * * * )

P, Q unitary
SVD after Golub and Reinsch

after a couple of iterations, J0 has bidiagonal form:

       ( q1 e2 0  0  )
       ( 0  q2 e3 *  )      with J0 = P_n ... P_1 A Q_1 ... Q_n-2,
  J0 = ( 0  0  q3 en )      the Q_i and P_i Householder matrices
       ( 0  0  0  qn )

→ A and J0 have the same singular values
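The bidiagonalisation step can be sketched in numpy (a minimal sketch of the Householder reduction, not the full Golub-Reinsch code; the function name is mine). Because only orthogonal reflections are applied, the singular values of the bidiagonal matrix equal those of A:

```python
import numpy as np

def householder_bidiag(A):
    """Reduce A (m >= n) to upper bidiagonal form with Householder
    reflections from the left and right (illustrative sketch)."""
    A = A.astype(float).copy()
    m, n = A.shape
    for k in range(n):
        # left reflection: zero A[k+1:, k] below the diagonal
        x = A[k:, k]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        if np.linalg.norm(v) > 0:
            v /= np.linalg.norm(v)
            A[k:, :] -= 2.0 * np.outer(v, v @ A[k:, :])
        # right reflection: zero A[k, k+2:] beyond the superdiagonal
        if k < n - 2:
            x = A[k, k + 1:]
            v = x.copy()
            v[0] += np.copysign(np.linalg.norm(x), x[0])
            if np.linalg.norm(v) > 0:
                v /= np.linalg.norm(v)
                A[:, k + 1:] -= 2.0 * np.outer(A[:, k + 1:] @ v, v)
    return A

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 4))
B = householder_bidiag(A)

# orthogonal transformations leave the singular values unchanged
print(np.allclose(np.linalg.svd(A, compute_uv=False),
                  np.linalg.svd(B, compute_uv=False)))
```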


SVD after Golub and Reinsch

• use an iterative procedure to transform the bidiagonal matrix J0
  to diagonal form
• apply Givens rotations to the bidiagonal matrix:

  J̄0 = S_n-1,n ... S_23 S_12 J0 T_12 T_23 ... T_n-1,n

• Givens rotation: a plane rotation ( c s ; -s c ) with c^2 + s^2 = 1
  that zeroes one chosen matrix element
• ~cubic convergence expected
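A single Givens rotation can be sketched as follows (a minimal illustration; the helper function is mine, not part of the algorithm's published code):

```python
import numpy as np

def givens(a, b):
    """Return (c, s) so that [[c, s], [-s, c]] @ [a, b] = [r, 0]."""
    r = np.hypot(a, b)
    return (1.0, 0.0) if r == 0 else (a / r, b / r)

c, s = givens(3.0, 4.0)          # c = 0.6, s = 0.8
G = np.array([[c, s], [-s, c]])
print(G @ np.array([3.0, 4.0]))  # rotates (3, 4) onto (5, 0)
```

Sweeping such rotations along the sub- and superdiagonal is what the S and T factors above do.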


Arithmetic Expenses

• Gauss (normal) solution: (1/2)mn^2 + (1/6)n^3
• SVD (Golub-Reinsch): 2mn^2 + 4n^3

→ for m = n: SVD is 9 times more expensive!
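The factor 9 follows directly from the two flop counts for m = n, since (2 + 4) / (1/2 + 1/6) = 9:

```python
def gauss_flops(m, n):
    # Gauss (normal) solution: (1/2) m n^2 + (1/6) n^3
    return 0.5 * m * n**2 + n**3 / 6.0

def svd_flops(m, n):
    # SVD after Golub-Reinsch: 2 m n^2 + 4 n^3
    return 2.0 * m * n**2 + 4.0 * n**3

n = 100
print(svd_flops(n, n) / gauss_flops(n, n))  # ratio = 9 for m = n
```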
How to deal with Singularities

• singularities are determined with the SVD
• 1/σ_i is used in the solution of the linear equation
• relate 1/σ_i to the machine accuracy and to the resolution τ
• using values σ_i < τ corrupts the complete result!
  → careful handling necessary!
• simple approach: neglect all values in the matrix with σ_i < τ
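The simple cutoff can be sketched as a truncated SVD solve (the function name and the test matrix are mine, for illustration only):

```python
import numpy as np

def svd_solve(A, b, tau):
    """Solve A x = b, neglecting all singular values below tau."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    inv_s = np.where(s >= tau, 1.0 / s, 0.0)  # drop the bad 1/sigma_i
    return Vt.T @ (inv_s * (U.T @ b))

# nearly singular matrix: second row almost equals the first
A = np.array([[1.0, 1.0],
              [1.0, 1.0 + 1e-12]])
b = np.array([2.0, 2.0])
print(svd_solve(A, b, tau=1e-8))  # stable answer near [1, 1]
```

Without the cutoff, the 1/σ of the near-zero singular value (~10^-12) would dominate and corrupt the result.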
Smooth cutoff

• regularisation of the singularities
• replace the singular values with a function:

  1/σ^2  →  σ^2 / (σ^2 + τ^2)^2

(figure: 1/σ^2 and the regularised replacement, shown for τ = 0.05)
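The replacement can be checked numerically (a sketch with τ = 0.05 as on the slide; the sample σ values are mine): for σ much larger than τ the regularised weight approaches 1/σ^2, while for σ → 0 it goes to zero instead of diverging.

```python
import numpy as np

tau = 0.05
sigma = np.array([1.0, 0.5, 0.05, 0.001])

hard = 1.0 / sigma**2                          # diverges for small sigma
smooth = sigma**2 / (sigma**2 + tau**2)**2     # regularised replacement

for s_, h, f in zip(sigma, hard, smooth):
    print(f"sigma={s_:7.3f}  1/s^2={h:12.1f}  smooth={f:10.1f}")
```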
Example

(figure: number-of-events response matrix)

• d_i ~ b_i/∆b_i
• d_i < 1 → statistically insignificant
• problem for alignment: determination of τ
Conclusion

• the numerical solution of a regular linear equation can be
  distorted by singular behaviour
• SVD returns the singular values
• singular values can be handled with a smooth cut-off
• the procedure is mathematically well described
