Linear-Algebra-MIT-Gilbert-Strang-(L27-L29)

Ting Qiao
6 min read · Dec 16, 2020

Lecture 27. Positive Definite Matrices and Their Graphical Analysis

1. How to check whether a matrix is positive definite. There are four equivalent tests (a small numerical sketch follows this list):

All leading determinants (the determinants of the upper-left 1x1, 2x2, ... submatrices) > 0

All eigenvalues > 0

All pivots > 0. For a 2x2 matrix [[a, b], [b, c]] the pivots are a and (ac - b^2)/a; multiplied together, the pivots give the determinant.

x^T A x > 0 for every nonzero vector x

The matrix is only positive semidefinite when x^T A x >= 0 but can equal 0 for some nonzero x.
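As a quick numerical companion to these tests, here is a minimal NumPy sketch. The example matrix is only illustrative, not taken from the lecture slides.

```python
import numpy as np

# Illustrative symmetric matrix (assumed for the example).
A = np.array([[2.0, 6.0],
              [6.0, 20.0]])

# Test 1: leading determinants (upper-left 1x1, 2x2, ... submatrices) all > 0
leading_dets = [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
print("leading determinants:", leading_dets)

# Test 2: all eigenvalues > 0 (eigvalsh is for symmetric matrices)
print("eigenvalues:", np.linalg.eigvalsh(A))

# Test 3: all pivots > 0 -- a Cholesky factorization A = L L^T succeeds
# exactly when A is symmetric positive definite
try:
    np.linalg.cholesky(A)
    print("Cholesky succeeded: A is positive definite")
except np.linalg.LinAlgError:
    print("Cholesky failed: A is not positive definite")

# Test 4: x^T A x > 0 for a random nonzero x (a spot check, not a proof)
x = np.random.randn(2)
print("x^T A x =", x @ A @ x)
```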

2. A 2D example first.

Find the pivots and the determinant first.

Then validate x^T A x > 0.

Writing out x^T A x gives a quadratic form (a worked example with assumed numbers follows):
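The worked numbers from the lecture are not reproduced in these notes; as an illustrative (assumed) example, take:

```latex
A = \begin{pmatrix} 2 & 6 \\ 6 & 20 \end{pmatrix}, \qquad
\det A = 2\cdot 20 - 6\cdot 6 = 4, \qquad
\text{pivots: } 2 \text{ and } \tfrac{4}{2} = 2,

x^{T}Ax =
\begin{pmatrix} x & y \end{pmatrix}
\begin{pmatrix} 2 & 6 \\ 6 & 20 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
= 2x^{2} + 12xy + 20y^{2}.
```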

3. Now let's study what this quadratic looks like. (The surface plot from the lecture is not reproduced here.) Notice that it is not positive definite: the surface rises in some directions and falls in others, so it goes on both sides of zero.

4. A positive definite quadratic: the surface curves upward in every direction, like a bowl, and touches zero only at x = 0.

5. In order to decide the properties of these quadratic functions, the first and second derivatives are needed.

To decide whether the surface dips below zero, we look for the lowest point, where the first derivatives are zero. To decide which way the surface curves there, the second derivatives are needed.

6. Another way to decide the shape of the surface is to complete the square: rewrite the quadratic as a sum of squares. The pivots are exactly the coefficients of those squares (see the example below).
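Continuing the assumed example from above:

```latex
2x^{2} + 12xy + 20y^{2} = 2\,(x + 3y)^{2} + 2\,y^{2}.
```

The outside coefficients 2 and 2 are the pivots, and the 3 inside the square is the elimination multiplier 6/2. Both pivots are positive, so the expression is a positive sum of squares and the matrix is positive definite.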

7. The matrix of second-order derivatives (the Hessian) is symmetric, and for a minimum it also needs to be positive definite (stated below for two variables).
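In two variables this is the familiar second-derivative test; a compact statement (standard calculus, added here for reference):

```latex
H = \begin{pmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{pmatrix}, \qquad
\text{a stationary point is a minimum when }
f_{xx} > 0 \ \text{and}\ f_{xx}f_{yy} - f_{xy}^{2} > 0 .
```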

8. A 3D example (a 3x3 matrix and its quadratic form in three variables).

9. For a positive definite matrix, the level set x^T A x = 1 is an ellipse/ellipsoid shaped like a rugby ball. Its axes point along the eigenvectors of A, and the half-length of each axis is 1/sqrt(eigenvalue) (checked numerically below).
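A small NumPy sketch of the axis lengths, using the same assumed example matrix as above:

```python
import numpy as np

# Axes of the ellipse x^T A x = 1 for a positive definite A (illustrative matrix).
A = np.array([[2.0, 6.0],
              [6.0, 20.0]])

# A = Q diag(lam) Q^T ; eigh returns eigenvalues in ascending order
lam, Q = np.linalg.eigh(A)
half_lengths = 1.0 / np.sqrt(lam)          # half-length of each principal axis
print("eigenvalues:", lam)
print("axis directions (columns):\n", Q)
print("half-axis lengths:", half_lengths)

# Check: the point x = q_i / sqrt(lam_i) lies on the ellipse x^T A x = 1
for i in range(len(lam)):
    x = Q[:, i] / np.sqrt(lam[i])
    print("x^T A x =", x @ A @ x)           # prints 1.0 for each axis
```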

Lecture 28. Similar Matrices

1. Positive definite matrices come up in the least-squares problem, in the combination A^T * A.
2. The inverse of a positive definite matrix is positive definite (its eigenvalues are 1/eigenvalue > 0).

3. A^T * A is validated by x^T * A^T A * x = (Ax)^T (Ax) = ||Ax||^2 >= 0.

4. If rank = n, there is no null space, so Ax is nonzero for every nonzero x and A^T * A is guaranteed to be positive definite. Row exchanges are never needed when eliminating a positive definite matrix (a quick numerical check follows).
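A minimal NumPy sketch of this point; the rectangular matrix below is an arbitrary full-rank example, not one from the lecture:

```python
import numpy as np

# A has full column rank (rank 2), so A^T A is positive definite.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 7.0]])

AtA = A.T @ A
print("eigenvalues of A^T A:", np.linalg.eigvalsh(AtA))   # all > 0

# x^T (A^T A) x = ||Ax||^2, which is positive for every nonzero x
x = np.random.randn(2)
print(x @ AtA @ x, "==", np.linalg.norm(A @ x) ** 2)
```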

5. Similar matrices from now on. Two square matrices A and B are similar when B = M^-1 * A * M for some invertible matrix M.

6. All matrices in the same family are similar to each other. The diagonal matrix Lambda is the best, and most useful, member of the family.

7. Similar matrices have the same eigenvalues, but not the same eigenvectors.

8. To see why, start from A x = lambda x and B = M^-1 * A * M. Then B (M^-1 x) = M^-1 A x = lambda (M^-1 x), so B has the same eigenvalue, and its eigenvector is M^-1 times the eigenvector of A.

Diagonalization is the special case where M is the eigenvector matrix S: the diagonal matrix Lambda's eigenvectors are the standard basis vectors, columns made of zeros and a single one. A small numerical check follows.
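A minimal NumPy sketch of these two facts; A and M below are arbitrary illustrative choices:

```python
import numpy as np

# Similar matrices B = M^-1 A M share eigenvalues; eigenvectors of B
# are M^-1 times the eigenvectors of A.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
M = np.array([[1.0, 4.0],
              [0.0, 1.0]])

B = np.linalg.inv(M) @ A @ M

lamA, SA = np.linalg.eig(A)
lamB, SB = np.linalg.eig(B)
print("eigenvalues of A:", np.sort(lamA))
print("eigenvalues of B:", np.sort(lamB))        # the same numbers

# M^-1 x is an eigenvector of B for each eigenvector x of A
for i, lam in enumerate(lamA):
    v = np.linalg.inv(M) @ SA[:, i]
    print(np.allclose(B @ v, lam * v))           # True
```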

9. Two eigenvalues could be the same. In that case the matrix may not be diagonalizable.

The two matrices shown in the lecture are similar and in the same family; however, the lower one is not diagonalizable.

The Jordan form picks out the most nearly diagonal member of each family (some families contain no diagonalizable matrix at all).

10. All of those matrices are in the same family, with the same trace and determinant, but none of them is diagonalizable.

11. Jordan's theorem: every square matrix A is similar to a Jordan matrix J, which is block diagonal with Jordan blocks (each block has one eigenvalue repeated on its diagonal and 1's just above the diagonal). The number of blocks equals the number of independent eigenvectors of A. A computational check is sketched below.
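A quick computational companion (this assumes SymPy is available; the matrix is an illustrative non-diagonalizable example, not necessarily the one from the lecture):

```python
from sympy import Matrix

# A has the repeated eigenvalue 4 and only one independent eigenvector,
# so it is not diagonalizable; its Jordan form is a single 2x2 block.
A = Matrix([[4, 1],
            [0, 4]])

P, J = A.jordan_form()        # A = P * J * P**-1
print(J)                      # Matrix([[4, 1], [0, 4]])
print(P * J * P.inv())        # reproduces A
```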

Lecture 29. Singular Value Decomposition.

1. The nonzero eigenvalues of A^T * A and A * A^T are the same.
2. This decomposition is very important. For a symmetric positive definite matrix the eigenvector matrix is orthogonal, so the matrix factors into a symmetric positive form; for a general matrix, however, the eigenvector matrix is not orthogonal, and the SVD fixes this by allowing two different orthogonal matrices.

Symmetric positive definite case: A = Q * Lambda * Q^T

SVD of any matrix: A = U * Sigma * V^T = orthogonal * diagonal * orthogonal

3. We are looking for an orthonormal basis of the row space that, after being transformed by A, is still orthogonal in the column space: A v_i = sigma_i u_i. In my opinion, the u's and v's are really the same kind of thing: they are all unit vectors, just in different spaces (column space and row space). The decomposition represents the matrix in these bases.

4. Express it in matrix form: AV = U * Sigma, so A = U * Sigma * V^T (V is orthogonal, so V^-1 = V^T). In the end we can solve for V and Sigma by forming A^T * A, which is symmetric positive (semi)definite: the v's are its eigenvectors and the sigma^2 are its eigenvalues.

Note: how do we find the u's? One way is to form A * A^T in the same fashion. The formulas are collected below.
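Collected as formulas (standard SVD identities, consistent with the notes above):

```latex
AV = U\Sigma \;\Longrightarrow\; A = U\Sigma V^{T},\qquad
A^{T}A = V\,\Sigma^{T}\Sigma\,V^{T},\qquad
AA^{T} = U\,\Sigma\Sigma^{T}\,U^{T}.
```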

5. A 2D example. Compute the eigenvalues of A^T * A first, and then compute the rest.

6. Find the u's by multiplying in the opposite order, A * A^T. The product Sigma^T * Sigma is already known, since the sigma^2 are the same eigenvalues. A numerical sketch follows.
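A minimal NumPy sketch of this procedure; I believe the lecture uses a matrix like the one below, but treat the numbers as an assumption:

```python
import numpy as np

# Example matrix (assumed); invertible, so it has two positive singular values.
A = np.array([[4.0, 4.0],
              [-3.0, 3.0]])

# v's and sigma^2 from A^T A
lam_v, V = np.linalg.eigh(A.T @ A)        # ascending eigenvalues
# u's and the same sigma^2 from A A^T
lam_u, U = np.linalg.eigh(A @ A.T)

print("sigma^2 from A^T A:", lam_v)       # 18 and 32 for this A
print("sigma^2 from A A^T:", lam_u)       # the same numbers

# Compare with the library SVD (returned in descending order)
U2, s, Vt = np.linalg.svd(A)
print("singular values:", s, "=", np.sqrt(lam_v[::-1]))
```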

7. Another example, one that is not so special. For this one it is easy to pick a basis in the column space and in the row space directly.

8. Do the same procedure: form A^T * A and A * A^T to find V and U.

9. We are choosing the right bases for the four fundamental subspaces of linear algebra: v_1, ..., v_r for the row space, v_(r+1), ..., v_n for the null space, u_1, ..., u_r for the column space, and u_(r+1), ..., u_m for the left null space. We need the eigenvalue computation because, with respect to these bases, the matrix becomes diagonal: A v_i = sigma_i u_i.
