Linear-Algebra-MIT-Gilbert-Strang-(L24-L26)

Ting Qiao
5 min read · Oct 25, 2020

Lecture 24: Markov Matrices and Fourier Series.

  1. Markov matrices:

All entries ≥ 0

All columns add to 1.

λ = 1 is always one of the eigenvalues, and all eigenvalues satisfy |λ| ≤ 1; that is what guarantees a steady state.

2. To find the steady state of a Markov matrix, we take powers of the matrix. Similar to what happened before: we need the eigenvalues and eigenvectors.

In the last lecture, for differential equations, the steady-state criterion was an eigenvalue λ = 0 (so that e^(λt) stays constant). This time we take powers of the matrix, so the steady state is tied to λ = 1, with all other eigenvalues |λ| < 1 so that their powers die out.

3. Why is 1 an eigenvalue? Why is A − I singular? Each column of A adds to 1, so each column of A − I adds to 0. The rows of A − I then add up to the zero row, a dependency, so A − I is singular and λ = 1 is an eigenvalue.

4. A^T and A are the left and right versions of the same problem. They share the same eigenvalues, since det(A − λI) = det(A^T − λI), but in general have different eigenvectors.

5. Therefore, to find the eigenvectors we compute the null space of A − λI.

6. Application: it is a state-transition matrix, describing transition probabilities: entry (i, j) is the probability of moving from state j to state i.

7. As the transitions go on, u_(k+1) = A u_k, so u_k = A^k u0. The matrix is powered up.

To find the steady state, decompose the initial state u0 in the eigenvectors: u0 = c1 x1 + c2 x2 + … Then A^k u0 = c1 λ1^k x1 + c2 λ2^k x2 + …, and as k grows every term with |λ| < 1 dies out, leaving only the λ = 1 eigenvector. See the sketch below.
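A minimal numpy sketch of this (the 2-state transition matrix is made up for illustration):

```python
import numpy as np

# A made-up 2-state Markov matrix: entries >= 0, columns sum to 1.
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

lam, X = np.linalg.eig(A)
print(lam)                                # [1.0, 0.7]: one eigenvalue is 1

# Steady state = the eigenvector for lambda = 1, scaled to sum to 1.
k = np.argmin(np.abs(lam - 1))
steady = X[:, k] / X[:, k].sum()
print(steady)                             # [2/3, 1/3]

# Powering up: u_k = A^k u0 approaches the steady state.
u = np.array([1.0, 0.0])
for _ in range(50):
    u = A @ u
print(u)                                  # ~[0.667, 0.333]
```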

8. Some other fields (ECE, for example) like a row-based convention: all rows add to 1 instead of all columns. The state is then a row vector and multiplies from the left, u^T A, instead of from the right, A u.

9. Orthonormal basis: if v = x1 q1 + x2 q2 + … + xn qn, take a transpose of q1 and multiply: q1^T v = x1, because q1^T qi = 0 for i ≠ 1 and q1^T q1 = 1. A quick check follows.
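A quick numpy check of this coefficient trick (the basis and vector here are random, just for illustration):

```python
import numpy as np

# Random orthonormal basis via QR; v is an arbitrary vector.
Q, _ = np.linalg.qr(np.random.rand(3, 3))
v = np.random.rand(3)

# Each coefficient is x_i = q_i^T v, i.e. x = Q^T v, and then v = Q x.
x = Q.T @ v
print(np.allclose(Q @ x, v))   # True: v is rebuilt from its coefficients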

10. Fourier series.

The "vectors" are functions, with (1, cos(x), sin(x), cos(2x), sin(2x), …) as the basis: f(x) = a0 + a1 cos(x) + b1 sin(x) + a2 cos(2x) + b2 sin(2x) + …

And most important, those basis functions are orthogonal: the inner product, now an integral over one period, (f, g) = ∫ f(x)g(x) dx from 0 to 2π, is 0 for any two different basis functions.

11. Just like what we did in vector space, to get a1 we "take a transpose of cos(x)": take the inner product of both sides with cos(x).

Integrating from 0 to 2π: ∫ f(x)cos(x) dx = a1 ∫ cos(x)cos(x) dx + b1 ∫ sin(x)cos(x) dx + … = a1·π, because every term except the cos(x)cos(x) one integrates to 0. So a1 = (1/π) ∫ f(x)cos(x) dx.
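A numerical sanity check (the test function f and its coefficients are made up):

```python
import numpy as np

# Test function with known coefficients: a1 = 3, b2 = 2.
f = lambda x: 3 * np.cos(x) + 2 * np.sin(2 * x)

# Riemann sum over one period [0, 2*pi).
x = np.linspace(0, 2 * np.pi, 100000, endpoint=False)
dx = x[1] - x[0]

# Inner product with cos(x) kills every other term, leaving a1 * pi.
a1 = np.sum(f(x) * np.cos(x)) * dx / np.pi
b2 = np.sum(f(x) * np.sin(2 * x)) * dx / np.pi
print(a1, b2)   # approximately 3.0 and 2.0
```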

Lecture 25. Symmetric Matrices.

  1. For symmetric matrices (A = A^T): the eigenvalues are real, and the eigenvectors are (or can be chosen) perpendicular.

2. For symmetric matrices, the decomposition is different: instead of A = SΛS⁻¹, we get A = QΛQ^T with Q orthogonal. Note (QΛQ^T)^T = QΛQ^T, so the factorization is visibly symmetric.

In Mathematics: the Spectral Theorem.

In Mechanics: the Principal Axis Theorem.

3. Why are the eigenvalues real? Take Ax = λx, conjugate-transpose it, and use A = bar(A)^T to get bar(x)^T A = bar(λ) bar(x)^T. Multiplying the first equation by bar(x)^T and the second by x gives λ bar(x)^T x = bar(x)^T A x = bar(λ) bar(x)^T x. A complex A is okay as well, as long as it is "symmetric" in the sense A = bar(A)^T.

4. And bar(x)^T x = |x1|² + … + |xn|² is a real positive number (for x ≠ 0), so it can be cancelled out, leaving λ = bar(λ): the eigenvalue is real.

5. Every symmetric matrix is a combination of perpendicular projection matrices: A = QΛQ^T = λ1 q1 q1^T + λ2 q2 q2^T + …, and each qi qi^T is the projection matrix onto the line through qi. A quick check follows.
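A quick check with an arbitrary 2×2 symmetric matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh: real eigenvalues, orthonormal eigenvectors for symmetric input.
lam, Q = np.linalg.eigh(A)

# A = lambda_1 q1 q1^T + lambda_2 q2 q2^T: a sum of projection matrices.
recon = sum(lam[i] * np.outer(Q[:, i], Q[:, i]) for i in range(2))
print(np.allclose(A, recon))   # True
```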

6. Computing eigenvalues from det(A − λI) = 0 becomes impractical for large matrices. However, the pivots are cheap to compute, and there is a rule: for a symmetric matrix, the number of positive (negative) pivots equals the number of positive (negative) eigenvalues.

7. The product of eigenvalues = determinant = the product of pivots.
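A small sketch of both facts (items 6 and 7), with a made-up indefinite matrix; the elimination here does no row exchanges, which works for this example:

```python
import numpy as np

def pivots(A):
    # Plain elimination, no row exchanges (assumes no zero pivot appears).
    U = A.astype(float).copy()
    n = len(U)
    for j in range(n - 1):
        for i in range(j + 1, n):
            U[i] -= (U[i, j] / U[j, j]) * U[j]
    return np.diag(U)

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])

p = pivots(A)                      # [1, -8]: one positive, one negative
lam = np.linalg.eigvalsh(A)        # [-2, 4]: one negative, one positive
print(np.sign(p), np.sign(lam))    # same sign counts (item 6)
print(np.prod(p), np.linalg.det(A), np.prod(lam))   # all -8 (item 7)
```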

8. Positive definite matrix: a symmetric matrix with all eigenvalues positive (equivalently, all pivots positive).
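For instance (matrix made up; both tests should agree):

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
print(np.linalg.eigvalsh(A))   # [1., 3.]: all eigenvalues positive
print(np.linalg.cholesky(A))   # succeeds exactly when A is positive definite
```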

Lecture 26. Complex Matrices, Fast Fourier Transform. (Not confident about this lecture.)

  1. Fast Fourier Transform: from n² down to n·log(n) operations.
  2. Start from length: for a complex vector z, the squared length is bar(z)^T z = |z1|² + … + |zn|², not z^T z (which need not be real or positive).

3. Hermitian matrices: complex matrices with A = bar(A)^T, written A^H = A.

4. Similar to the symmetric matrix, the Hermitian matrix has real eigenvalues and perpendicular eigenvectors.

5. The analogue of an orthogonal matrix in complex space is called a unitary matrix: square, with orthonormal columns, so bar(U)^T U = I. A small check of 4 and 5 follows.
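A small numpy check on a made-up Hermitian matrix:

```python
import numpy as np

# Hermitian: A equals its conjugate transpose.
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
print(np.allclose(A, A.conj().T))              # True

lam, U = np.linalg.eigh(A)                     # eigh accepts Hermitian input
print(lam)                                     # [1., 4.]: real eigenvalues

# The eigenvector matrix is unitary: bar(U)^T U = I.
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True
```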

6. Move on to the Fourier matrix: a full matrix, no zeros. Entry (j, k) of Fn is w^(jk), for j, k = 0, …, n−1, where w = e^(i2π/n) is an n-th root of 1.

7. Why is it remarkable? F4 is the four-point Fourier transform (four frequencies: the powers w^0, w^1, w^2, w^3). To take the four-point transform of a vector we multiply by F4; to invert, we multiply by F4⁻¹, which is cheap because the columns of Fn are orthogonal, making Fn/√n unitary. A small check follows.
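A sketch of F4, built with numpy's sign convention w = e^(−i2π/n); the lecture writes w = e^(+i2π/n), which only flips the sign in the exponent:

```python
import numpy as np

def fourier_matrix(n):
    # (F_n)_jk = w^(j*k) with w = exp(-2*pi*i/n), numpy's FFT convention.
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(-2j * np.pi / n) ** (j * k)

F4 = fourier_matrix(4)
x = np.array([1.0, 2.0, 3.0, 4.0])

# Multiplying by F4 is exactly the 4-point transform.
print(np.allclose(F4 @ x, np.fft.fft(x)))            # True

# Orthogonal columns: bar(F4)^T F4 = 4 I, so F4 / sqrt(4) is unitary.
print(np.allclose(F4.conj().T @ F4, 4 * np.eye(4)))  # True
```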

8. Fast Fourier Transform.

The idea for reducing the computation is to find the relationship between F64 and F32, the half-size matrix (the same way F6 relates to F3). F64 can be computed from two copies of F32 plus a little extra work, so the cost of a size-64 transform reduces to the cost of size-32 plus that extra.

9. Find the connection!

F64 = [ I D ; I −D ] [ F32 0 ; 0 F32 ] P,

where P is the permutation that puts the even-indexed components first and D = diag(1, w, w², …, w^31) holds the "twiddle factors". The same pattern connects any F2n to Fn.

10. Recursion takes 32 down to 16, then 16 to 8, 8 to 4, 4 to 2, 2 to 1. Each level costs about n/2 multiplications (the diagonal D), and there are log₂(n) levels; that is where (1/2)·n·log(n) comes from. A recursive sketch follows.
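A recursive sketch of the whole idea (assumes the length is a power of 2, and numpy's sign convention again):

```python
import numpy as np

def fft_recursive(x):
    # Radix-2 FFT: split into even/odd, transform halves, glue with D.
    n = len(x)
    if n == 1:
        return np.asarray(x, dtype=complex)
    even = fft_recursive(x[0::2])   # F_{n/2} on even-indexed entries
    odd = fft_recursive(x[1::2])    # F_{n/2} on odd-indexed entries
    d = np.exp(-2j * np.pi * np.arange(n // 2) / n) * odd   # D times odd
    # The [I D; I -D] step combines the two half-size transforms.
    return np.concatenate([even + d, even - d])

x = np.random.rand(32)
print(np.allclose(fft_recursive(x), np.fft.fft(x)))   # True
```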
