Weirdness of non-Hermitian matrices
Statistical Mechanics: Entropy, Order Parameters, and Complexity
James P. Sethna, Cornell University, Spring 2020
Lecture 23

Good morning! Welcome back to Statistical Mechanics! I hope you are all safe and happy and well, wherever you may be working.

This morning I'll start by discussing left and right eigenvectors. The transition matrix for 'Coin flips and Markov' was weird: it wasn't symmetric or Hermitian, and its left and right eigenvectors were not the same. Beware! Markov chain matrices demand a whole new set of linear algebra tools. The decomposition of a state rho into right eigenvectors, for example, will always have a coefficient of one for the steady state rho*. You get the coefficients of the decomposition by dotting rho with the left eigenvectors (there is a small numerical sketch of this bookkeeping at the end of these notes).

In many cases you are better off dealing instead with the singular values and singular vectors of the matrix A, which are given by the square roots of the eigenvalues of the symmetric matrix A transpose A. These generalize the eigenvalues of symmetric matrices (except that the singular values are the absolute values of the eigenvalues). The singular value decomposition also works for non-square matrices, and is often a more stable, accurate approach to solving matrix equations and taking matrix inverses. It is also the foundation of principal component analysis, now a standard method for analyzing data.

For Markov chains, eigenvalues are important. The evolution of a state rho after n steps is given by applying the Markov matrix n times, so each term in the decomposition of rho into eigenvectors rho_alpha gets multiplied by lambda_alpha^n. The eigenvalue lambda_1 whose magnitude is closest to one (apart from the stationary eigenvalue, which equals one) determines the eigenvector that decays away most slowly. You would usually be right to think that this slowest decay also gives the time needed to reach equilibrium. But this is not always true!

Consider the Markov chain describing a system of length N that moves x one site to the right at each step, until it reaches the cliff at the end. Clearly the time it takes to reach the stationary state at the end is N. But the matrix has N-1 zero elements on the diagonal and only zeros above the diagonal, so lambda_1 is zero, as are all the others, and there are only two eigenvectors: rho* and one (with eigenvalue zero) sitting on the state just before the cliff. So the slowest-decaying eigenvalue (which goes to zero in one step) is in this case not a good measure of the time to approach the stationary state (around N).

Left and right eigenvectors will also be important in our study of the renormalization group. Let us turn now to the in-class activity for today, where we shall study Detailed balance.
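
A small numerical sketch of the left/right eigenvector bookkeeping, assuming Python with numpy. The 2x2 column-stochastic matrix below is an invented example, not the 'Coin flips and Markov' matrix itself; the point is only that dotting rho with suitably normalized left eigenvectors gives the coefficients of the expansion in right eigenvectors, and that the coefficient of the steady state rho* comes out to be one.

    import numpy as np

    # Invented 2x2 column-stochastic matrix: M[j, i] = probability of hopping i -> j,
    # so each column sums to one.
    M = np.array([[0.9, 0.3],
                  [0.1, 0.7]])

    lam, right = np.linalg.eig(M)       # right eigenvectors: M r = lam r
    lamL, left = np.linalg.eig(M.T)     # eigenvectors of M^T are left eigenvectors of M

    # Sort both sets by decreasing eigenvalue so they pair up.
    idx = np.argsort(-lam); lam, right = lam[idx], right[:, idx]
    left = left[:, np.argsort(-lamL)]

    # Normalize the steady state as a probability distribution (components sum to one),
    right[:, 0] /= right[:, 0].sum()
    # and normalize each left eigenvector so that left_a . right_a = 1.
    for a in range(len(lam)):
        left[:, a] /= left[:, a] @ right[:, a]

    rho = np.array([0.5, 0.5])          # any starting probability distribution
    coeffs = left.T @ rho               # expansion coefficients of rho in the right eigenvectors
    print(lam)                          # approximately [1.0, 0.6]
    print(coeffs)                       # the coefficient of the steady state (lam = 1) is one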
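
The statements about singular values are also easy to check numerically. Here is a minimal sketch, again assuming numpy, using arbitrary random matrices (not matrices from the course): the singular values of A are the square roots of the eigenvalues of A transpose A, and for a symmetric matrix they are the absolute values of its eigenvalues.

    import numpy as np
    rng = np.random.default_rng(0)

    A = rng.standard_normal((5, 3))                  # SVD works fine for non-square matrices
    sigma = np.linalg.svd(A, compute_uv=False)       # singular values, largest first
    from_ATA = np.sqrt(np.sort(np.linalg.eigvalsh(A.T @ A))[::-1])
    print(np.allclose(sigma, from_ATA))              # True: sigma_i = sqrt(eigenvalue_i of A^T A)

    S = rng.standard_normal((4, 4)); S = S + S.T     # a symmetric matrix
    print(np.allclose(np.sort(np.linalg.svd(S, compute_uv=False)),
                      np.sort(np.abs(np.linalg.eigvalsh(S)))))  # True: sigma_i = |lambda_i|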
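
Finally, a minimal sketch of the 'cliff' example, again assuming numpy. The N-state matrix moves all the probability one site to the right each step and is absorbing at the last site; its eigenvalues are one 1 and N-1 zeros, yet a walker starting at the far left needs N-1 steps to reach the stationary state.

    import numpy as np

    N = 10
    M = np.zeros((N, N))
    M[np.arange(1, N), np.arange(N - 1)] = 1.0   # site i hops to site i+1 with probability one
    M[N - 1, N - 1] = 1.0                        # the cliff: the last site is absorbing

    print(np.linalg.eigvals(M))                  # one eigenvalue 1, the other N-1 are zero

    rho = np.zeros(N); rho[0] = 1.0              # start at the far left
    for step in range(N - 1):
        rho = M @ rho                            # not stationary until the (N-1)th step
    print(rho)                                   # all the weight has now reached the cliff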