A real square matrix A = [a_{ij}] is called symmetric if transposition leaves it unchanged,
A^{T} = A, thus a_{ij} = a_{ji}.
The eigenvalues of a symmetric matrix are always real.
Example:
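The original example matrix did not survive extraction; the following sketch uses an illustrative symmetric matrix (an assumption, not the document's own example) to check the claim numerically with NumPy:

```python
import numpy as np

# Illustrative symmetric matrix: a_ij = a_ji across the main diagonal.
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])

assert np.allclose(A, A.T)              # A^T = A

eigenvalues = np.linalg.eigvals(A)
# Eigenvalues of a real symmetric matrix are always real:
assert np.allclose(np.imag(eigenvalues), 0.0)
```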
A real square matrix A = [a_{ij}] is called skew-symmetric if transposition gives the negative of A,
A^{T} = −A, thus a_{ij} = −a_{ji}.
Every skew-symmetric matrix has all main diagonal entries zero.
The eigenvalues of a skew-symmetric matrix are purely imaginary or zero.
Example :
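The document's own example was lost in extraction; as a hedged sketch, the following illustrative skew-symmetric matrix (an assumed choice) exhibits both properties:

```python
import numpy as np

# Illustrative skew-symmetric matrix: a_ij = -a_ji, so the diagonal is zero.
S = np.array([[ 0.0,  2.0, -1.0],
              [-2.0,  0.0,  3.0],
              [ 1.0, -3.0,  0.0]])

assert np.allclose(S, -S.T)              # S^T = -S
assert np.allclose(np.diag(S), 0.0)      # main diagonal entries are zero

lam = np.linalg.eigvals(S)
assert np.allclose(lam.real, 0.0)        # purely imaginary or zero
```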
NOTE:
Any real square matrix A may be written as the sum of a symmetric matrix R and a skew-symmetric matrix S, where
R = (1/2)(A + A^{T}) and S = (1/2)(A − A^{T}).
Example:
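A quick numerical check of this decomposition (the matrix A below is an illustrative choice, since the original example did not survive):

```python
import numpy as np

# Any real square matrix splits as A = R + S, with R symmetric
# and S skew-symmetric.
A = np.array([[1.0, 7.0],
              [3.0, 4.0]])

R = (A + A.T) / 2    # symmetric part
S = (A - A.T) / 2    # skew-symmetric part

assert np.allclose(R, R.T)      # R is symmetric
assert np.allclose(S, -S.T)     # S is skew-symmetric
assert np.allclose(A, R + S)    # they reassemble A
```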
A real square matrix A = [a_{ij}] is called orthogonal if transposition gives the inverse of A,
A^{T} = A^{-1}.
NOTE:
(i) A real square matrix is orthogonal if and only if its column vectors (and also its row vectors) form an orthonormal system; thus A^{T}A = I.
(ii) The determinant of an orthogonal matrix has the value +1 or −1.
(iii) The eigenvalues of an orthogonal matrix A are real or occur in complex conjugate pairs, and all have absolute value 1.
Example:
Its characteristic equation is
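The three properties above can be checked numerically. A plane rotation matrix is used here as a standard illustrative orthogonal matrix (an assumption; the document's own example was not recoverable):

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal: Q^T = Q^{-1}.
t = np.pi / 6
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

assert np.allclose(Q.T @ Q, np.eye(2))        # columns are orthonormal
assert np.isclose(abs(np.linalg.det(Q)), 1.0) # det = +1 or -1

lam = np.linalg.eigvals(Q)
assert np.allclose(np.abs(lam), 1.0)          # |lambda| = 1
```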
The complex conjugate Ā of a matrix A is formed by taking the complex conjugate of each element. For the conjugate transpose, we use the notation Ā^{T}.
Example:
A square matrix A = [a_{ij}] is called Hermitian if Ā^{T} = A, that is, a_{ji} = ā_{ij}.
If A is Hermitian, the entries on the main diagonal must satisfy ā_{jj} = a_{jj}; that is, they are real.
If a Hermitian matrix is real, then Ā^{T} = A^{T} = A. Hence a real Hermitian matrix is a symmetric matrix.
The eigenvalues of a Hermitian matrix (and thus a symmetric matrix) are real.
Example: The eigenvalues are 9, 2.
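The example matrix itself was lost in extraction; the Hermitian matrix below is an assumed reconstruction chosen so that its eigenvalues are indeed 9 and 2, which lets us verify the realness claim numerically:

```python
import numpy as np

# A Hermitian matrix (assumed example) with eigenvalues 9 and 2.
A = np.array([[4.0,     1 - 3j],
              [1 + 3j,  7.0   ]])

assert np.allclose(A, A.conj().T)       # Hermitian: conj(A)^T = A

lam = np.linalg.eigvals(A)
assert np.allclose(lam.imag, 0.0)       # eigenvalues are real
assert np.allclose(np.sort(lam.real), [2.0, 9.0])
```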
Skew Hermitian Matrix
A square matrix A = [a_{ij}] is called skew-Hermitian if Ā^{T} = −A, that is, a_{ji} = −ā_{ij}.
Example:
The eigenvalues are 4i, −2i.
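Since the example matrix was elided, the skew-Hermitian matrix below is an assumed reconstruction consistent with the stated eigenvalues 4i and −2i:

```python
import numpy as np

# A skew-Hermitian matrix (assumed example) with eigenvalues 4i and -2i.
A = np.array([[ 3j,      2 + 1j],
              [-2 + 1j, -1j    ]])

assert np.allclose(A, -A.conj().T)      # skew-Hermitian: conj(A)^T = -A

lam = np.linalg.eigvals(A)
assert np.allclose(lam.real, 0.0)       # purely imaginary or zero
assert np.allclose(np.sort(lam.imag), [-2.0, 4.0])
```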
A square matrix A = [a_{ij}] is called unitary if Ā^{T} = A^{-1}.
Example:
The eigenvalues are
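The original unitary example (and its eigenvalues) did not survive extraction; as an illustrative sketch, the assumed unitary matrix below demonstrates that the eigenvalues of a unitary matrix have absolute value 1:

```python
import numpy as np

# An illustrative unitary matrix: conj(A)^T = A^{-1}.
A = np.array([[0.8j, 0.6 ],
              [0.6,  0.8j]])

assert np.allclose(A.conj().T @ A, np.eye(2))   # unitary

lam = np.linalg.eigvals(A)
assert np.allclose(np.abs(lam), 1.0)            # |lambda| = 1
```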
Eigenvectors of an n x n matrix A may (or may not) form a basis. If they do, we can use them for “diagonalizing” A, that is, for transforming it into diagonal form with the eigenvalues on the main diagonal.
An n x n matrix Â is called similar to an n x n matrix A if
Â = P^{-1}AP
for some (nonsingular) n x n matrix P. This transformation, which gives Â from A, is called a similarity transformation.
If an n x n matrix A has n distinct eigenvalues, then A has a basis of eigenvectors.
If an n x n matrix A has a basis of eigenvectors, then
D = X^{-1}AX
is diagonal, with the eigenvalues of A as the entries on the main diagonal. Here X is the matrix with these eigenvectors as column vectors. Also
D^{m} = X^{-1}A^{m}X (m = 2, 3, ...).
A square matrix which is not diagonalizable is called defective.
Example: A = has eigenvalues 6, 1. The corresponding eigenvectors are
Thus
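The diagonalization steps above can be carried out numerically. The matrix here is an illustrative choice with the same eigenvalues 6 and 1 as the example (the document's own matrix was elided):

```python
import numpy as np

# Illustrative matrix with distinct eigenvalues 6 and 1.
A = np.array([[5.0, 2.0],
              [2.0, 2.0]])

lam, X = np.linalg.eig(A)        # columns of X are eigenvectors of A
D = np.linalg.inv(X) @ A @ X     # D = X^{-1} A X

assert np.allclose(D, np.diag(lam))   # D is diagonal, eigenvalues on diagonal

# D^m = X^{-1} A^m X, checked here for m = 3:
assert np.allclose(np.linalg.inv(X) @ np.linalg.matrix_power(A, 3) @ X,
                   np.linalg.matrix_power(D, 3))
```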
Example: A = has eigenvalues λ_{1} = 3, λ_{2} = 2, λ_{3} = 1.
The eigenvectors of A corresponding to the eigenvalues λ_{1} = 3, λ_{2} = 2, λ_{3} = 1, respectively, are
Now, let X be the matrix with these eigenvectors as its columns:
Note that there is no preferred order of the eigenvectors in X; changing the order of the eigenvectors in X just changes the order of the eigenvalues in the diagonalized form of A.
Thus, D = X^{-1}AX =
Note that the eigenvalues λ_{1} = 3, λ_{2} = 2, λ_{3} = 1 appear in the diagonal matrix.
Functions of Matrices
If the matrix A is diagonalizable, then we can find a matrix X and a diagonal matrix D such that
D = X^{-1}AX ⇒ A = XDX^{-1} and A^{n} = XD^{n}X^{-1}
Applying the power-series definition to this decomposition, we find that f(A) is defined by
f(A) = I + αA + βA^{2} + γA^{3} + ....... (α, β, γ are the coefficients of the Taylor expansion)
⇒ f(A) = XIX^{-1} + αXDX^{-1} + βXD^{2}X^{-1} + γXD^{3}X^{-1} + ....... ∵ A^{n} = XD^{n}X^{-1}
⇒ f(A) = X[I + αD + βD^{2} + γD^{3} + .......]X^{-1} = X f(D) X^{-1},
where f(D) = diag(f(d_{1}), f(d_{2}), ..., f(d_{n})) and d_{1}, d_{2}, ..., d_{n} denote the diagonal entries of D.
Note: If A is itself diagonal then f (A) = f (D)
Example 10: If matrix
For matrix A, the eigenvalues are λ_{1}, λ_{2} and the corresponding eigenvectors are, respectively,
Thus
Example: If matrix
then
Example 11: If matrix A = , then find e^{A}.
For eigenvalues, det(A − λI) = 0 ⇒
Eigenvectors can be determined from the equation AX = λX.
For λ_{1} = 1
The normalized eigenvector can be determined from the relation X_{1}^{T}X_{1} = 1.
The normalized eigenvector corresponding to λ_{1} = 1 is
For λ_{2} = −1
The normalized eigenvector can be determined from the relation X_{2}^{T}X_{2} = 1.
The normalized eigenvector corresponding to λ_{2} = −1 is
Hence, the cofactor matrix of X is X^{c} =
Hence
NOTE: Always try to express X as a unitary matrix, because then its inverse is the same as its conjugate transpose. If X is not a unitary matrix, then we have to find its inverse.
Thus,
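The steps of Example 11 can be sketched numerically. The matrix below is an assumed reconstruction consistent with the worked example (eigenvalues 1 and −1, orthonormal eigenvectors), so that X is orthogonal and X^{-1} = X^{T} needs no separate computation:

```python
import numpy as np

# Assumed matrix with eigenvalues 1 and -1.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Normalized eigenvectors as columns; X is orthogonal, so X^{-1} = X^T.
X = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

assert np.allclose(X.T @ X, np.eye(2))   # X^T X = I, no inverse needed

D = X.T @ A @ X                          # D = X^{-1} A X = diag(1, -1)
eA = X @ np.diag(np.exp(np.diag(D))) @ X.T

# Closed form for this A: e^A = [[cosh 1, sinh 1], [sinh 1, cosh 1]].
assert np.allclose(eA, [[np.cosh(1), np.sinh(1)],
                        [np.sinh(1), np.cosh(1)]])
```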