Let A be a matrix with complex entries. If A is Hermitian as well as ...
If a matrix A is both Hermitian and unitary, then A^2 = I, every eigenvalue of A is either 1 or -1, and A is unitarily diagonalizable with diagonal entries (its eigenvalues) of absolute value 1.
To prove this, let's start with the definition of a Hermitian matrix. A matrix A is Hermitian if it is equal to its conjugate transpose, i.e., A = A*.
Now, let's consider the unitary property of A. A matrix A is unitary if its conjugate transpose is equal to its inverse, i.e., A* = A^(-1).
Combining these two properties, we have A = A* = A^(-1).
In particular, A is equal to its own inverse. Multiplying both sides of A = A^(-1) on the right by A gives A^2 = A^(-1)A = I, where I is the identity matrix. Equivalently, the unitary condition A*A = I together with A* = A gives A^2 = I directly.
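To see the derivation concretely, here is a minimal numerical sketch (assuming NumPy is available; the 2x2 matrix chosen is just one illustrative Hermitian unitary matrix, not part of the original problem):

import numpy as np

# Illustrative example: A = [[0, 1], [1, 0]] is both Hermitian and unitary.
A = np.array([[0, 1],
              [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)

# Hermitian: A equals its conjugate transpose, A = A*.
print(np.allclose(A, A.conj().T))      # True

# Unitary: A* A = I, i.e. A* is the inverse of A.
print(np.allclose(A.conj().T @ A, I))  # True

# The consequence derived above: A^2 = I.
print(np.allclose(A @ A, I))           # True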
So A is an involution: it squares to the identity.
Now let λ be an eigenvalue of A with a nonzero eigenvector v, so that Av = λv. Applying A twice gives A^2 v = λ^2 v, and since A^2 = I this means λ^2 v = v, hence λ^2 = 1 and λ = ±1.
Since A is Hermitian, the spectral theorem says A is unitarily diagonalizable: A = UDU* for some unitary matrix U and a diagonal matrix D whose entries are the eigenvalues of A, each equal to 1 or -1.
Note that A itself need not be diagonal. For example, the matrix [[0, 1], [1, 0]] is both Hermitian and unitary, yet it is not diagonal; only its diagonalization D is.
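The eigenvalue claim can also be checked numerically; the following is a small sketch (again assuming NumPy, with the same illustrative 2x2 matrix):

import numpy as np

A = np.array([[0, 1],
              [1, 0]], dtype=complex)

# eigh is meant for Hermitian matrices: it returns real eigenvalues and
# an orthonormal (hence unitary) matrix U of eigenvectors.
eigvals, U = np.linalg.eigh(A)
print(eigvals)                              # [-1.  1.]

# Reconstruct A from its unitary diagonalization A = U D U*.
D = np.diag(eigvals)
print(np.allclose(U @ D @ U.conj().T, A))   # True

# A itself is not diagonal, even though it is Hermitian and unitary.
print(np.allclose(A, np.diag(np.diag(A))))  # False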
Furthermore, since A is Hermitian, its eigenvalues must be real, and the argument above shows they have absolute value 1, so each one is exactly 1 or -1.
In conclusion, if a matrix A is both Hermitian and unitary, then A^2 = I and A is unitarily diagonalizable, with eigenvalues that are real numbers of absolute value 1, namely ±1.
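For a slightly larger illustration (a sketch under the assumption that a standard example is acceptable here), Householder reflections H = I - 2vv* built from a unit vector v form a whole family of Hermitian unitary matrices that are generally not diagonal:

import numpy as np

# Example: a Householder reflection H = I - 2 v v* from an arbitrary unit
# vector v; such an H is Hermitian, unitary, and satisfies H^2 = I.
rng = np.random.default_rng(0)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v = v / np.linalg.norm(v)             # normalize to a unit vector

I = np.eye(3, dtype=complex)
H = I - 2 * np.outer(v, v.conj())     # H = I - 2 v v*

print(np.allclose(H, H.conj().T))     # Hermitian: True
print(np.allclose(H.conj().T @ H, I)) # Unitary: True
print(np.allclose(H @ H, I))          # Involution: True, yet H is not diagonal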