A vector space consists of a set of vectors Ψ, ϕ, χ, ... and a set of scalars a, b, c, together with a rule for vector addition and a rule for scalar multiplication.
(i) Addition Rule
If Ψ and ϕ are vectors (elements) of a space, their sum Ψ + ϕ is also a vector of the same space.
(ii) Multiplication rule
The scalar product of two functions ϕ(x) and Ψ(x) is given by (Ψ, ϕ) = ∫Ψ*(x)ϕ(x)dx, where ϕ(x) and Ψ(x) are two complex functions of the variable x, and ϕ*(x) and Ψ*(x) are the complex conjugates of ϕ(x) and Ψ(x) respectively.
The scalar product of two functions ϕ(x, y, z) and Ψ(x, y, z) in 3 dimensions is defined as (Ψ, ϕ) = ∫Ψ*ϕ dx dy dz.
The Hilbert space H consists of a set of vectors Ψ, ϕ, χ and a set of scalars a, b, c which satisfy the following four properties:
(i) H is a linear space
(ii) H has a defined scalar product that is strictly positive.
(iii) H is separable, i.e., there exists a Cauchy sequence Ψn ∈ H such that for every Ψ of H and every ε > 0 there is at least one Ψn of the sequence with ║Ψ - Ψn║ < ε.
(iv) H is complete: every Cauchy sequence converges to an element of H, i.e., ║Ψn - Ψm║ → 0 as m → ∞, n → ∞.
Linear independence:
A set of N vectors ϕ1, ϕ2, ϕ3, ..., ϕN is said to be linearly independent if and only if the only solution of the equation a1ϕ1 + a2ϕ2 + a3ϕ3 + ... + aNϕN = 0 is a1 = a2 = a3 = ... = aN = 0; otherwise ϕ1, ϕ2, ϕ3, ..., ϕN are said to be linearly dependent.
The dimension of a vector space is given by the maximum number of linearly independent vectors that the space can have.
If the maximum number of linearly independent vectors of a space is N, i.e., ϕ1, ϕ2, ϕ3, ..., ϕN, the space is said to be N-dimensional. In this case any vector Ψ of the vector space can be expressed as the linear combination Ψ = Σ aiϕi.
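These definitions are easy to check numerically: the number of linearly independent vectors in a set equals the rank of the matrix whose columns are those vectors. A minimal NumPy sketch with hypothetical vectors (not taken from the text):

```python
import numpy as np

# Three assumed example vectors in a 3-dimensional space.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 2.0])  # v3 = v1 + v2, so the set is dependent

M = np.column_stack([v1, v2, v3])
rank = np.linalg.matrix_rank(M)

# The rank equals the number of linearly independent vectors,
# i.e., the dimension of the space they span.
print(rank)        # 2, so {v1, v2, v3} is linearly dependent
print(rank == 3)   # False
```

If the rank had come out equal to 3, the three vectors would form a basis of the 3-dimensional space.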
Orthonormal Basis
Two vectors ϕi, ϕj are said to be orthonormal if their scalar product (ϕi, ϕj) = δij, where δij is the Kronecker delta: δij = 0 when i ≠ j and δij = 1 when i = j.
If the scalar product (Ψ, Ψ) = ∫Ψ*(x)Ψ(x)dx = α, where α is a positive finite number, then Ψ(x) is said to be square integrable.
A square-integrable function can be treated as a probability distribution function if α = 1, and Ψ is then said to be normalized.
Dirac introduced what became an invaluable notation in quantum mechanics: a state vector Ψ (a square-integrable function) is written as a ket |Ψ〉, its conjugate Ψ* as a bra 〈Ψ|, and the scalar product (ϕ, Ψ) as the bra-ket 〈ϕ|Ψ〉. In summary: Ψ → |Ψ〉, Ψ* → 〈Ψ|, and (ϕ, Ψ) = 〈ϕ|Ψ〉 = ∫ϕ*(r, t)Ψ(r, t)d³r.
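As a rough numerical illustration of the bra-ket scalar product, the integral 〈ϕ|Ψ〉 = ∫ϕ*(x)Ψ(x)dx can be approximated on a grid. The functions below are assumed examples (not from the text); note that NumPy's `vdot` conjugates its first argument, exactly like the bra:

```python
import numpy as np

# Discretized sketch of <phi|psi> on [0, 1] with assumed functions.
x = np.linspace(0.0, 1.0, 2001)
dx = x[1] - x[0]
psi = np.sqrt(2.0) * np.sin(np.pi * x)        # normalized on [0, 1]
phi = np.sqrt(2.0) * np.sin(2.0 * np.pi * x)  # orthogonal to psi

braket = np.vdot(phi, psi) * dx      # vdot conjugates the first argument
norm = np.vdot(psi, psi).real * dx   # <psi|psi> = alpha

print(round(norm, 3))                # 1.0 -> psi is normalized
print(abs(braket) < 1e-9)            # True: sin(pi x) is orthogonal to sin(2 pi x)
```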
Properties of kets, bras and bra-kets.
An operator A is a mathematical rule that, when applied to a ket |ϕ〉, transforms it into another ket |Ψ〉 of the same space, and when it acts on a bra 〈χ|, transforms it into another bra 〈ϕ|; that is, A|ϕ〉 = |Ψ〉 and 〈χ|A = 〈ϕ|.
Example of Operator-
Identity operator: I|Ψ〉 = |Ψ〉
Parity operator: π|Ψ(r)〉 = |Ψ(-r)〉
Gradient operator: ∇Ψ(r), and linear momentum operator: PΨ(r) = -iℏ∇Ψ(r)
A is a linear operator if A(a|ϕ1〉 + b|ϕ2〉) = aA|ϕ1〉 + bA|ϕ2〉.
If |Ψ〉 is expanded in the orthonormal basis {|ui〉} as
|Ψ〉 = c1|u1〉 + c2|u2〉 + ..., i.e., |Ψ〉 = Σ ci|ui〉, where ci = 〈ui|Ψ〉,
then the ket |Ψ〉 can be represented as the column vector of the coefficients ci.
To find the Hermitian adjoint A† of a matrix A:
Step I: Find the transpose of A, i.e., convert rows into columns: Aᵀ.
Step II: Take the complex conjugate of each element of Aᵀ; the result is A† = (Aᵀ)*.
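The two steps can be checked in a few lines of NumPy; the matrix below is an arbitrary illustration, not one from the text:

```python
import numpy as np

# Step I: transpose (A.T); Step II: complex-conjugate each element (.conj()).
A = np.array([[1.0 + 2.0j, 3.0],
              [4.0j,       5.0 - 1.0j]])

A_dagger = A.T.conj()

# Element (0, 1) of A† is the conjugate of element (1, 0) of A.
print(np.allclose(A_dagger, np.array([[1 - 2j, -4j],
                                      [3,      5 + 1j]])))  # True
print(np.allclose(A_dagger.conj().T, A))                    # True: (A†)† = A
```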
Properties of Hermitian Adjoint A†
If an operator A satisfies A|ψ〉 = λ|ψ〉, then
λ is said to be an eigenvalue and |ψ〉 the eigenvector of the operator corresponding to λ.
If A|ϕ〉 = |ψ〉, then 〈ϕ|A† = 〈ψ|, where A† is the Hermitian adjoint of the matrix or operator A.
An operator A is said to be Hermitian if,
A† = A, i.e., the matrix elements satisfy 〈ui|A|uj〉 = (〈uj|A|ui〉)*.
If A and B are two operators, then the commutator [A, B] is defined as AB - BA.
If [A, B] = 0, the operators A and B are said to commute with each other.
Properties of commutator:
Example 1: Prove that f(x) = x, g(x) = x², h(x) = x³ are linearly independent.
For linear independence:
a1f(x) + a2g(x) + a3h(x) = 0 ⇒ a1x + a2x² + a3x³ = 0
Equating the coefficients of x, x² and x³ on both sides, one gets
a1 = 0, a2 = 0, a3 = 0, so f(x), g(x) and h(x) are linearly independent.
Example 2: Prove that the given vectors are linearly dependent.
So, the given vectors are linearly dependent.
Example 3: For the given f(x) and g(x), prove that f(x) and g(x) are orthogonal as well as linearly independent.
For linear independence:
So, f (x) and g (x) are linearly independent.
For orthogonality: (f(x), g(x)) = ∫f*(x)g(x)dx
The scalar product of f(x) and g(x) is zero, i.e., they are orthogonal.
Example 4:
(b) Find the value of A such that |ψ〉 is normalized.
(b) From the normalization condition:
Example 5: Let |ψ1〉 = a1|ϕ1〉 + a2|ϕ2〉 and |ψ2〉 = b1|ϕ1〉 + b2|ϕ2〉.
It is given that 〈ϕi|ϕj〉 = δij; then,
(a) Find the condition for |ψ1〉 and |ψ2〉 to be normalized.
(b) Find the condition for |ψ1〉 and |ψ2〉 to be orthogonal.
(a) If |ψ1〉 is normalized, then 〈ψ1|ψ1〉 = 1.
Expanding 〈ψ1|ψ1〉 using 〈ϕi|ϕj〉 = δij,
|a1|² + |a2|² = 1
Similarly, for |ψ2〉 to be normalized-
⇒ |b1|² + |b2|² = 1
(b) For |ψ1〉 and |ψ2〉 to be orthogonal, 〈ψ1|ψ2〉 = 0
⇒ a1*b1 + a2*b2 = 0
Similarly, from 〈ψ2|ψ1〉 = 0 ⇒ b1*a1 + b2*a2 = 0
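The normalization and orthogonality conditions just derived are easy to verify numerically for a concrete choice of coefficients; the values below are hypothetical, chosen only to satisfy both conditions:

```python
import numpy as np

# Assumed coefficients in the orthonormal basis {|phi_1>, |phi_2>}:
# |psi_1> = a1|phi_1> + a2|phi_2>, |psi_2> = b1|phi_1> + b2|phi_2>.
a = np.array([1.0, 1.0j]) / np.sqrt(2.0)   # (a1, a2)
b = np.array([1.0, -1.0j]) / np.sqrt(2.0)  # (b1, b2)

norm1 = np.vdot(a, a).real   # |a1|^2 + |a2|^2
norm2 = np.vdot(b, b).real   # |b1|^2 + |b2|^2
overlap = np.vdot(a, b)      # a1* b1 + a2* b2

print(np.isclose(norm1, 1.0), np.isclose(norm2, 1.0))  # True True
print(np.isclose(abs(overlap), 0.0))                   # True: orthogonal
```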
Example 6: If the operator S is defined as
and 〈ui|uj〉 = δij ; i, j = 1, 2, 3
(a) Construct the S matrix
(b) Prove that S is a Hermitian matrix
(a) The matrix S is constructed from the matrix elements Sij = 〈ui|S|uj〉.
(b) Since S = S†, the S matrix is Hermitian.
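Since the definition of S in this example is not reproduced above, here is a sketch of the same construction for an assumed action of S on the orthonormal basis (S|u1〉 = |u2〉, S|u2〉 = |u1〉, S|u3〉 = |u3〉, a hypothetical choice):

```python
import numpy as np

# Orthonormal basis |u1>, |u2>, |u3> as the columns of the identity.
u = np.eye(3)

# Assumed action of S on the basis kets (not the S from the text).
S_action = {0: u[:, 1], 1: u[:, 0], 2: u[:, 2]}

# Matrix element S_ij = <u_i| S |u_j>.
S = np.array([[np.vdot(u[:, i], S_action[j]) for j in range(3)]
              for i in range(3)])

print(S.real)                       # the constructed matrix
print(np.allclose(S, S.conj().T))   # True: this S is Hermitian
```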
Example 7: If Dx is defined as ∂/∂x and ψ(x) = A sin(nπx/a),
(a) Operate Dx on ψ(x)
(b) Operate Dx² on ψ(x)
(c) Which of the above gives an eigenvalue problem?
(a) Dxψ(x) = A(nπ/a) cos(nπx/a), which is not proportional to ψ(x).
(b), (c) When Dx² operates on ψ(x), Dx²ψ(x) = -(n²π²/a²) A sin(nπx/a) = -(n²π²/a²)ψ(x).
So, the operation of Dx² on ψ(x) = A sin(nπx/a) gives an eigenvalue problem, with eigenvalue -n²π²/a².
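The result of Example 7 can be confirmed symbolically; this sketch uses SymPy:

```python
import sympy as sp

# psi(x) = A sin(n pi x / a), with Dx = d/dx.
x, A, n, a = sp.symbols('x A n a', positive=True)
psi = A * sp.sin(n * sp.pi * x / a)

first = sp.diff(psi, x)       # Dx psi
second = sp.diff(psi, x, 2)   # Dx^2 psi

# Dx^2 psi / psi reduces to the constant -(n pi / a)^2 -> eigenvalue problem.
ratio = sp.simplify(second / psi)
print(ratio)

# Dx psi / psi still depends on x -> Dx does NOT give an eigenvalue problem.
first_ratio = sp.simplify(first / psi)
print(x in first_ratio.free_symbols)  # True
```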
Example 8: If operator A is given by, then
(a) find the eigenvalues and eigenvectors of A.
(b) normalize the eigenvectors.
(c) prove that both eigenvectors are orthogonal.
(a)
For the eigenvalues,
| A - λI | = 0
The eigenvector corresponding to λ = 1 satisfies
A|u1〉 = λ|u1〉
so, the eigenvector corresponding to λ = 1 is
The eigenvector corresponding to λ = -1 is
(b) For the normalized eigenvectors:
Similarly,
(c) For orthogonality: 〈u1|u2〉 = 〈u2|u1〉 = 0
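The matrix A of Example 8 is not reproduced above; as a stand-in with the same eigenvalues +1 and -1, the Hermitian matrix [[0, 1], [1, 0]] shows the full workflow (eigenvalues, normalized eigenvectors, orthogonality):

```python
import numpy as np

# Hypothetical Hermitian matrix with eigenvalues -1 and +1.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# eigh is the NumPy routine for Hermitian matrices; eigenvalues come sorted.
vals, vecs = np.linalg.eigh(A)
print(vals)                                  # [-1.  1.]

u1, u2 = vecs[:, 0], vecs[:, 1]              # eigh returns normalized columns
print(np.isclose(np.linalg.norm(u1), 1.0))   # True: normalized
print(np.isclose(abs(np.vdot(u1, u2)), 0.0)) # True: orthogonal
```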
Example 9: If the momentum operator Px is defined as Pxψ(x) = -iℏ ∂ψ(x)/∂x and the position operator X is defined as Xψ(x) = xψ(x),
(a) Find the value of commutator [X, Px]
(b) Find the value of [X², Px]
(c) Find the value of [X, Px²]
(a) [X, Px] = XPx - PxX
Operating on ψ(x) on both sides: [X, Px]ψ = -iℏx(∂ψ/∂x) + iℏ ∂(xψ)/∂x = iℏψ, so [X, Px] = iℏ.
(b) [X², Px] = [X·X, Px] = X[X, Px] + [X, Px]X = X(iℏ) + (iℏ)X = 2iℏX
(c) [X, Px²] = [X, Px·Px] = Px[X, Px] + [X, Px]Px = Px(iℏ) + (iℏ)Px = 2iℏPx
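The commutator results of Example 9 can be verified symbolically by letting X and Px act on a test function ψ(x); a SymPy sketch:

```python
import sympy as sp

# X f = x f and Px f = -i hbar df/dx, applied to a generic psi(x).
x, hbar = sp.symbols('x hbar')
psi = sp.Function('psi')(x)

X = lambda f: x * f
Px = lambda f: -sp.I * hbar * sp.diff(f, x)

# (a) [X, Px] psi = i hbar psi
comm_X_Px = sp.simplify(X(Px(psi)) - Px(X(psi)))
print(sp.simplify(comm_X_Px - sp.I * hbar * psi) == 0)           # True

# (b) [X^2, Px] psi = 2 i hbar x psi
comm_X2_Px = sp.simplify(X(X(Px(psi))) - Px(X(X(psi))))
print(sp.simplify(comm_X2_Px - 2 * sp.I * hbar * x * psi) == 0)  # True
```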
Example 10: (a) Prove that PΨ = |Ψ〉〈Ψ| is a projection operator (take |Ψ〉 to be normalized)
(b) Operate PΨ on |ϕ〉
(c) Operate PΨ on 〈ϕ|
(d) Operate PΨ on |Ψ〉 and 〈Ψ|
(e) Find the eigenvalues of any projection operator.
(a) An operator P is a projection operator if P† = P and P² = P. Here PΨ† = (|Ψ〉〈Ψ|)† = |Ψ〉〈Ψ| = PΨ, and PΨ² = |Ψ〉〈Ψ|Ψ〉〈Ψ| = |Ψ〉〈Ψ| = PΨ, using 〈Ψ|Ψ〉 = 1.
So, PΨ is a projection operator.
(b) PΨ|ϕ〉 = |Ψ〉〈Ψ|ϕ〉 = 〈Ψ|ϕ〉|Ψ〉
(c) 〈ϕ|PΨ = 〈ϕ|Ψ〉〈Ψ|
(d) PΨ|Ψ〉 = |Ψ〉〈Ψ|Ψ〉 = |Ψ〉 and 〈Ψ|PΨ = 〈Ψ|Ψ〉〈Ψ| = 〈Ψ|
(e) Since PΨ² = PΨ, an eigenvalue λ of PΨ satisfies λ² = λ ⇒ λ(λ - 1) = 0, so the eigenvalues of any projection operator are 0 or 1.
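A concrete check of the projection-operator properties, for an assumed normalized |Ψ〉:

```python
import numpy as np

# Build P = |psi><psi| from an assumed normalized ket.
psi = np.array([1.0, 1.0j]) / np.sqrt(2.0)
P = np.outer(psi, psi.conj())            # outer product |psi><psi|

print(np.allclose(P @ P, P))             # True: P^2 = P
print(np.allclose(P, P.conj().T))        # True: P† = P
print(np.sort(np.linalg.eigvalsh(P)))    # eigenvalues ~ [0., 1.]
```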
Example 11:
(a) Find the value of [A, B]
(b) Write down the eigenvectors of B in the basis of the eigenvectors of A.
(a): [A, B] = AB - BA
{As [A, B] = 0, the operators A and B commute}
(b) The eigenvector of A for eigenvalue λ1 = a is
for eigenvalue λ2 = a,
The eigenvector of B for eigenvalue λ1 = b is
for eigenvalue λ2 = -b,
Example 12:
(a) Find A†
(b) Find B†
(c) Which one of A and B has real eigenvalues?
(a)
A† = A , so A is Hermitian.
i.e., B† = -B.
So B is not Hermitian; rather, it is anti-Hermitian.
(c) The eigenvalues of matrix A are real because A is Hermitian; those of the anti-Hermitian matrix B are purely imaginary.
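Since the matrices of Example 12 are not reproduced above, the following sketch uses stand-in matrices to illustrate the conclusion (Hermitian ⇒ real eigenvalues; anti-Hermitian ⇒ purely imaginary eigenvalues):

```python
import numpy as np

# Hypothetical Hermitian A and anti-Hermitian B (B† = -B).
A = np.array([[2.0,        1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
B = np.array([[1.0j,  2.0],
              [-2.0, -1.0j]])

print(np.allclose(A.conj().T, A))    # True: A is Hermitian
print(np.allclose(B.conj().T, -B))   # True: B is anti-Hermitian

eig_A = np.linalg.eigvals(A)
eig_B = np.linalg.eigvals(B)
print(np.allclose(eig_A.imag, 0.0))  # True: real eigenvalues
print(np.allclose(eig_B.real, 0.0))  # True: purely imaginary eigenvalues
```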