A system of linear equations in two variables consists of two linear equations in the same pair of variables. Geometrically, each linear equation in two variables represents a straight line in the plane. The solution of the system is the point(s) at which the lines meet. There are three possible types of systems (in two variables):
(i) the lines intersect in exactly one point, so the system has a unique solution;
(ii) the lines are parallel and distinct, so the system has no solution (it is inconsistent);
(iii) the lines coincide, so the system has infinitely many solutions.
Consider three linear equations in three unknowns:
a1x + b1y + c1z = d1
a2x + b2y + c2z = d2
a3x + b3y + c3z = d3
Form the determinant D of the coefficient matrix and the determinants Dx, Dy, Dz obtained by replacing the appropriate column by the constants column. If D ≠ 0, the system has the unique solution
x = Dx / D, y = Dy / D, z = Dz / D.
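Cramer's rule as stated above can be sketched numerically; a minimal example in NumPy, where the particular coefficients are illustrative and not taken from the text:

```python
import numpy as np

# A hypothetical 3x3 system (coefficients chosen for illustration):
#   2x +  y -  z = 3
#    x + 3y + 2z = 1
#   3x -  y +  z = 2
A = np.array([[2.0,  1.0, -1.0],
              [1.0,  3.0,  2.0],
              [3.0, -1.0,  1.0]])
d = np.array([3.0, 1.0, 2.0])

D = np.linalg.det(A)
assert abs(D) > 1e-12          # D != 0, so Cramer's rule applies

# Dx, Dy, Dz: replace each column of A in turn by the constants column d.
solution = []
for j in range(3):
    Aj = A.copy()
    Aj[:, j] = d
    solution.append(np.linalg.det(Aj) / D)   # x = Dx/D, y = Dy/D, z = Dz/D

x, y, z = solution
```

Substituting the computed (x, y, z) back into A confirms the solution satisfies all three equations.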
Note : The zero solution (all variables zero) of a homogeneous system is called the trivial solution.
A system of linear equations can be written in matrix form as AX = B, where A is the coefficient matrix, X is the column vector of unknowns and B is the column vector of constants. A system is consistent if it has at least one solution.
If a system consists of n linear equations in n unknowns, we can write it as AX = B where A is square (n × n).
If A is non-singular (i.e., det(A) ≠ 0), then A has an inverse and the unique solution is
X = A-1 B.
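The formula X = A-1 B can be checked directly; a small sketch in NumPy with an illustrative 2 × 2 system (not one from the text):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])       # illustrative non-singular matrix, det(A) = -1
B = np.array([[5.0],
              [13.0]])

assert np.linalg.det(A) != 0     # A is non-singular, so the inverse exists
X = np.linalg.inv(A) @ B         # X = A^{-1} B

# Note: numerically, np.linalg.solve(A, B) is preferred over forming the inverse.
```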
If A is singular (det(A) = 0), there are two possibilities, determined by the ranks of A and of the augmented matrix [A|B]:
(i) if rank(A) < rank([A|B]), the system is inconsistent and has no solution;
(ii) if rank(A) = rank([A|B]) (which is then less than n), the system has infinitely many solutions.
The statements involving the adjoint of A found in some textbooks cover special cases; the rank condition gives the complete criterion for the existence and number of solutions.
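The rank criterion can be packaged as a small classifier; a sketch in NumPy, where the test matrices are illustrative:

```python
import numpy as np

def classify(A, B):
    """Classify AX = B by comparing rank(A) with rank of the augmented matrix [A|B]."""
    n = A.shape[1]                      # number of unknowns
    rA = np.linalg.matrix_rank(A)
    rAB = np.linalg.matrix_rank(np.hstack([A, B]))
    if rA < rAB:
        return "no solution"            # inconsistent: rank(A) < rank([A|B])
    if rA == n:
        return "unique solution"        # consistent, full rank
    return "infinitely many solutions"  # consistent, rank deficient

# Two parallel lines x + y = 1 and x + y = 2 share no point:
print(classify(np.array([[1.0, 1.0], [1.0, 1.0]]),
               np.array([[1.0], [2.0]])))
```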
A homogeneous system has the form AX = 0. For an n × n matrix A:
(i) if det(A) ≠ 0, the system has only the trivial solution X = 0;
(ii) if det(A) = 0, the system has infinitely many non-trivial solutions.
The following operations on a matrix are called elementary row transformations. They do not change the solution set of the corresponding system when applied to the augmented matrix [A|B]:
(i) interchanging two rows;
(ii) multiplying a row by a non-zero scalar;
(iii) adding to one row a scalar multiple of another row.
Note : Similar elementary column transformations also exist. Two matrices A and B are said to be equivalent (A ~ B) if one can be obtained from the other by a finite sequence of elementary row (or column) operations.
Two commonly used echelon forms are defined as follows:
(i) Row echelon form (REF): all zero rows lie at the bottom, and the leading (first non-zero) entry of each non-zero row lies strictly to the right of the leading entry of the row above it.
(ii) Reduced row echelon form (RREF): in addition, every leading entry is 1 and is the only non-zero entry in its column.
The process of obtaining RREF from the augmented matrix to read off solutions directly is called Gauss-Jordan elimination. Reducing to REF by forward elimination and then back substitution is called Gaussian elimination.
For AX = B where A is an m × n matrix, X is an n × 1 column vector and B is an m × 1 column vector, the augmented matrix [A|B] is the m × (n+1) matrix obtained by adjoining B as the (n+1)-th column to A. Solving the system by row reductions uses this augmented matrix.
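The row-reduction procedure on [A|B] can be written out explicitly; a minimal Gauss-Jordan sketch in NumPy, applied to an illustrative system (not one of the worked examples):

```python
import numpy as np

def rref(M, tol=1e-12):
    """Reduce a matrix to reduced row echelon form by Gauss-Jordan elimination."""
    M = M.astype(float).copy()
    rows, cols = M.shape
    r = 0                                          # current pivot row
    for c in range(cols):
        if r == rows:
            break
        pivot = r + np.argmax(np.abs(M[r:, c]))    # partial pivoting for stability
        if abs(M[pivot, c]) < tol:
            continue                               # no pivot in this column
        M[[r, pivot]] = M[[pivot, r]]              # (i) interchange rows
        M[r] = M[r] / M[r, c]                      # (ii) scale pivot row: leading 1
        for i in range(rows):
            if i != r:
                M[i] = M[i] - M[i, c] * M[r]       # (iii) clear the rest of the column
        r += 1
    return M

# Augmented matrix [A|B] for: x + y + z = 4, 2x - y + z = 3, x + 2y - z = 1
aug = np.array([[1.0,  1.0,  1.0, 4.0],
                [2.0, -1.0,  1.0, 3.0],
                [1.0,  2.0, -1.0, 1.0]])
R = rref(aug)    # the solution appears in the last column
```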
Ex.25 Solve the equations
Sol.
Ex.26 Solve
Sol.
We see that the system has an infinite number of solutions. Specific solutions can be generated by choosing specific values for the free parameter k.
Ex.27 The number of triplets (a, b, c) for which the system of equations ax - by = 2a - b and (c + 1)x + cy = 10 - a + 3b has infinitely many solutions and x = 1, y = 3 is one of the solutions is
Sol.
Ex.28 Solve
Sol.
Ex.29 Solve
by reducing the augmented matrix of the system to reduced row echelon form.
Sol.
It is easy to see that x1 = 1, x2 = -3, x3 = 6.
Ex.30 Determine conditions on a, b and c so that
will have no solution or an infinite number of solutions.
Sol.
A square matrix A is said to be non-singular if det(A) ≠ 0 and singular if det(A) = 0.
Ex.31 Show that every skew-symmetric matrix of odd order is singular.
Sol.
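The claim of Ex.31 rests on det(A^T) = det(A) together with det(-A) = (-1)^n det(A); for odd n these force det(A) = -det(A), i.e. det(A) = 0. A quick numerical check with an illustrative 3 × 3 skew-symmetric matrix (the entries are arbitrary):

```python
import numpy as np

# Any odd-order skew-symmetric matrix satisfies A^T = -A; entries here are illustrative.
A = np.array([[ 0.0,  2.0, -1.0],
              [-2.0,  0.0,  3.0],
              [ 1.0, -3.0,  0.0]])
assert np.allclose(A.T, -A)          # A is skew-symmetric

# det(A) = det(A^T) = det(-A) = (-1)^3 det(A), so det(A) must be zero.
detA = np.linalg.det(A)
```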
Let A = [aij]n be an n × n matrix.
The matrix obtained by replacing each element aij by its cofactor Cij is called the cofactor matrix (also called the matrix of cofactors).
The transpose of the cofactor matrix is called the adjoint (or adjugate) of A and is denoted by adj A.
If A is non-singular, its inverse is given by
A-1 = (1 / det A) adj A.
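The cofactor-matrix construction and the inverse formula can be sketched directly; a minimal NumPy implementation, with an illustrative 2 × 2 matrix (not one from the exercises):

```python
import numpy as np

def adjoint(A):
    """Return adj A: the transpose of the cofactor matrix of a square matrix A."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Minor M_ij: delete row i and column j, then take the determinant.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)   # cofactor C_ij
    return C.T                                                 # adj A = (cofactor matrix)^T

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])          # illustrative matrix with det(A) = 1
adjA = adjoint(A)
A_inv = adjA / np.linalg.det(A)     # A^{-1} = (1 / det A) adj A
```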
Remarks :
(i) A (adj A) = (adj A) A = (det A) In for every n × n matrix A, whether or not A is singular.
(ii) For an n × n matrix A, det(adj A) = (det A)^(n-1).
For a square matrix A, the polynomial |A - xI| in the variable x is called the characteristic polynomial of A. The equation |A - xI| = 0 is the characteristic equation.
Remark : By the Cayley-Hamilton theorem, every square matrix A satisfies its own characteristic equation, i.e., if p(x) = |A - xI|, then p(A) = 0.
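For a 2 × 2 matrix the characteristic polynomial is |A - xI| = x^2 - (tr A)x + det A, and the Cayley-Hamilton theorem then yields the inverse by rearrangement. A sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])          # illustrative 2x2 matrix

# Characteristic polynomial |A - xI| = x^2 - tr(A) x + det(A) for a 2x2 matrix.
tr, det = np.trace(A), np.linalg.det(A)

# Cayley-Hamilton: A satisfies its own characteristic equation,
# i.e. A^2 - tr(A) A + det(A) I = 0.
pA = A @ A - tr * A + det * np.eye(2)
assert np.allclose(pA, 0)

# Multiplying the equation by A^{-1} (valid since det(A) != 0) gives
# A^{-1} = (tr(A) I - A) / det(A).
A_inv = (tr * np.eye(2) - A) / det
```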
Ex.32 Find the adjoint of the matrix A =
Sol.
Ex.33 If A and B are square matrices of the same order, show that adj (AB) = adj B · adj A.
Sol.
Ex.34 If A is an n × n matrix and B is its adjoint, show that Det (AB + K In) = [Det (A) + K]^n, where K is a scalar.
Sol.
Ex.35 If
be the direction cosines of three mutually perpendicular lines referred to an orthogonal Cartesian co-ordinate system, then prove that
is an orthogonal matrix.
Sol.
Ex.36 Obtain the characteristic equation of the matrix A =
and verify that it is satisfied by A and hence find its inverse.
Sol.
Ex.37 Find the inverse of the matrix A =
Sol.
Ex.38 If a non-singular matrix A is symmetric, show that A-1 is also symmetric.
Sol.