
Checking Linear Independence using Determinants


A collection of vectors is said to be linearly dependent if some non-trivial linear combination of them equals zero. More formally, vectors v_1,v_2,\ldots,v_n are said to be linearly dependent if there exist scalars a_1,a_2,\ldots,a_n with at least one a_i \neq 0 such that a_1v_1+a_2v_2+\ldots+a_nv_n = 0.

The vectors are said to be linearly independent if they are not linearly dependent. This means that if a_1v_1+a_2v_2+\ldots+a_nv_n = 0, then a_1=a_2=\ldots=a_n=0. The following theorem allows us to check, using determinants, whether n vectors in \mathbb{R}^n are linearly independent.

Theorem:

Let v_1,v_2,\ldots,v_n be vectors in \mathbb{R}^n, with v_i=(v_{i1},v_{i2},\ldots,v_{in}). Then the vectors are linearly independent if and only if the determinant of the matrix A = [v_{ij}]_{n \times n} is nonzero, that is, det(A) \neq 0.

This means that we arrange the components of the vectors as the rows of a matrix and check whether the determinant of the resulting matrix is nonzero. Notice that this theorem is applicable only when the number of vectors (n) is equal to the number of components of each vector (also n). Let us now look at some examples where this theorem is applicable.
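Before working through the examples by hand, here is a minimal computational sketch of the check in Python. The helper name is_linearly_independent and the use of NumPy are my own illustrative choices, not part of the theorem; with floating-point arithmetic the determinant is compared against a small tolerance rather than tested for exact equality with zero.

import numpy as np

def is_linearly_independent(vectors, tol=1e-12):
    # Stack the n vectors (each with n components) as the rows of a matrix A,
    # exactly as in the theorem, and test whether det(A) is numerically nonzero.
    A = np.array(vectors, dtype=float)
    if A.shape[0] != A.shape[1]:
        raise ValueError("The determinant test needs exactly n vectors in R^n.")
    return abs(np.linalg.det(A)) > tol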

Example 1:

Consider the three vectors shown below: v_1 = (1, 0, 2), v_2 = (3, 4, -1), v_3 = (4, 4, 1).

Taking the components of the three vectors to be the rows of a matrix, we obtain A = \begin{bmatrix} 1 & 0 & 2\\ 3 & 4 & -1 \\ 4 & 4 & 1 \end{bmatrix}. Expanding along the first row, det(A) = 1 \times (4 + 4) - 0 \times (3 + 4) + 2 \times (12 - 16) = 8 - 0 + (-8) = 0.

Since the determinant of the above matrix is zero, we conclude that the three given vectors are linearly dependent. This can also be seen directly: (4, 4, 1) = (1, 0, 2) + (3, 4, -1), i.e., v_3 = v_1 + v_2, which gives v_1 + v_2 - v_3 = 0. Since a non-trivial linear combination of the vectors equals zero, the vectors are linearly dependent. This gives us another way of verifying linear dependence.
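As a quick numerical check, the illustrative is_linearly_independent helper sketched earlier reproduces this conclusion for Example 1:

v1, v2, v3 = (1, 0, 2), (3, 4, -1), (4, 4, 1)
print(np.linalg.det(np.array([v1, v2, v3], dtype=float)))  # approximately 0
print(is_linearly_independent([v1, v2, v3]))               # False, i.e. the vectors are dependent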

Example 2:

Consider the three vectors shown below: v_1 = (3, 1, 0), v_2 = (0, -1, 2), v_3 = (4, 1, 2).

Taking the components of the three vectors to be the rows of a matrix, we obtain A = \begin{bmatrix} 3 & 1 & 0\\ 0 & -1 & 2 \\ 4 & 1 & 2 \end{bmatrix}. Expanding along the first row, det(A) = 3 \times (-2 - 2) - 1 \times (0 - 8) + 0 \times (0 + 4) = -12 + 8 + 0 = -4 \neq 0. Since the determinant is nonzero, we conclude that the three vectors are linearly independent.
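The same numerical check (again using the illustrative helper from the sketch above) agrees:

v1, v2, v3 = (3, 1, 0), (0, -1, 2), (4, 1, 2)
print(np.linalg.det(np.array([v1, v2, v3], dtype=float)))  # approximately -4
print(is_linearly_independent([v1, v2, v3]))               # True, i.e. the vectors are independent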



