Introduction
Finding a basis of the null space is a fundamental skill in linear algebra that empowers students and professionals to solve homogeneous systems, analyze matrix transformations, and understand the geometry of vector spaces. This article explains how to find a basis of the null space step by step, provides a clear scientific explanation, and answers common questions. By following the outlined procedure, readers will be able to compute null space bases confidently, even for large matrices, while appreciating the underlying theory.
Understanding the Null Space
The null space (also called the kernel) of a matrix (A) consists of all vectors (\mathbf{x}) such that (A\mathbf{x} = \mathbf{0}). It is a subspace of (\mathbb{R}^n), where (n) is the number of columns in (A). A basis of the null space is a set of linearly independent vectors that span this subspace. Knowing a basis allows us to describe every solution to the homogeneous equation (A\mathbf{x} = \mathbf{0}) as a linear combination of the basis vectors.
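Membership in the null space is easy to test numerically: multiply and check for the zero vector. A minimal sketch with NumPy, using a small hypothetical matrix:

```python
import numpy as np

# Hypothetical example: a 2x3 matrix whose null space contains (1, -1, 1).
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])
x = np.array([1.0, -1.0, 1.0])

# Membership in the null space means A @ x is the zero vector.
print(np.allclose(A @ x, 0))  # True
```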
Steps to Find the Basis of Null Space
Below is a concise, numbered procedure that can be applied to any matrix.
1. Write the matrix equation
Express the problem as (A\mathbf{x} = \mathbf{0}). Identify the matrix (A) and the unknown vector (\mathbf{x}).
2. Row‑reduce (A) to its reduced row‑echelon form (RREF)
Use Gaussian elimination to transform (A) into RREF. This simplifies the system and reveals pivot columns (leading 1s) and free columns.
3. Identify pivot and free variables
- Pivot columns correspond to leading variables.
- Free columns correspond to free variables, which can take any real value.
4. Express pivot variables in terms of free variables
From the RREF, write each pivot variable as a linear combination of the free variables. This step creates parametric equations for the solution set.
5. Introduce parameters
Assign a parameter (e.g., (t_1, t_2, \dots)) to each free variable. The general solution will be a vector (\mathbf{x}) expressed as a sum of parameter‑multiplied vectors.
6. Extract the basis vectors
- For each parameter, set that parameter to 1 and all others to 0.
- The resulting vector is a basis vector of the null space.
- Collect all basis vectors; they form a set that spans the null space and is linearly independent by construction.
7. Verify linear independence (optional but recommended)
Check that no basis vector can be written as a linear combination of the others: form a matrix with the basis vectors as columns and confirm that its rank equals the number of vectors.
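The steps above can be sketched in code. This is a minimal Python implementation of the procedure using SymPy's `rref` (written out by hand for illustration; SymPy also has a built-in `nullspace` method that does the same thing):

```python
import sympy as sp

def null_space_basis(A: sp.Matrix):
    """Steps 2-6 above: row-reduce, split pivot/free columns,
    then set each free variable to 1 in turn."""
    R, pivots = A.rref()                  # steps 2-3: RREF and pivot columns
    n = A.cols
    free = [j for j in range(n) if j not in pivots]
    basis = []
    for f in free:                        # steps 5-6: one vector per free variable
        v = sp.zeros(n, 1)
        v[f] = 1                          # this free variable = 1, all others = 0
        for i, p in enumerate(pivots):    # step 4: pivot vars in terms of free vars
            v[p] = -R[i, f]
        basis.append(v)
    return basis

A = sp.Matrix([[1, 2, -1, 0],
               [0, 1,  1, 1],
               [2, 4, -2, 0]])
print(null_space_basis(A))  # two basis vectors, one per free column
```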
Example
Consider the matrix
[ A = \begin{bmatrix} 1 & 2 & -1 & 0\\ 0 & 1 & 1 & 1\\ 2 & 4 & -2 & 0 \end{bmatrix} ]
1. Row‑reduce (A) to RREF. Subtracting (2\times) row 1 from row 3 produces a zero row, and subtracting (2\times) row 2 from row 1 clears the remaining entry above the second pivot:
[ \begin{bmatrix} 1 & 0 & -3 & -2\\ 0 & 1 & 1 & 1\\ 0 & 0 & 0 & 0 \end{bmatrix} ]
2. Identify pivot columns: columns 1 and 2 are pivots; columns 3 and 4 are free.
3. Express pivot variables:
[ \begin{aligned} x_1 &= 3x_3 + 2x_4 \\ x_2 &= -x_3 - x_4 \end{aligned} ]
4. Introduce parameters: let (t_1 = x_3) and (t_2 = x_4).
5. Write the general solution:
[ \mathbf{x} = t_1\begin{bmatrix}3 \\ -1 \\ 1 \\ 0\end{bmatrix} + t_2\begin{bmatrix}2 \\ -1 \\ 0 \\ 1\end{bmatrix} ]
6. Basis vectors:
[ \mathbf{v}_1 = \begin{bmatrix}3 \\ -1 \\ 1 \\ 0\end{bmatrix},\qquad \mathbf{v}_2 = \begin{bmatrix}2 \\ -1 \\ 0 \\ 1\end{bmatrix} ]
These two vectors form a basis of the null space of (A).
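A worked example like this can be spot-checked numerically: multiply the matrix by each basis vector, and by an arbitrary combination of them, and confirm the result is zero. The values below are the matrix (with third row equal to twice the first) and the basis vectors produced by the row reduction:

```python
import numpy as np

A = np.array([[1, 2, -1, 0],
              [0, 1,  1, 1],
              [2, 4, -2, 0]], dtype=float)
v1 = np.array([3, -1, 1, 0], dtype=float)
v2 = np.array([2, -1, 0, 1], dtype=float)

# Every linear combination t1*v1 + t2*v2 must map to the zero vector.
for v in (v1, v2, 2.5 * v1 - 4.0 * v2):
    print(np.allclose(A @ v, 0))  # True for each
```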
Scientific Explanation
The process of finding a basis of the null space hinges on the fundamental theorem of linear algebra, which states that the row space and null space of a matrix are orthogonal complements in (\mathbb{R}^n). By reducing (A) to RREF, we isolate the structure of the linear equations: the pivot columns dictate which variables are determined by the others, while the free variables remain unrestricted. This separation creates a natural decomposition of the solution set, in which each free variable contributes one independent direction in the null space. The set of direction vectors obtained in step 6 is linearly independent because each vector has a 1 in the position of a distinct free variable and 0s in the positions of all the other free variables. The rank–nullity theorem quantifies the size of the null space:
[ \text{rank}(A) + \text{nullity}(A) = n ]
Understanding this theorem reinforces why the algorithm works and highlights the deep connection between matrix rank and the size of the null space.
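The theorem is easy to confirm computationally. A short check with SymPy, using the example matrix from earlier:

```python
import sympy as sp

A = sp.Matrix([[1, 2, -1, 0],
               [0, 1,  1, 1],
               [2, 4, -2, 0]])
n = A.cols                       # number of columns
rank = A.rank()                  # dimension of the row/column space
nullity = len(A.nullspace())     # dimension of the null space
print(rank, nullity, rank + nullity == n)  # 2 2 True
```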
Frequently Asked Questions
Q1: Can I find the basis of null space without row‑reduction?
Answer: While possible, row‑reduction is the most systematic method. Alternative approaches, such as computing the singular value decomposition (SVD), can also reveal the null space, but they are computationally heavier and less intuitive for beginners.
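For completeness, here is a sketch of the SVD route with NumPy: the right singular vectors whose singular values are (numerically) zero form an orthonormal basis of the null space. The helper name `nullspace_svd` and the tolerance are illustrative choices:

```python
import numpy as np

def nullspace_svd(A, tol=1e-10):
    """Null-space basis from the SVD: rows of Vt whose singular
    values are (numerically) zero span the null space."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))   # singular values above tolerance
    return Vt[rank:].T            # columns: orthonormal null-space basis

A = np.array([[1, 2, -1, 0],
              [0, 1,  1, 1],
              [2, 4, -2, 0]], dtype=float)
N = nullspace_svd(A)
print(N.shape[1])           # 2  (the nullity)
print(np.allclose(A @ N, 0))  # True
```

Note that the SVD basis is orthonormal, so it generally differs from the basis produced by row reduction, even though both span the same subspace.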
Q2: What if the matrix is already in RREF?
Answer: If (A) is already in RREF, you can skip step 2. Directly identify pivot and free columns, then proceed to express pivot variables and extract basis vectors.
Q3: How do I handle complex matrices?
Answer: The same steps apply, except that the entries are complex numbers. The RREF is defined over the field of complex numbers, and the basis vectors will have complex components.
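A short SymPy sketch with a small hypothetical complex matrix; `sp.I` is the imaginary unit, and the returned basis vector has complex entries:

```python
import sympy as sp

# Hypothetical 2x2 complex matrix: second row = I * first row, so rank 1.
A = sp.Matrix([[1, sp.I],
               [sp.I, -1]])
basis = A.nullspace()   # same RREF procedure, over the complex field
print(basis)            # one basis vector with complex entries
```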
Q4: Is the basis unique?
Answer: No. A null space may have infinitely many bases. Any set of linearly independent vectors that spans the same subspace is a valid basis. Still, the number of vectors in any basis (the nullity) is invariant.