\[
\begin{bmatrix}
1 & -4 & 2 & 17 \\
-2 & -2 & 3 & -10 \\
1 & 0 & -1 & 11 \\
-1 & 4 & -3 & 1
\end{bmatrix}
\tag{2.75}
\]
is given as
\[
\begin{bmatrix}
1 & 0 & 0 & -7 \\
0 & 1 & 0 & -15 \\
0 & 0 & 1 & -18 \\
0 & 0 & 0 & 0
\end{bmatrix}.
\tag{2.76}
\]
We see that the corresponding linear equation system is non-trivially solvable: the last column is not a pivot column, and $x_4 = -7x_1 - 15x_2 - 18x_3$. Therefore, $x_1, \ldots, x_4$ are linearly dependent, as $x_4$ can be expressed as a linear combination of $x_1, \ldots, x_3$.
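This check can be reproduced in code. The following sketch uses SymPy's exact reduced row-echelon form (assuming SymPy is available):

```python
from sympy import Matrix

# Columns of A are the vectors x1, x2, x3, x4 from (2.75).
A = Matrix([[ 1, -4,  2,  17],
            [-2, -2,  3, -10],
            [ 1,  0, -1,  11],
            [-1,  4, -3,   1]])

# rref() returns the reduced row-echelon form and the pivot-column indices.
R, pivots = A.rref()

# pivots == (0, 1, 2): the fourth column is not a pivot column, and its
# entries in R give the coefficients x4 = -7*x1 - 15*x2 - 18*x3.
print(pivots)    # (0, 1, 2)
print(R.col(3))  # Matrix([[-7], [-15], [-18], [0]])
```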
2.6 Basis and Rank
In a vector space $V$, we are particularly interested in sets of vectors $\mathcal{A}$ that possess the property that any vector $v \in V$ can be obtained by a linear combination of vectors in $\mathcal{A}$. These vectors are special vectors, and in the following, we will characterize them.
2.6.1 Generating Set and Basis
Definition 2.13 (Generating Set and Span). Consider a vector space $V = (\mathcal{V}, +, \cdot)$ and a set of vectors $\mathcal{A} = \{x_1, \ldots, x_k\} \subseteq \mathcal{V}$. If every vector $v \in \mathcal{V}$ can be expressed as a linear combination of $x_1, \ldots, x_k$, $\mathcal{A}$ is called a generating set of $V$. The set of all linear combinations of vectors in $\mathcal{A}$ is called the span of $\mathcal{A}$. If $\mathcal{A}$ spans the vector space $V$, we write $V = \operatorname{span}[\mathcal{A}]$ or $V = \operatorname{span}[x_1, \ldots, x_k]$.
Generating sets are sets of vectors that span vector (sub)spaces, i.e., every vector can be represented as a linear combination of the vectors in the generating set. Now, we will be more specific and characterize the smallest generating set that spans a vector (sub)space.
Definition 2.14 (Basis). Consider a vector space $V = (\mathcal{V}, +, \cdot)$ and $\mathcal{A} \subseteq \mathcal{V}$. A generating set $\mathcal{A}$ of $V$ is called minimal if there exists no smaller set $\tilde{\mathcal{A}} \subsetneq \mathcal{A} \subseteq \mathcal{V}$ that spans $V$. Every linearly independent generating set of $V$ is minimal and is called a basis of $V$.
Let $V = (\mathcal{V}, +, \cdot)$ be a vector space and $\mathcal{B} \subseteq \mathcal{V}$, $\mathcal{B} \neq \emptyset$. Then, the following statements are equivalent:

- $\mathcal{B}$ is a basis of $V$.
- $\mathcal{B}$ is a minimal generating set.
- $\mathcal{B}$ is a maximal linearly independent set of vectors in $\mathcal{V}$, i.e., adding any other vector to this set will make it linearly dependent.
- Every vector $x \in \mathcal{V}$ is a linear combination of vectors from $\mathcal{B}$, and every linear combination is unique, i.e., with
\[
x = \sum_{i=1}^{k} \lambda_i b_i = \sum_{i=1}^{k} \psi_i b_i
\]
and $\lambda_i, \psi_i \in \mathbb{R}$, $b_i \in \mathcal{B}$, it follows that $\lambda_i = \psi_i$, $i = 1, \ldots, k$.
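The uniqueness of the coordinates with respect to a basis can be illustrated numerically. The basis below (the columns of B) is a hypothetical choice for $\mathbb{R}^3$, not one from the text:

```python
import numpy as np

# Hypothetical basis of R^3: columns b1 = [1,0,0], b2 = [1,1,0], b3 = [1,1,1].
B = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
x = np.array([2.0, 3.0, 4.0])

# B is square and has full rank, so B @ lam = x has exactly one solution:
# the coordinates of x with respect to this basis are unique.
lam = np.linalg.solve(B, x)
print(lam)  # [-1. -1.  4.]
```

Indeed, $x = -b_1 - b_2 + 4 b_3$, and no other coefficients reproduce $x$.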
In $\mathbb{R}^3$, the canonical/standard basis is
\[
\mathcal{B} = \left\{
\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},
\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix},
\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}
\right\}.
\]
There are many different bases of $\mathbb{R}^3$: any three linearly independent vectors in $\mathbb{R}^3$ form one. Conversely, a linearly independent set need not be a basis. For example, a set $\mathcal{A}$ of linearly independent vectors in $\mathbb{R}^4$ is not a generating set (and no basis) of $\mathbb{R}^4$ if some vector, e.g., $[1, 0, 0, 0]^\top$, cannot be obtained by a linear combination of elements in $\mathcal{A}$.
Remark. Every vector space $V$ possesses a basis $\mathcal{B}$. The preceding examples show that there can be many bases of a vector space $V$, i.e., there is no unique basis. However, all bases possess the same number of elements, the basis vectors. ♦
We only consider finite-dimensional vector spaces $V$. In this case, the dimension of $V$ is the number of basis vectors of $V$, and we write $\dim(V)$. If $U \subseteq V$ is a subspace of $V$, then $\dim(U) \leqslant \dim(V)$, and $\dim(U) = \dim(V)$ if and only if $U = V$. Intuitively, the dimension of a vector space can be thought of as the number of independent directions in this vector space.
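Numerically, the dimension of a subspace spanned by a set of vectors is the number of linearly independent vectors among them, which NumPy can compute as a matrix rank. The vectors below are hypothetical, chosen for illustration:

```python
import numpy as np

# Hypothetical vectors spanning a subspace U of R^3.
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([2.0, 1.0, 0.0])
u3 = u1 + u2                      # linearly dependent on u1 and u2

A = np.column_stack([u1, u2, u3])

# dim(U) is the number of linearly independent spanning vectors.
dim_U = np.linalg.matrix_rank(A)
print(dim_U)  # 2, and 2 <= dim(R^3) = 3
```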
Remark. The dimension of a vector space is not necessarily the number of elements in a vector. For instance, the vector space $V = \operatorname{span}\!\left[\begin{bmatrix} 0 \\ 1 \end{bmatrix}\right]$ is one-dimensional, although the basis vector possesses two elements. ♦

Remark. A basis of a subspace $U = \operatorname{span}[x_1, \ldots, x_m] \subseteq \mathbb{R}^n$ can be found by executing the following steps:

1. Write the spanning vectors as columns of a matrix $A$.
2. Determine the row-echelon form of $A$.
3. The spanning vectors associated with the pivot columns are a basis of $U$.

♦
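These three steps can be sketched with SymPy; the vectors below are hypothetical, chosen so that $x_2 = 2x_1$:

```python
from sympy import Matrix

# Hypothetical spanning vectors of a subspace U of R^4; note x2 = 2*x1.
x1 = Matrix([1, 2, 0, 1])
x2 = Matrix([2, 4, 0, 2])
x3 = Matrix([0, 1, 1, 0])

A = Matrix.hstack(x1, x2, x3)        # step 1: spanning vectors as columns
R, pivots = A.rref()                 # step 2: (reduced) row-echelon form
basis = [A.col(j) for j in pivots]   # step 3: pivot columns of the ORIGINAL A

print(pivots)  # (0, 2): the second column is not a pivot column
```

Note that the basis vectors are taken from the original matrix A; the row-echelon form only identifies which columns to keep.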
Example 2.17 (Determining a Basis)
For a vector subspace $U \subseteq \mathbb{R}^5$, spanned by the vectors $x_1, x_2, x_3, x_4 \in \mathbb{R}^5$, we are interested in finding out which of the vectors $x_1, \ldots, x_4$ form a basis for $U$. For this, we need to check whether $x_1, \ldots, x_4$ are linearly independent. Therefore, we need to solve
\[
\sum_{i=1}^{4} \lambda_i x_i = 0, \tag{2.82}
\]
which leads to a homogeneous system of equations with the matrix
\[
[x_1, x_2, x_3, x_4].
\]
With the basic transformation rules for systems of linear equations, we obtain the row-echelon form
Since the pivot columns indicate which set of vectors is linearly independent, we see from the row-echelon form that $x_1, x_2, x_4$ are linearly independent (because the system of linear equations $\lambda_1 x_1 + \lambda_2 x_2 + \lambda_4 x_4 = 0$ can only be solved with $\lambda_1 = \lambda_2 = \lambda_4 = 0$). Therefore, $\{x_1, x_2, x_4\}$ is a basis of $U$.
2.6.2 Rank
The number of linearly independent columns of a matrix $A \in \mathbb{R}^{m \times n}$ equals the number of linearly independent rows and is called the rank of $A$, denoted by $\operatorname{rk}(A)$.
Remark. The rank of a matrix has some important properties:
- $\operatorname{rk}(A) = \operatorname{rk}(A^\top)$, i.e., the column rank equals the row rank.
- The columns of $A \in \mathbb{R}^{m \times n}$ span a subspace $U \subseteq \mathbb{R}^m$ with $\dim(U) = \operatorname{rk}(A)$. Later we will call this subspace the image or range. A basis of $U$ can be found by applying Gaussian elimination to $A$ to identify the pivot columns.
- The rows of $A \in \mathbb{R}^{m \times n}$ span a subspace $W \subseteq \mathbb{R}^n$ with $\dim(W) = \operatorname{rk}(A)$. A basis of $W$ can be found by applying Gaussian elimination to $A^\top$.
- For all $A \in \mathbb{R}^{n \times n}$ it holds that $A$ is regular (invertible) if and only if $\operatorname{rk}(A) = n$.
- For all $A \in \mathbb{R}^{m \times n}$ and all $b \in \mathbb{R}^m$ it holds that the linear equation system $Ax = b$ can be solved if and only if $\operatorname{rk}(A) = \operatorname{rk}(A \,|\, b)$, where $A \,|\, b$ denotes the augmented system.
- For $A \in \mathbb{R}^{m \times n}$, the subspace of solutions of $Ax = 0$ possesses dimension $n - \operatorname{rk}(A)$. Later, we will call this subspace the kernel or the null space.
- A matrix $A \in \mathbb{R}^{m \times n}$ has full rank if its rank equals the largest possible rank for a matrix of the same dimensions. This means that the rank of a full-rank matrix is the lesser of the number of rows and columns, i.e., $\operatorname{rk}(A) = \min(m, n)$. A matrix is said to be rank deficient if it does not have full rank.
♦

Example 2.18 (Rank)
\[
A = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{bmatrix}.
\]
$A$ has two linearly independent rows/columns so that $\operatorname{rk}(A) = 2$.
\[
A = \begin{bmatrix} 1 & 2 & 1 \\ -2 & -3 & 1 \\ 3 & 5 & 0 \end{bmatrix}.
\]
We use Gaussian elimination to determine the rank:
\[
\begin{bmatrix} 1 & 2 & 1 \\ -2 & -3 & 1 \\ 3 & 5 & 0 \end{bmatrix}
\rightsquigarrow \cdots \rightsquigarrow
\begin{bmatrix} 1 & 2 & 1 \\ 0 & 1 & 3 \\ 0 & 0 & 0 \end{bmatrix}. \tag{2.84}
\]
Here, we see that the number of linearly independent rows and columns is 2, such that $\operatorname{rk}(A) = 2$.
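Both ranks can be checked numerically with NumPy; note that np.linalg.matrix_rank computes a numerical rank via the singular value decomposition rather than by Gaussian elimination:

```python
import numpy as np

A1 = np.array([[1, 0, 1],
               [0, 1, 1],
               [0, 0, 0]])

A2 = np.array([[ 1,  2, 1],
               [-2, -3, 1],
               [ 3,  5, 0]])

print(np.linalg.matrix_rank(A1))    # 2
print(np.linalg.matrix_rank(A2))    # 2
# Column rank equals row rank: rk(A) == rk(A^T).
print(np.linalg.matrix_rank(A2.T))  # 2
```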