Exercise 1
Let $ S \subseteq V $ be a subset of vectors in a vector space $ V $. A finite subset $ S' \subseteq S $ is maximally linearly independent in $ S $ if
- $ S' $ is linearly independent, and
- for any $ v \in S \setminus S' $, the set $ S' \cup \{v\} $ is not linearly independent.
Prove that:
(i) $ S' $ is maximally linearly independent in $ S $ if and only if $ S' $ (viewed as a sequence of vectors) is a basis for $ \operatorname{span}(S) $.
(ii) A finite subset $ S \subseteq V $ constitutes a basis for $ V $ if and only if $ S $ is maximally linearly independent in $ V $.
(iii) Every finite $ S $ must have a maximally linearly independent subset $ S' \subseteq S $ (without assuming that $ V $ is finite-dimensional).
Proof:
(i)
If $ S' $ is maximally linearly independent in $ S $, then it is linearly independent. For any $ v \in S \setminus S' $, $ S' \cup \{v\} $ is dependent, so $ v \in \operatorname{span}(S') $. Thus, $ \operatorname{span}(S) = \operatorname{span}(S') $, making $ S' $ a basis for $ \operatorname{span}(S) $.
Conversely, if $ S' $ is a basis for $ \operatorname{span}(S) $, it is linearly independent. For any $ v \in S \setminus S' $, $ v \in \operatorname{span}(S) = \operatorname{span}(S') $, so $ S' \cup \{v\} $ is dependent.
(ii)
If $ S $ is a finite basis for $ V $, it is linearly independent and spans $ V $. For any $ v \in V \setminus S $, $ v \in \operatorname{span}(S) $, so $ S \cup \{v\} $ is dependent, making $ S $ maximally linearly independent in $ V $.
Conversely, if $ S $ is maximally linearly independent in $ V $, it is linearly independent. For any $ v \in V \setminus S $, $ S \cup \{v\} $ is dependent, so $ v \in \operatorname{span}(S) $, hence $ \operatorname{span}(S) = V $, making $ S $ a basis.
(iii)
Start with the empty set, which is linearly independent. Iteratively add vectors from $ S $ that preserve linear independence. Since $ S $ is finite, this process terminates after at most $ |S| $ steps, at which point no vector of $ S $ outside the current subset can be added without creating a dependence. The resulting finite subset is therefore maximally linearly independent in $ S $.
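This greedy procedure can be sketched in code. The helpers below are hypothetical (not from the exercise); they use exact rational arithmetic to avoid floating-point issues, testing independence by Gaussian elimination:

```python
from fractions import Fraction

def is_independent(vectors):
    """Test linear independence of a list of vectors (tuples of numbers)
    by Gaussian elimination over the rationals: independent iff the
    number of pivots equals the number of vectors."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    ncols = len(rows[0]) if rows else 0
    rank = 0
    for col in range(ncols):
        # find a pivot in this column at or below row `rank`
        pivot = next((r for r in range(rank, len(rows)) if rows[r][col] != 0), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(rank + 1, len(rows)):
            factor = rows[r][col] / rows[rank][col]
            rows[r] = [a - factor * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank == len(rows)

def maximal_independent_subset(S):
    """Greedy construction from the proof of (iii): start from the empty
    set and keep each vector of S that preserves independence."""
    S_prime = []
    for v in S:
        if is_independent(S_prime + [v]):
            S_prime.append(v)
    return S_prime
```

For instance, `maximal_independent_subset([(1, 0), (2, 0), (0, 1), (1, 1)])` returns `[(1, 0), (0, 1)]`: the second and fourth vectors are rejected because each would create a dependence.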
Exercise 2
Let $ u_1, \ldots, u_n, v, w \in V $ be linearly dependent. Assume that $ u_1, \ldots, u_n $ are linearly independent. Then one of the following holds.
(i) $ v $ is a linear combination of $ u_1, \ldots, u_n $.
(ii) $ w $ is a linear combination of $ u_1, \ldots, u_n $.
(iii) $ u_1, \ldots, u_n, v $ are all linear combinations of $ u_1, \ldots, u_n, w $, and vice versa.
Since $ \{u_1, \ldots, u_n, v, w\} $ is dependent, there exist scalars $ a_1, \ldots, a_n, b, c $, not all zero, such that $ \sum a_i u_i + b v + c w = 0 $. If $ b = c = 0 $, then some $ a_i \neq 0 $, contradicting the independence of the $ u_i $. Thus $ b \neq 0 $ or $ c \neq 0 $ (or both).
- If $ b \neq 0 $ and $ c = 0 $, then $ v = -\frac{1}{b} \sum a_i u_i $, so (i) holds.
- If $ b = 0 $ and $ c \neq 0 $, then $ w = -\frac{1}{c} \sum a_i u_i $, so (ii) holds.
- If $ b \neq 0 $ and $ c \neq 0 $, then $ v = -\frac{1}{b} (\sum a_i u_i + c w) $ and $ w = -\frac{1}{c} (\sum a_i u_i + b v) $, so $ \operatorname{span}\{u_1, \ldots, u_n, v\} = \operatorname{span}\{u_1, \ldots, u_n, w\} $, and (iii) holds.
Exercise 3
Given $ v_1 = \begin{pmatrix} 1 \\ 1 \\ 0 \\ 1 \end{pmatrix}, \quad v_2 = \begin{pmatrix} 1 \\ 3 \\ 0 \\ 2 \end{pmatrix}, \quad v_3 = \begin{pmatrix} 2 \\ 3 \\ 2 \\ 1 \end{pmatrix}, \quad v_4 = \begin{pmatrix} -1 \\ 1 \\ 0 \\ 0 \end{pmatrix}, $
find the vector that is not a linear combination of the rest.
The vector $ v_3 $ is not a linear combination of the rest: $ v_1, v_2, v_4 $ all have third coordinate $ 0 $, so every vector in $ \operatorname{span}\{v_1, v_2, v_4\} $ has third coordinate $ 0 $, while $ v_3 $ has third coordinate $ 2 $. Moreover, $ v_3 $ is the only such vector: since $ v_2 = 2v_1 + v_4 $, each of $ v_1, v_2, v_4 $ is a linear combination of the remaining vectors.
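As a quick sanity check (plain Python, coordinates indexed from $0$), one can verify both the third-coordinate observation and the dependence relation $ v_2 = 2v_1 + v_4 $:

```python
v1, v2, v3, v4 = (1, 1, 0, 1), (1, 3, 0, 2), (2, 3, 2, 1), (-1, 1, 0, 0)

# span{v1, v2, v4} lies in the hyperplane "third coordinate = 0",
# while v3 has third coordinate 2, so v3 is not in that span
assert v1[2] == v2[2] == v4[2] == 0 and v3[2] == 2

# the other three vectors are combinations of each other: v2 = 2*v1 + v4
assert all(y == 2 * x + z for x, y, z in zip(v1, v2, v4))
```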
Exercise 4
Let $ V $ be finite-dimensional. Under what condition is the basis of $ V $ unique?
Over $ \mathbb{R} $, the basis is unique if and only if $ \dim V = 0 $, i.e., $ V = \{0\} $, whose unique basis is the empty set. If $ \dim V \geq 1 $, replacing a basis vector $ v $ by $ cv $ for any scalar $ c \neq 0, 1 $ yields a different basis, so there are infinitely many bases.
Exercise 5
Consider the vector space $ \mathbb{R}^4 $. Construct a basis containing the following two vectors.
$ (1, 1, 0, 1), \ (10, 7, 2, 3). $
One such basis is $ \{(1,1,0,1), (10,7,2,3), (0,0,1,0), (0,0,0,1)\} $.
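To verify this, it suffices to check that the $ 4 \times 4 $ matrix with these four vectors as columns has nonzero determinant. A minimal exact-arithmetic sketch (the `det` helper is hypothetical, not a library function):

```python
from fractions import Fraction

def det(M):
    """Determinant of a square matrix via exact Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in M]
    n, sign, result = len(M), 1, Fraction(1)
    for i in range(n):
        pivot = next((r for r in range(i, n) if M[r][i] != 0), None)
        if pivot is None:
            return Fraction(0)  # singular matrix
        if pivot != i:
            M[i], M[pivot] = M[pivot], M[i]
            sign = -sign  # row swap flips the sign
        result *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            M[r] = [a - f * b for a, b in zip(M[r], M[i])]
    return sign * result

# columns: the two given vectors, then e3 and e4
A = [[1, 10, 0, 0],
     [1,  7, 0, 0],
     [0,  2, 1, 0],
     [1,  3, 0, 1]]
assert det(A) == -3  # nonzero, so the four columns form a basis of R^4
```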
Exercise 6
Let $ V $ be a finite dimensional vector space. Let $ W $ be a subspace of $ V $. Prove that $ W \subsetneq V \iff \dim(W) < \dim(V) $.
Here, $ W \subsetneq V $ means that $ W $ is a proper subset of $ V $, i.e., $ W \subseteq V $ but $ W \neq V $.
Note first that $ \dim W \leq \dim V $, since a basis of $ W $ is a linearly independent set in $ V $ and extends to a basis of $ V $.
If $ \dim W < \dim V $, then $ W \neq V $, since a vector space has a well-defined dimension; as $ W \subseteq V $, this gives $ W \subsetneq V $.
Conversely, if $ W \subsetneq V $, suppose for contradiction that $ \dim W = \dim V $. Then extending a basis of $ W $ to a basis of $ V $ requires no additional vectors, so a basis of $ W $ already spans $ V $, implying $ W = V $, a contradiction. Thus $ \dim W < \dim V $.
Exercise 7
Let $ A $ be an $ m \times n $ matrix with $ m < n $.
(i) Prove that the column vectors of $ A $ are linearly dependent.
(ii) Use (i) to show that for every $ b \in \mathbb{R}^m $ the system of linear equations $ Ax = b $ either has no solution or has infinitely many solutions.
(iii) Prove that the column rank of $ A $ is $ m $ if and only if for every $ b \in \mathbb{R}^m $ the system $ Ax = b $ has infinitely many solutions.
(i)
The $ n $ columns are vectors in $ \mathbb{R}^m $ with $ n > m = \dim \mathbb{R}^m $, so they are linearly dependent.
(ii)
By (i), there is a nonzero $ x \in \mathbb{R}^n $ with $ Ax = 0 $, so $ \dim \ker A \geq 1 $. If $ Ax = b $ has a solution $ x_0 $, the full solution set is the affine subspace $ x_0 + \ker A $ of dimension $ \dim \ker A \geq 1 $, which is infinite. Hence the system has either no solution or infinitely many solutions.
(iii)
If the column rank is $ m $, the span of the columns is all of $ \mathbb{R}^m $, so $ Ax = b $ is consistent for every $ b $; by (ii), it then has infinitely many solutions.
Conversely, if $ Ax = b $ has infinitely many solutions for every $ b $, then every $ b $ lies in the column space, so the column space is $ \mathbb{R}^m $ and the column rank is $ m $.
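To make (iii) concrete, here is a small illustrative example of my own (not from the exercise): a $ 2 \times 3 $ matrix of column rank $ 2 $, for which every right-hand side admits a one-parameter family of solutions.

```python
# A is 2x3 with column rank 2: its first two columns already span R^2,
# so Ax = b is consistent for every b in R^2.
A = [[1, 0, 1],
     [0, 1, 1]]

def matvec(A, x):
    """Matrix-vector product with plain Python lists."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

b = [5, 7]
# x(t) = (b1 - t, b2 - t, t) solves Ax = b for every t: a one-parameter,
# hence infinite, family of solutions, matching dim ker A = 1
for t in (0, 1, 100):
    assert matvec(A, [b[0] - t, b[1] - t, t]) == b
```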
Exercise 8
Let $ A $ be an $ m \times n $ matrix. Prove that there is an $ n \times m $ matrix $ B $ such that $ BA = I $ if and only if the row rank of $ A $ is $ n $.
If $ BA = I_n $, then $ Ax = 0 $ implies $ x = BAx = B0 = 0 $, so the columns of $ A $ are linearly independent and the column rank of $ A $ is $ n $. Conversely, if the column rank is $ n $, the map $ x \mapsto Ax $ is injective, so a left inverse $ B $ exists: define $ B $ on the column space by $ Ax \mapsto x $ and extend linearly to all of $ \mathbb{R}^m $. Since row rank equals column rank, both conditions are equivalent to the row rank of $ A $ being $ n $.
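As a concrete illustration (a hand-picked example, not the general construction), take a $ 3 \times 2 $ matrix of rank $ 2 $; reading off the first two coordinates gives a left inverse:

```python
# A has full column rank (its first two rows form I_2), so a left
# inverse exists; here B simply projects onto the first two coordinates.
A = [[1, 0],
     [0, 1],
     [1, 1]]
B = [[1, 0, 0],
     [0, 1, 0]]

def matmul(X, Y):
    """Matrix product with plain Python lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

assert matmul(B, A) == [[1, 0], [0, 1]]  # B A = I_2
```

Note that the inverse is one-sided: $ AB \neq I_3 $, consistent with the fact that a $ 3 \times 2 $ matrix cannot have a right inverse.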