Exercise 1
Give a basis for the vector space $ M_{m\times n}(\mathbb{R}) $ consisting of all $ m \times n $ matrices.
A basis consists of the matrices $ E_{ij} $ for $ 1 \leq i \leq m $ and $ 1 \leq j \leq n $, where $ E_{ij} $ has a 1 in the $ (i,j) $-th position and 0 elsewhere.
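For instance, in the case $ m = n = 2 $, every matrix decomposes uniquely in this basis:
$$ \begin{pmatrix} a & b \\ c & d \end{pmatrix} = a E_{11} + b E_{12} + c E_{21} + d E_{22}, $$
and a combination $ \sum_{i,j} c_{ij} E_{ij} $ equals the zero matrix only if every $ c_{ij} = 0 $. The same argument works for general $ m, n $, so $ \dim M_{m \times n}(\mathbb{R}) = mn $.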
Exercise 2
Give a basis for $ V = \mathbb{R}^+ = \{x \mid x \in \mathbb{R} \text{ with } x > 0\} $ with $ x \oplus x' = x \times x' $ and $ c \otimes x = x^c $.
This is a one-dimensional vector space. A basis is $ \{e\} $, where $ e $ is the base of the natural logarithm, since any $ x > 0 $ can be written as $ (\ln x) \otimes e = e^{\ln x} = x $, and the “zero” element is 1.
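To see that $ \{e\} $ is also linearly independent, note that the zero vector of $ V $ is $ 1 $ and
$$ c \otimes e = e^{c} = 1 \iff c = 0. $$
More generally, any $ b > 0 $ with $ b \neq 1 $ gives a basis $ \{b\} $, since $ x = (\log_b x) \otimes b $ for every $ x > 0 $.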
Exercise 3
Is there a vector space $ V $ such that
(i) $ |V| > 1 $, i.e., $ V \neq \{0\} $, and
(ii) $ |V| $ is finite, i.e., it contains only finitely many vectors?
No such vector space exists over $ \mathbb{R} $. If $ V $ contains a nonzero vector $ v $, then the scalar multiples $ \{c v \mid c \in \mathbb{R}\} $ form an infinite set: if $ c v = c' v $ with $ c \neq c' $, then $ (c - c') v = 0 $ and hence $ v = 0 $, a contradiction. So the map $ c \mapsto c v $ is injective, and $ V $ contains infinitely many vectors, contradicting (ii).
Exercise 4
Find the largest possible number of linearly independent vectors among
$ v_1 = \begin{pmatrix} 1 \\ -1 \\ 0 \\ 0 \end{pmatrix}, \quad v_2 = \begin{pmatrix} 1 \\ 0 \\ -1 \\ 0 \end{pmatrix}, \quad v_3 = \begin{pmatrix} 1 \\ 0 \\ 0 \\ -1 \end{pmatrix}, \quad v_4 = \begin{pmatrix} 0 \\ 1 \\ -1 \\ 0 \end{pmatrix}, \quad v_5 = \begin{pmatrix} 0 \\ 1 \\ 0 \\ -1 \end{pmatrix}, \quad v_6 = \begin{pmatrix} 0 \\ 0 \\ 1 \\ -1 \end{pmatrix}. $
These vectors all lie in the three-dimensional subspace of $ \mathbb{R}^4 $ consisting of vectors whose components sum to zero, so at most three of them can be linearly independent. Since $ v_1, v_2, v_3 $ are linearly independent (as checked below), the largest number of linearly independent vectors is 3.
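To verify that $ v_1, v_2, v_3 $ are indeed linearly independent, suppose $ a v_1 + b v_2 + c v_3 = 0 $. Comparing the last three components gives
$$ -a = 0, \qquad -b = 0, \qquad -c = 0, $$
so $ a = b = c = 0 $.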
Exercise 5
Find the null space $ N(A) $ and column space $ C(A) $ for the following matrices and give a basis for $ N(A) $ and $ C(A) $ respectively.
(i) $ A = \begin{pmatrix} 2 & 3 & 4 & 5 & 1 \\ 0 & 1 & 0 & 1 & 0 \end{pmatrix} $
(ii) $ A = \begin{pmatrix} 1 & 0 & 2 & 3 \\ 2 & 2 & 4 & 5 \\ 1 & 2 & 4 & 3 \\ 5 & 6 & 16 & 15 \\ 4 & 4 & 10 & 11 \end{pmatrix} $
(i)
The null space $ N(A) $ has dimension 3, and a basis is $ \left\{ \begin{pmatrix} 1 \\ 1 \\ 0 \\ -1 \\ 0\end{pmatrix}, \begin{pmatrix}-2 \\ 0 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix}1 \\ 0 \\ 0 \\ 0 \\ -2\end{pmatrix} \right\} $.
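One way to obtain such a basis for $ N(A) $: the pivots of $ A $ are in columns 1 and 2, so $ x_3, x_4, x_5 $ are free, and solving the two equations for the pivot variables gives
$$ x_2 = -x_4, \qquad x_1 = -2x_3 - x_4 - \tfrac{1}{2} x_5. $$
The three vectors listed above correspond (in order) to the free-variable choices $ (x_3, x_4, x_5) = (0,-1,0),\ (1,0,0),\ (0,0,-2) $, which are linearly independent; since $ \dim N(A) = 5 - \operatorname{rank}(A) = 5 - 2 = 3 $, they form a basis.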
The column space $ C(A) $ is a subspace of $ \mathbb{R}^2 $. Since the first two columns $ \begin{pmatrix} 2 \\ 0 \end{pmatrix} $ and $ \begin{pmatrix} 3 \\ 1 \end{pmatrix} $ are linearly independent, $ C(A) = \mathbb{R}^2 $, and a basis is $ \left\{ \begin{pmatrix} 2 \\ 0 \end{pmatrix}, \begin{pmatrix} 3 \\ 1 \end{pmatrix} \right\} $.
(ii)
The null space $ N(A) $ has dimension 1, and a basis is $ \left\{ \begin{pmatrix} -4 \\ 1 \\ -1 \\ 2 \end{pmatrix} \right\} $.
The column space $ C(A) $ has dimension 3, and a basis is $ \left\{ \begin{pmatrix} 1 \\ 2 \\ 1 \\ 5 \\ 4 \end{pmatrix}, \begin{pmatrix} 0 \\ 2 \\ 2 \\ 6 \\ 4 \end{pmatrix}, \begin{pmatrix} 2 \\ 4 \\ 4 \\ 16 \\ 10 \end{pmatrix} \right\} $.
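One way to arrive at these answers: row reduction brings $ A $ to an echelon form
$$ A \longrightarrow \begin{pmatrix} 1 & 0 & 2 & 3 \\ 0 & 2 & 0 & -1 \\ 0 & 0 & 2 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}, $$
which has three pivots, so $ \operatorname{rank}(A) = 3 $, $ \dim N(A) = 4 - 3 = 1 $, and the pivot columns 1, 2, 3 of the original $ A $ give the basis of $ C(A) $ above. Back-substitution with the free variable $ x_4 = 2 $ gives $ x_3 = -1 $, $ x_2 = 1 $, $ x_1 = -4 $, i.e., the basis vector of $ N(A) $ above.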
Exercise 6
Find all $ k \in \mathbb{R} $ such that the sequence $ v_1 - k v_2 $, $ v_2 - k v_3 $, $ v_3 - k v_4 $, $ v_4 - k v_1 $ is linearly independent, where
$ v_1 = \begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix}, \quad v_2 = \begin{pmatrix} 0 \\ 1 \\ 1 \\ 1 \end{pmatrix}, \quad v_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \\ 1 \end{pmatrix}, \quad v_4 = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix}. $
We compute
$$ \begin{align*} w_{1} &= v_{1} - kv_{2} = \begin{pmatrix} 1 \\ 1-k \\ 1-k \\ 1-k \end{pmatrix}, & w_{2} &= v_{2} - kv_{3} = \begin{pmatrix} 0 \\ 1 \\ 1-k \\ 1-k \end{pmatrix}, \\ w_{3} &= v_{3} - kv_{4} = \begin{pmatrix} 0 \\ 0 \\ 1 \\ 1-k \end{pmatrix}, & w_{4} &= v_{4} - kv_{1} = \begin{pmatrix} -k \\ -k \\ -k \\ 1-k \end{pmatrix}, \end{align*} $$
and collect these vectors as the columns of
$$ M = \begin{pmatrix} w_{1} & w_{2} & w_{3} & w_{4} \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 & -k \\ 1-k & 1 & 0 & -k \\ 1-k & 1-k & 1 & -k \\ 1-k & 1-k & 1-k & 1-k \end{pmatrix}. $$
Then we calculate the determinant of the matrix $ M $, which is
$$ \det(M) = 1 - k^{4}. $$
So the sequence is linearly independent if and only if $ k^4 \neq 1 $, i.e., $ k \neq \pm 1 $.
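One way to carry out this determinant calculation: write $ M = \begin{pmatrix} v_1 & v_2 & v_3 & v_4 \end{pmatrix} C $, where $ C $ is the matrix whose $ j $-th column holds the coefficients of $ w_j $ with respect to $ v_1, \ldots, v_4 $. Since $ \begin{pmatrix} v_1 & v_2 & v_3 & v_4 \end{pmatrix} $ is lower triangular with $ 1 $'s on the diagonal, its determinant is $ 1 $, and expanding $ \det(C) $ along its first row gives
$$ \det(M) = \det(C) = \det \begin{pmatrix} 1 & 0 & 0 & -k \\ -k & 1 & 0 & 0 \\ 0 & -k & 1 & 0 \\ 0 & 0 & -k & 1 \end{pmatrix} = 1 \cdot 1 + (-k) \cdot (-1)^{1+4} \cdot (-k)^{3} = 1 - k^{4}, $$
where the two nonzero terms come from the entries $ 1 $ and $ -k $ in the first row, and the corresponding $ 3 \times 3 $ minors are triangular with determinants $ 1 $ and $ (-k)^{3} $.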
Exercise 7
Let $ V $ be a vector space and $ S_1, S_2 \subseteq V $. Prove that:
(i) If $ S_1 \subseteq S_2 $, then $ \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $.
(ii) $ \operatorname{span}(S_1) = \operatorname{span}(\operatorname{span}(S_1)) $.
(iii) If $ S_1 \subseteq \operatorname{span}(S_2) $, then $ \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $.
Proof:
(i)
Since $ S_1 \subseteq S_2 $, any linear combination of elements of $ S_1 $ is in particular a linear combination of elements of $ S_2 $. Thus, $ \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $.
(ii)
Elements of $ \operatorname{span}(\operatorname{span}(S_1)) $ are linear combinations of elements of $ \operatorname{span}(S_1) $, which are themselves linear combinations of elements of $ S_1 $; a linear combination of such combinations is again a linear combination of elements of $ S_1 $ (see the computation after (iii)). So $ \operatorname{span}(\operatorname{span}(S_1)) \subseteq \operatorname{span}(S_1) $. The reverse inclusion is immediate, since every set is contained in its span.
(iii)
Each element of $ S_1 $ is a linear combination from $ S_2 $, so any linear combination from $ S_1 $ is a linear combination of linear combinations from $ S_2 $, hence a linear combination from $ S_2 $. Thus, $ \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $.
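Parts (ii) and (iii) both rely on the fact that a linear combination of linear combinations is again a linear combination. Explicitly, if $ u_1, \ldots, u_m \in \operatorname{span}(S) $, write each $ u_i = \sum_{j=1}^{n} a_{ij} s_j $ over a common finite list $ s_1, \ldots, s_n \in S $ (padding with zero coefficients where necessary); then for any scalars $ b_1, \ldots, b_m $,
$$ \sum_{i=1}^{m} b_i u_i = \sum_{i=1}^{m} b_i \sum_{j=1}^{n} a_{ij} s_j = \sum_{j=1}^{n} \left( \sum_{i=1}^{m} b_i a_{ij} \right) s_j \in \operatorname{span}(S). $$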
Exercise 8
Let $ V $ be a vector space and $ v, e_1, \ldots, e_n \in V $. We have already used the following terminology in our class. We say that $ v $ can be represented by $ e_1, \ldots, e_n $ if $ v $ can be written as a linear combination of $ e_1, \ldots, e_n $, or equivalently
$ v \in \operatorname{span}(\{e_1, \ldots, e_n\}) $.
This can be extended as follows. Let $ v \in V $ and $ S, S_1, S_2 \subseteq V $. We say that:
$ v $ can be represented by $ S $ if $ v \in \operatorname{span}(S) $.
$ S_1 $ can be represented by $ S_2 $ if every $ v \in S_1 $ can be represented by $ S_2 $.
Prove that:
(i) $ S_1 $ can be represented by $ S_2 $ if and only if $ \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $.
(ii) If $ S $ can be represented by $ S_1 $, and $ S_1 $ can be represented by $ S_2 $, then $ S $ can be represented by $ S_2 $.
(iii) If $ v \in S $ and $ v $ can be represented by $ S \setminus \{v\} $, then
$ \operatorname{span}(S \setminus \{v\}) = \operatorname{span}(S) $.
Proof:
(i)
By definition, “$ S_1 $ can be represented by $ S_2 $” means $ S_1 \subseteq \operatorname{span}(S_2) $. If this holds, then $ \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $ by Exercise 7(iii). Conversely, if $ \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $, then $ S_1 \subseteq \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $, since every set is contained in its span.
(ii)
If $ S $ is represented by $ S_1 $, then $ S \subseteq \operatorname{span}(S_1) $. If $ S_1 $ is represented by $ S_2 $, then $ \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $ by (i). Thus, $ S \subseteq \operatorname{span}(S_1) \subseteq \operatorname{span}(S_2) $, so $ S $ is represented by $ S_2 $.
(iii)
Let $ T = S \setminus \{v\} $. Since $ T \subseteq S $, Exercise 7(i) gives $ \operatorname{span}(T) \subseteq \operatorname{span}(S) $. Conversely, every element of $ T $ lies in $ \operatorname{span}(T) $, and $ v \in \operatorname{span}(T) $ by assumption, so $ S = T \cup \{v\} \subseteq \operatorname{span}(T) $; by Exercise 7(iii) this yields $ \operatorname{span}(S) \subseteq \operatorname{span}(T) $. Hence $ \operatorname{span}(S \setminus \{v\}) = \operatorname{span}(S) $.
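For a concrete illustration, take $ V = \mathbb{R}^2 $, $ S = \left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\} $ and $ v = \begin{pmatrix} 1 \\ 1 \end{pmatrix} $. Then $ v $ can be represented by $ S \setminus \{v\} $, and indeed
$$ \operatorname{span}(S \setminus \{v\}) = \mathbb{R}^2 = \operatorname{span}(S). $$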