8.8 Vector Space

A vector space is a set of vectors that is closed under vector addition and scalar multiplication. The vectors must also satisfy the usual axioms (eight in all), such as associativity and commutativity of addition. We will focus on real vector spaces in \(\mathbb{R}^k\).

The vector \(\mathbf{y} = a_1\mathbf{x_1}+a_2\mathbf{x_2}+\cdots+a_k\mathbf{x_k}\) is a linear combination of the vectors \(\mathbf{x_1},\mathbf{x_2},...,\mathbf{x_k}\). The set of all linear combinations of \(\mathbf{x_1},\mathbf{x_2},...,\mathbf{x_k}\) is their linear span.
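As a concrete sketch of the definition above, the following snippet forms a linear combination of two vectors in \(\mathbb{R}^3\) (the vectors and coefficients here are illustrative, not taken from the text):

```python
import numpy as np

# Two example vectors in R^3 (chosen for illustration).
x1 = np.array([1.0, 0.0, 2.0])
x2 = np.array([0.0, 1.0, 1.0])

# y = a1*x1 + a2*x2 is a linear combination of x1 and x2.
a1, a2 = 3.0, -2.0
y = a1 * x1 + a2 * x2
print(y)  # [ 3. -2.  4.]
```

The linear span of \(\mathbf{x_1}\) and \(\mathbf{x_2}\) is the set of all vectors obtainable this way as \(a_1, a_2\) range over the reals.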

A set of vectors is said to be linearly dependent if one of the vectors in the set can be written as a linear combination of the others. Equivalently, there exist \(k\) numbers \(a_1,a_2,...,a_k\), not all zero, such that \[a_1\mathbf{x_1}+a_2\mathbf{x_2}+\cdots+a_k\mathbf{x_k} = \mathbf{0}.\] If the only choice of \(a_1,...,a_k\) satisfying this equation is \(a_1=\cdots=a_k=0\), then the vectors are said to be linearly independent.
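One practical way to test this condition numerically is via matrix rank: stacking the vectors as columns, the set is linearly dependent exactly when the rank is less than the number of vectors. A minimal sketch with made-up example vectors:

```python
import numpy as np

x1 = np.array([1.0, 2.0, 3.0])
x2 = np.array([2.0, 4.0, 6.0])   # x2 = 2*x1, so the set is dependent
x3 = np.array([0.0, 1.0, 0.0])

# Columns of X are the vectors; dependence <=> rank < number of vectors.
X = np.column_stack([x1, x2, x3])
rank = np.linalg.matrix_rank(X)
print(rank < X.shape[1])  # True: the set is linearly dependent
```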

  • Theorem: For a vector space \(V\), a set of vectors is linearly dependent if and only if the matrix whose columns are those vectors is singular (see below for the definition of singularity).

  • Basis: A basis is a set of vectors that spans the whole vector space (i.e. any vector in the vector space can be written as a linear combination of basis elements) and is linearly independent.
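Tying the two bullets together: if the matrix whose columns are candidate basis vectors is non-singular, those vectors are linearly independent and span \(\mathbb{R}^k\), so any vector has unique coordinates in that basis, found by solving a linear system. A sketch with an assumed example basis of \(\mathbb{R}^3\):

```python
import numpy as np

# Candidate basis of R^3 (illustrative choice); they form a basis iff
# the matrix with them as columns is non-singular.
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([1.0, 1.0, 0.0])
b3 = np.array([1.0, 1.0, 1.0])
B = np.column_stack([b1, b2, b3])
assert abs(np.linalg.det(B)) > 1e-12  # non-singular => a basis

# Any v in R^3 is then a unique combination a1*b1 + a2*b2 + a3*b3;
# solving B a = v recovers the coefficients.
v = np.array([2.0, 3.0, 5.0])
a = np.linalg.solve(B, v)
print(a)  # [-1. -2.  5.]
```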