Linear Independence

Suppose u and v are two vectors in [math] \mathbb{R}^3 [/math] such that [math]\text{Span}\{u,v\} [/math] is a plane through the origin. Now we add another vector w into the mix. There are two possible outcomes for [math] \text{Span}\{u,v,w\} [/math]: either it is the same plane through the origin, or it is a set properly containing the plane (in this case, it is actually the whole [math]\mathbb{R}^3 [/math]). How can we tell which outcome occurs? This leads us to an important notion in linear algebra called [b]linear independence[/b].[br][br][u]First case[/u]: Suppose [math]\text{Span}\{u,v,w\}=\text{Span}\{u,v\} [/math]. This means that you cannot get any new vectors by taking linear combinations of u, v, and w. In particular, w must be in [math]\text{Span}\{u,v\} [/math], i.e. there exist [math] c_1,c_2 \in \mathbb{R} [/math] such that [math]w = c_1u+c_2v [/math].[br][br]When this happens, we say that u, v, and w are [b]linearly dependent[/b]. More generally, vectors [math]v_1,v_2,\ldots,v_p\in\mathbb{R}^n [/math] are linearly dependent if at least one vector [math] v_i [/math] is a linear combination of the remaining vectors.[br][br]However, such a definition is not very convenient to use because usually we do not know in advance which vector is a linear combination of the others. In the above example, we can rewrite the condition as [math] c_1u+c_2v+(-1)w=0 [/math], where "0" denotes the zero vector. Therefore, we use the following equivalent definition, which is much more useful:[br][br]Let [math]S=\{v_1,v_2,\ldots,v_p\}[/math] be a set of vectors in [math]\mathbb{R}^n[/math]. 
Then the set is [b]linearly dependent[/b] if there exist [math]c_1,c_2,\ldots,c_p\in\mathbb{R}[/math], at least one of which is non-zero, such that [math]c_1v_1+c_2v_2+\cdots+c_pv_p=0[/math].[br][br][b]Question[/b]: Why is the above definition equivalent to saying that some vector in [math]S[/math] is a linear combination of the others?[br][br][u]Second case[/u]: Suppose [math]\text{Span}\{u,v\} \subsetneq\text{Span}\{u,v,w\}[/math]. This means that new vectors are generated by taking linear combinations of u, v, and w. In other words, u, v, and w are [b]NOT[/b] linearly dependent. Any set that is not linearly dependent is [b]linearly independent[/b]. More generally, we have the following definition:[br][br]Let [math]S=\{v_1,v_2,\ldots,v_p\}[/math] be a set of vectors in [math]\mathbb{R}^n[/math]. Then the set is [b]linearly independent[/b] if whenever [math]c_1v_1+c_2v_2+\cdots+c_pv_p=0[/math] for some [math]c_1,c_2,\ldots,c_p\in\mathbb{R}[/math], we must have [math]c_1=c_2=\cdots=c_p=0[/math].[br][br]Heuristically speaking, a set of vectors being linearly independent means that there is no redundant vector in the set, i.e. if you throw away any one of them, the span will become smaller.[br][br]The following are some T/F questions that test your understanding of the concepts:[br]
If u, v, and w are vectors in [math]\mathbb{R}^3[/math] such that [math]\text{Span}\{u,v,w\} \subsetneq \mathbb{R}^3[/math], then [math]\{u, v, w\}[/math] is linearly dependent.
For any vector v in [math]\mathbb{R}^n[/math], [math]\{v\}[/math] is linearly independent.
Any set of vectors in [math]\mathbb{R}^n[/math] that contains the zero vector is linearly dependent.
For any vectors u and v in [math]\mathbb{R}^3[/math], if [math]\{u,v\}[/math] is linearly dependent, then either u is a scalar multiple of v or v is a scalar multiple of u.
For any vectors u and v in [math]\mathbb{R}^3[/math], if [math]\{u,v\}[/math] is linearly dependent, then [math]\text{Span}\{u,v\}[/math] is a line through the origin.
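The definitions above can also be checked numerically. Here is a small sketch (assuming Python with NumPy; the helper name is ours, not from the text): a set of vectors is linearly independent exactly when the matrix having those vectors as columns has rank equal to the number of vectors.

```python
import numpy as np

def is_linearly_independent(*vectors):
    # Stack the vectors as columns of a matrix; the set is independent
    # exactly when the rank equals the number of vectors.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = 2 * u + v                    # w = 2u + v, so {u, v, w} is dependent
z = np.array([0.0, 0.0, 1.0])    # z is outside Span{u, v}

print(is_linearly_independent(u, v))     # True: u and v span a plane
print(is_linearly_independent(u, v, w))  # False: w is in Span{u, v}
print(is_linearly_independent(u, v, z))  # True: the span is all of R^3
```

This mirrors the two cases above: adding w keeps the same plane (dependent), while adding z enlarges the span to the whole space (independent).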