A set of vectors [math]\left\{w_1,w_2,\ldots,w_p\right\}[/math] in [math]\mathbb{R}^n[/math] is said to be an [b]orthogonal set[/b] if [math]w_i\perp w_j[/math] whenever [math]i\ne j[/math].[br][br]Orthogonal sets enjoy some very nice properties. The following is one of them:[br][br][u]Theorem[/u]: Any orthogonal set of nonzero vectors is a linearly independent set.[br][br]Proof: Let [math]\left\{w_1,w_2,\ldots,w_p\right\}[/math] be an orthogonal set of nonzero vectors in [math]\mathbb{R}^n[/math]. Suppose there exist real numbers [math]c_1,c_2,\ldots,c_p[/math] such that [br][br][math]c_1w_1+c_2w_2+\cdots+c_pw_p=0[/math][br][br]Taking the inner product of both sides with [math]w_i[/math], for each [math]i=1,2,\ldots,p[/math], we get[br][br][math]c_1(w_1\cdot w_i)+c_2(w_2\cdot w_i)+\cdots+c_i(w_i\cdot w_i)+\cdots+c_p(w_p\cdot w_i)=0\cdot w_i=0[/math][br][br]By the definition of an orthogonal set, [math]w_j\cdot w_i=0[/math] whenever [math]i\ne j[/math]. Therefore, we have[br][br][math]c_i(w_i\cdot w_i)=0[/math][br][br]Since [math]w_i\ne 0[/math], we have [math]w_i\cdot w_i=\left\|w_i\right\|^2>0[/math], and hence [math]c_i=0[/math] for [math]i=1,2,\ldots,p[/math]. This shows that the orthogonal set is a linearly independent set.[br][br][br][u]Definition[/u]: An [b]orthogonal basis[/b] for a subspace [math]W[/math] of [math]\mathbb{R}^n[/math] is a basis for [math]W[/math] that is also an orthogonal set.[br][br]An orthogonal basis is more convenient to use than other bases because when a vector is expressed as a linear combination of an orthogonal basis, the weights can be computed easily using inner products.[br][br][u]Theorem[/u]: Let [math]\left\{w_1,w_2,\ldots,w_p\right\}[/math] be an orthogonal basis for a subspace [math]W[/math] of [math]\mathbb{R}^n[/math]. For any [math]v[/math] in [math]W[/math] such that [br][br][math]v=c_1w_1+c_2w_2+\cdots+c_pw_p[/math],[br][br]the weights can be computed by the formula: [math]c_j=\frac{v\cdot w_j}{w_j \cdot w_j}[/math], for [math]j=1,2,\ldots,p[/math].[br]
Prove the above formula for [math]c_j[/math]. (Hint: Use the same idea as in the proof of the previous theorem).
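The weight formula is also easy to check numerically. The following is a minimal Python sketch (not a proof) using a hand-picked orthogonal basis of [math]\mathbb{R}^3[/math] and a hypothetical vector [math]v[/math]; the specific vectors are illustrative choices, not from the text.

```python
# Numerical sanity check of the weight formula c_j = (v . w_j) / (w_j . w_j).
# The basis vectors and v below are hypothetical examples.

def dot(u, v):
    """Standard inner product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

# An orthogonal set: all pairwise dot products are zero.
w1 = [1.0, 1.0, 0.0]
w2 = [1.0, -1.0, 0.0]
w3 = [0.0, 0.0, 2.0]
basis = [w1, w2, w3]

v = [3.0, 5.0, -2.0]

# Weights from the formula, one per basis vector.
c = [dot(v, w) / dot(w, w) for w in basis]

# Reconstruct v as c_1 w_1 + c_2 w_2 + c_3 w_3.
recon = [sum(cj * w[i] for cj, w in zip(c, basis)) for i in range(3)]

print(c)      # [4.0, -1.0, -1.0]
print(recon)  # [3.0, 5.0, -2.0]  -- recovers v exactly
```

Note that computing the weights this way needs only three dot products per coefficient, instead of solving a linear system as a general basis would require.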
We take a closer look at the formula for the weights: [math]c_j=\frac{v\cdot w_j}{w_j \cdot w_j}[/math], for [math]j=1,2,\ldots,p[/math]. Let [math]\hat{w}_j=\frac1{\|w_j\|}w_j[/math] be the unit vector in the direction of [math]w_j[/math] and [math]L_j[/math] be the line containing [math]w_j[/math]. In other words, [math]L_j=\text{Span}\{w_j\}[/math]. Recall the projection map [math]P_{\hat{w}_j}(v)=\hat{w}_j\cdot v[/math], which gives the signed distance from the origin to the point obtained by projecting the arrowhead of [math]v[/math] onto [math]L_j[/math] [b]orthogonally[/b]. Hence, [math]P_{\hat{w}_j}(v)\hat{w}_j[/math] is exactly the vector from the origin to that projection point. This vector is called the [b]orthogonal projection of [math]v[/math] onto [math]L_j[/math][/b], denoted by [math]\text{proj}_{L_j}v[/math]. Then we have[br][br][math]\text{proj}_{L_j}v=(\hat{w}_j\cdot v)\hat{w}_j=\left(\frac1{\|w_j\|}w_j\cdot v\right)\left(\frac1{\|w_j\|}w_j\right)=\left(\frac{w_j\cdot v}{\|w_j\|^2}\right)w_j=\left(\frac{w_j\cdot v}{w_j\cdot w_j}\right)w_j=c_jw_j[/math][br][br]In other words, the equation [math]v=c_1w_1+c_2w_2+\cdots+c_pw_p[/math] expresses [math]v[/math] as the sum of the orthogonal projections of [math]v[/math] onto the lines [math]L_j[/math] for [math]j=1,2,\ldots,p[/math].[br][br]In the applet below, you can change the vector [math]v[/math] and the orthogonal basis [math]\left\{w_1,w_2,w_3\right\}[/math] and see the orthogonal projection of [math]v[/math] onto the line spanned by [math]w_j[/math] for [math]j=1,2,3[/math].[br]
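The decomposition into projections can also be sketched in code. The short Python example below (with the same kind of hypothetical orthogonal basis as before) computes each [math]\text{proj}_{L_j}v[/math] via [math]\left(\frac{w_j\cdot v}{w_j\cdot w_j}\right)w_j[/math] and confirms that the projections sum to [math]v[/math].

```python
# Sketch: v equals the sum of its orthogonal projections onto the lines
# L_j = Span{w_j} of an orthogonal basis. Vectors are hypothetical examples.

def dot(u, v):
    """Standard inner product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

def proj_onto_line(v, w):
    """Orthogonal projection of v onto Span{w}: ((w . v)/(w . w)) w."""
    t = dot(w, v) / dot(w, w)
    return [t * wi for wi in w]

# Orthogonal basis of R^3 and a sample vector v.
w1, w2, w3 = [1.0, 1.0, 0.0], [1.0, -1.0, 0.0], [0.0, 0.0, 2.0]
v = [3.0, 5.0, -2.0]

projections = [proj_onto_line(v, w) for w in (w1, w2, w3)]
total = [sum(p[i] for p in projections) for i in range(3)]

print(projections)  # [[4.0, 4.0, 0.0], [-1.0, 1.0, -0.0], [0.0, 0.0, -2.0]]
print(total)        # [3.0, 5.0, -2.0]  -- the sum recovers v
```

Each projection here plays the role of one term [math]c_jw_j[/math] in the linear combination, matching the identity [math]\text{proj}_{L_j}v=c_jw_j[/math] derived above.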