Let $V$ be an inner product space and $W$ be its finite-dimensional linear subspace with orthonormal basis $\{w_1, \dots, w_n\}$
Let $V$ be an inner product space, and let $W$ be a finite-dimensional linear subspace of $V$ that has an orthonormal basis $\{w_1, \dots, w_n\}$. Define $\sigma : V \to W$ by
$$\sigma(v) = \sum_{i=1}^{n} \langle v, w_i \rangle w_i.$$
Show that $\|\sigma(v) - v\| < \|w - v\|$ for all $v \in V$ and for all $w \in W$ with $w \neq \sigma(v)$.

Take an arbitrary $v \in V$ such that $\sigma(v) \neq w$; then
\begin{align*}
\|\sigma(v) - v\| &= \biggl\| \sum_{i=1}^{n} \langle v, w_i \rangle w_i - v \biggr\| \\
&\leq \sum_{i=1}^{n} \bigl\| \langle v, w_i \rangle w_i - v \bigr\|
\end{align*}
I don't know how to use the fact that $\{w_1, \dots, w_n\}$ is an orthonormal basis for $W$. I know that means $\langle w_i, w_j \rangle = 1$ if $i = j$ and $0$ otherwise. Also, $\sigma(v) \in W$, so we should be able to represent it as $\sigma(v) = \alpha_1 w_1 + \alpha_2 w_2 + \cdots + \alpha_n w_n$, where the $\alpha_i$ are scalars. Any help would be appreciated.
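As a quick numeric sanity check of the claim (not part of the original question), here is a small NumPy sketch of $\sigma$ and the strict inequality. The ambient dimension, the subspace $W$, and the random sampling are all arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an orthonormal basis w_1, w_2 of a 2-dimensional subspace W of R^5
# via QR decomposition of a random matrix.
A = rng.standard_normal((5, 2))
Q, _ = np.linalg.qr(A)
w_basis = Q.T                      # rows are w_1, w_2

def sigma(v):
    """Orthogonal projection onto W: sum_i <v, w_i> w_i."""
    return sum(np.dot(v, w_i) * w_i for w_i in w_basis)

v = rng.standard_normal(5)
proj = sigma(v)

# For every sampled w in W with w != sigma(v), the claimed strict
# inequality ||sigma(v) - v|| < ||w - v|| should hold.
for _ in range(1000):
    w = rng.standard_normal(2) @ w_basis   # an arbitrary element of W
    if not np.allclose(w, proj):
        assert np.linalg.norm(proj - v) < np.linalg.norm(w - v)
print("projection is strictly closest in all sampled cases")
```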
Tags: linear-algebra, inner-product-space
asked Sep 10 at 20:51 by Dragonite, edited Sep 10 at 21:54 by egreg
2 Answers

Accepted answer (score 2)
The key observation is that $\sigma(v) - v \perp W$.
Indeed, $\langle \sigma(v), w_i \rangle = \langle v, w_i \rangle$ for each basis element $w_i$ of $W$.
Now, $w - v = (w - \sigma(v)) + (\sigma(v) - v)$, and these two summands are orthogonal to each other by the above, as $w - \sigma(v) \in W$, so we have
$$\|w - v\|^2 = \|w - \sigma(v)\|^2 + \|\sigma(v) - v\|^2.$$
Since $w \neq \sigma(v)$, the term $\|w - \sigma(v)\|^2$ is strictly positive, and taking square roots gives $\|\sigma(v) - v\| < \|w - v\|$.
On why $\sigma(v) - v \perp W$: this means that $\langle \sigma(v) - v, w_i \rangle = 0$ for all $1 \leq i \leq n$. For each fixed $i$, note that
\begin{align*}
\langle \sigma(v) - v, w_i \rangle &= \langle \sigma(v), w_i \rangle - \langle v, w_i \rangle \\
&= \Bigl\langle \sum_{j=1}^{n} \langle v, w_j \rangle w_j , w_i \Bigr\rangle - \langle v, w_i \rangle \\
&= \langle v, w_i \rangle \langle w_i, w_i \rangle - \langle v, w_i \rangle && \text{(orthonormality: only the $j = i$ term survives)} \\
&= \langle v, w_i \rangle - \langle v, w_i \rangle \\
&= 0.
\end{align*}
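A small numeric check of this computation (an illustrative addition, with $\mathbb{R}^4$ and a particular orthonormal pair as arbitrary choices): $\sigma(v) - v$ has zero inner product with every basis vector of $W$.

```python
import numpy as np

# Arbitrary setup: W is spanned by the first two standard basis vectors of R^4.
w1 = np.array([1.0, 0.0, 0.0, 0.0])
w2 = np.array([0.0, 1.0, 0.0, 0.0])
v  = np.array([3.0, -1.0, 2.0, 5.0])

# sigma(v) = <v, w1> w1 + <v, w2> w2
sigma_v = np.dot(v, w1) * w1 + np.dot(v, w2) * w2

for w_i in (w1, w2):
    # each term <sigma(v), w_i> - <v, w_i> cancels, as in the derivation above
    assert np.isclose(np.dot(sigma_v - v, w_i), 0.0)
```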
answered Sep 10 at 21:12 by Berci, edited Sep 11 at 20:28 by Dragonite

Comment: Why is $\sigma(v) - v \perp W$? What in the question gives that away? I must be missing a foundational concept, given that I do see how the rest of the argument is formed. Thank you! – Dragonite, Sep 11 at 12:45
Answer (score 1)
You can write any vector of $V$ as the sum of a vector that lies in $W$ and one that is orthogonal to $W$. Geometrically, you're trying to show that the orthogonal projection of $v$ onto $W$ is the vector in $W$ closest to $v$. Think of the vector $v$ as one leg of a triangle and start connecting $v$ to vectors $w$ in $W$. You should be able to at least convince yourself that the shortest distance from $w$ to $v$ occurs when you have a right triangle. The argument in the previous answer proves it.
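A small numeric sketch of that right-triangle picture (an illustrative addition, with $\mathbb{R}^3$ and the $xy$-plane as arbitrary choices for $V$ and $W$):

```python
import numpy as np

# v splits as sigma(v) (in W) plus v - sigma(v) (orthogonal to W), and
# ||w - v|| is the hypotenuse of a right triangle with legs
# ||w - sigma(v)|| and ||sigma(v) - v||.
v = np.array([1.0, 2.0, 3.0])
sigma_v = np.array([1.0, 2.0, 0.0])       # projection of v onto the xy-plane

assert np.allclose(v, sigma_v + (v - sigma_v))        # the decomposition
assert np.isclose(np.dot(v - sigma_v, sigma_v), 0.0)  # the parts are orthogonal

w = np.array([4.0, -1.0, 0.0])            # any other point of W
hyp  = np.linalg.norm(w - v)
legs = np.hypot(np.linalg.norm(w - sigma_v), np.linalg.norm(sigma_v - v))
assert np.isclose(hyp, legs)   # right triangle: hypotenuse exceeds each leg
```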
answered Sep 10 at 22:28 by Joel Pereira