Prove or disprove that this set is a base
Let $B = \{e_1, e_2, \dots, e_n\}$ be the standard basis of $\mathbb{R}^n$. If $x_1, x_2, \dots, x_n$ are vectors in $\mathbb{R}^n$ such that $e_i \in L(x_1, x_2, \dots, x_n)$ for $i = 1, \dots, n$, is the set $\{x_1, x_2, \dots, x_n\}$ a basis of $\mathbb{R}^n$?
My answer is yes. We know that if $\alpha_1 e_1 + \alpha_2 e_2 + \cdots + \alpha_n e_n = 0_v$, then $\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0$, because $e_1, e_2, \dots, e_n$ are linearly independent. Now, since $e_i \in L(x_1, x_2, \dots, x_n)$ for $i = 1, \dots, n$, we can write $\alpha'_1 x_1 + \alpha'_2 x_2 + \cdots + \alpha'_n x_n = e_1$, $\beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_n x_n = e_2$, ..., $\gamma_1 x_1 + \gamma_2 x_2 + \cdots + \gamma_n x_n = e_n$. Then $x_1(\alpha_1 \alpha'_1 + \alpha_2 \beta_1 + \cdots + \alpha_n \gamma_1) + x_2(\alpha_1 \alpha'_2 + \alpha_2 \beta_2 + \cdots + \alpha_n \gamma_2) + \cdots + x_n(\alpha_1 \alpha'_n + \alpha_2 \beta_n + \cdots + \alpha_n \gamma_n) = 0_v$, and from here $0 x_1 + 0 x_2 + \cdots + 0 x_n = 0_v$, so $x_1, x_2, \dots, x_n$ is a basis. Is this ok?
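As a numerical sanity check of the claim (not part of the original question; the matrix below is a made-up example and NumPy is assumed), note that "every $e_i$ lies in $L(x_1, \dots, x_n)$" means the system $Xc = e_i$ is solvable for each $i$, where the columns of $X$ are the $x_i$; solving all $n$ systems at once amounts to inverting $X$:

```python
# Sanity-check sketch: if every standard basis vector e_i is a linear
# combination of x_1, ..., x_n, the x_i span R^n and hence form a basis.
import numpy as np

n = 3
X = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])  # columns are x_1, x_2, x_3 (example data)

# Each e_i is in span(x_1, ..., x_n) iff X c = e_i has a solution;
# solving X C = I for all e_i at once succeeds iff X is invertible.
C = np.linalg.solve(X, np.eye(n))      # raises LinAlgError if not a basis
assert np.allclose(X @ C, np.eye(n))   # X expresses every e_i
assert np.linalg.matrix_rank(X) == n   # so x_1, ..., x_n form a basis
```

This is only a finite check for one example, of course, not a replacement for the proof.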
linear-algebra
asked Aug 23 at 9:08 by Marko Škorić
3 Answers
Accepted answer, answered Aug 23 at 9:19 by StackTD:
It's a bit hard to follow what you're doing.
The set is a basis for $\mathbb{R}^n$ if it spans $\mathbb{R}^n$ and if it is linearly independent.
But because we know that the dimension of $\mathbb{R}^n$ is $n$ and the set $\{x_1, x_2, \dots, x_n\}$ contains exactly $n$ elements, it suffices to show either of these properties, since:
- $n$ linearly independent vectors necessarily span $\mathbb{R}^n$ or, the other way around,
- a spanning set of $n$ elements for $\mathbb{R}^n$ is necessarily linearly independent.
Now, an arbitrary $x \in \mathbb{R}^n$ can be written as $\alpha_1 e_1 + \cdots + \alpha_n e_n$, but all of the $e_i$'s can be written as linear combinations of the $x_i$'s, so substitute and rearrange a bit; it's only a matter of writing it down carefully now.
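The dimension argument above can be sketched numerically (this is an illustration added for this writeup, with made-up example vectors, assuming NumPy): for exactly $n$ vectors in $\mathbb{R}^n$, "spans" and "linearly independent" are both equivalent to the column matrix having full rank.

```python
# Sketch: for n vectors in R^n, spanning and linear independence coincide,
# and both are captured by the rank of the matrix whose columns they are.
import numpy as np

def is_basis(vectors):
    """True iff the given vectors are n vectors forming a basis of R^n."""
    X = np.column_stack(vectors)
    n = X.shape[0]
    return bool(X.shape[1] == n and np.linalg.matrix_rank(X) == n)

# Three independent vectors in R^3: they span, so they are a basis.
print(is_basis([np.array([1.0, 0, 1]), np.array([1.0, 1, 0]), np.array([0.0, 1, 1])]))  # True
# A dependent set of three vectors cannot span R^3.
print(is_basis([np.array([1.0, 0, 1]), np.array([2.0, 0, 2]), np.array([0.0, 1, 1])]))  # False
```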
Just a small warning. This assumes the OP is already familiar with the dimension theorem for vector spaces, which is probably a safe bet, but for future readers, a small warning might help. en.wikipedia.org/wiki/Dimension_theorem_for_vector_spaces
– 5xum, Aug 23 at 9:32
I know and I was wondering whether to mention that explicitly; but now this comment does the job - thanks!
– StackTD, Aug 23 at 9:32
Can you write how you would prove it?
– Marko Škorić, Aug 23 at 10:03
@MarkoŠkorić Do you understand the last paragraph; have you tried writing it out?
– StackTD, Aug 23 at 10:48
Answered Aug 23 at 9:13 by 5xum:
Your proof is a little confusing. What you need to prove are two things:
1: You need to prove that every $x \in \mathbb{R}^n$ can be written as a linear combination of $x_1, \dots, x_n$ (you sort of almost did this part, but you should make it clear).
2: You need to prove that if $\alpha_1 x_1 + \cdots + \alpha_n x_n = 0$, then $\alpha_i = 0$ for all $i$ (you didn't do this part yet).
I suggest you rewrite your proof to make it clear which of the two facts is being proven at which point.
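Condition 2 also has a direct numerical counterpart (an illustration added here, with a made-up example matrix, assuming NumPy): the only solution of $\alpha_1 x_1 + \cdots + \alpha_n x_n = 0$ being $\alpha = 0$ is exactly the statement that the matrix with columns $x_i$ has a trivial null space.

```python
# Sketch of condition 2: trivial null space <=> only a = 0 solves X a = 0
# <=> the columns x_1, ..., x_n are linearly independent.
import numpy as np

X = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])  # columns are x_1, x_2, x_3 (example data)

# By rank-nullity, dim(null space) = (number of columns) - rank.
null_dim = X.shape[1] - np.linalg.matrix_rank(X)
print(null_dim)  # 0: the columns are linearly independent
```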
Answered Aug 23 at 9:25 by Marko Škorić:
I wrote that every $e_i$, $i = 1, \dots, n$, can be written as a linear combination; after that, I want to see whether $x_1, x_2, \dots, x_n$ are linearly independent. Maybe it's confusing why I write $0x_1 + 0x_2 + \cdots + 0x_n = 0$: because I first showed $\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0$, multiplying by $\alpha_1, \alpha_2, \dots, \alpha_n$ makes every bracket $0$, so they are linearly independent.