Big O for Functions Approaching 0
$$f(x) = \text{the Taylor series approximation for } \sin(x)$$
$$f_2(x) = x$$
where $f_2(x)$ is an approximation for $f(x)$, since $x$ is the first term of the series for $f(x)$. Then:
$$g(x) = f(x) - f_2(x)$$
where the leading term of $g(x)$ is $-x^3/6$.
What is the big $O$ of $g(x)$ as $x$ approaches $0$?
The professor of the lecture I'm watching claims it is $O(x^3)$, as $x^3$ is the dominant term (the largest term near $0$, i.e. the slowest to approach $0$), which makes sense to me.
The second question: for $h(x) = 2x^2 + 27x + 1000$, what is the big $O$ as $x$ approaches $0$?
The professor claims it is $O(1)$ as the $1000$ is a constant.
I don't see why $g(x)$ can't be $O(1)$ for some $\delta$ with $0 < |x - 0| < \delta$, and I don't see why $h(x)$ can't be $O(x^2)$ by the same logic that was applied to conclude $g(x)$ is $O(x^3)$.
Any help will be much appreciated, thanks!
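Not part of the original question, but a quick numerical sketch (plain Python; the function names `g` and `h` just mirror the post's notation) makes the distinction concrete: the ratio $g(x)/x^3$ stays bounded as $x \to 0$, while $h(x)/x^2$ blows up, which is exactly why $h$ is $O(1)$ rather than $O(x^2)$ there.

```python
import math

def g(x):
    # g(x) = sin(x) - x, the remainder after the first Taylor term
    return math.sin(x) - x

def h(x):
    return 2 * x**2 + 27 * x + 1000

for x in [0.1, 0.01, 0.001]:
    # g(x)/x^3 approaches -1/6, so |g(x)| <= C*|x|^3 near 0: g is O(x^3)
    print(f"g({x})/x^3 = {g(x) / x**3:.6f}")
    # h(x)/x^2 grows without bound as x -> 0, so h is NOT O(x^2) there;
    # h(x) itself stays near 1000, so h is O(1)
    print(f"h({x})/x^2 = {h(x) / x**2:.1f}")
```

The ratios $g(x)/x^3$ settle near $-1/6$ while $h(x)/x^2$ keeps growing, matching the professor's two claims.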
Tags: approximation
Well, the logic is the same. The term that approaches $0$ slowest is $1000$, isn't it?
– saulspatz, Aug 17 at 4:52
A Taylor series is not an approximation; a Taylor polynomial is. We would just say $f(x)=\sin(x)$.
– edm, Aug 17 at 5:01
@saulspatz So you're saying that if $g(x)$ ended with a $10^{-10000000}$ term, it would be $O(1)$ too?
– StopReadingThisUsername, Aug 17 at 6:58
Yes, that's right. Constants don't matter.
– saulspatz, Aug 17 at 6:59
@saulspatz So constants *do* matter... right? In its current form $g(x)$ does *not* end with a constant.
– StopReadingThisUsername, Aug 17 at 7:01
asked Aug 17 at 4:35 by StopReadingThisUsername; edited Aug 17 at 6:40 by Cornman
2 Answers
The thing is, $x^3$ goes to $0$ slower than $x^4$, $x^5$, and so on, so you keep the lowest-order term.
– Trebor, answered Aug 17 at 5:01
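A worked instance of this point (my addition, using the standard Lagrange-remainder bound for $\sin$, which is not stated in the original answer):

```latex
% sin x minus its first Taylor term:
\sin x - x = -\frac{x^3}{6} + \frac{x^5}{120} - \cdots
% The Lagrange remainder after the term x is -\frac{\cos\xi}{6}x^3
% for some \xi between 0 and x, and |\cos\xi| \le 1, so for all real x:
\left|\sin x - x\right| \le \frac{|x|^3}{6}
```

Hence $\sin x - x = O(x^3)$ as $x \to 0$: the lowest-order term $-x^3/6$ dominates, and the higher powers only shrink faster.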
I like to think of big-$\mathcal{O}$ via the limit definition:
$$f=\mathcal{O}(g) \text{ at } a \iff \limsup_{x\to a} \left|\frac{f(x)}{g(x)}\right|<\infty.$$
Or, in words: near the point $a$, the function $g$ grows at least as fast as $f$.
So near $0$ we have $x^n=\mathcal{O}(x^{n-h})$ for all $n$ and all positive $h$ (and multiplying by a constant doesn't matter), because
$$\limsup_{x\to 0}\left|\frac{x^n}{x^{n-h}}\right|=\lim_{x\to 0}\left|\frac{x^n}{x^{n-h}}\right|=\lim_{x\to 0}|x^h|=0<\infty.$$
– Holo, answered Aug 17 at 4:54, edited Aug 17 at 5:00
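A small numerical sanity check of that limit (my addition; the exponents `n` and `h` are arbitrary example values, not from the answer): the ratio $x^n/x^{n-h} = x^h$ shrinks to $0$ as $x \to 0$, so the $\limsup$ is certainly finite.

```python
n, h = 3, 1  # example exponents; any n and positive h behave the same way

for x in [0.1, 0.01, 0.001]:
    # x^n / x^(n-h) simplifies to x^h, which here is just x
    ratio = abs(x**n / x**(n - h))
    print(f"x={x}: |x^n / x^(n-h)| = {ratio}")
# the ratios march down toward 0, consistent with x^n = O(x^(n-h)) near 0
```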