Big O for Functions Approaching 0
























$$f(x) = \text{the Taylor series approximation for } \sin(x)$$



$$f_2 (x) = x$$



where $f_2(x)$ is an approximation for $f(x)$, since $x$ is the first term of the series for $f(x)$. Then:
$$g(x) = f(x) - f_2(x)$$
where the first term of $g(x)$ is $-x^3/6$.
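For concreteness (my own expansion, using the standard Maclaurin series for $\sin$), the quantities above are:

```latex
f(x) = \sin(x) = x - \frac{x^3}{6} + \frac{x^5}{120} - \cdots
\qquad
g(x) = f(x) - f_2(x) = -\frac{x^3}{6} + \frac{x^5}{120} - \cdots
```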




What is the big $O$ of $g(x)$ as $x$ approaches $0$?




The professor in the lecture I'm watching claims it is $O(x^3)$, since $x^3$ is the dominant term (the largest in magnitude near $0$, i.e. the slowest to approach $0$), which makes sense to me.




The second question: for $h(x) = 2x^2 + 27x + 1000$, what is the big $O$ as $x$ approaches $0$?




The professor claims it is $O(1)$, since the $1000$ is a constant.



I don't see why $g(x)$ can't be $O(1)$ for some $\delta$ with $0 < |x - 0| < \delta$, and I don't see why $h(x)$ can't be $O(x^2)$ by the same logic that made $g(x)$ be $O(x^3)$.



Any help will be much appreciated, thanks!







  • Well, the logic is the same. The term that approaches $0$ slowest is $1000$, isn't it?
    – saulspatz
    Aug 17 at 4:52










  • A Taylor series is not an approximation; a Taylor polynomial is. We would just say $f(x)=\sin(x)$.
    – edm
    Aug 17 at 5:01










  • @saulspatz So you're saying that if $g(x)$ ended with a $10^{-10000000}$ term, it would be $O(1)$, too?
    – StopReadingThisUsername
    Aug 17 at 6:58











  • Yes, that's right. Constants don't matter.
    – saulspatz
    Aug 17 at 6:59










  • @saulspatz So constants *do* matter...right? In its current form $g(x)$ does *not* end with a constant.
    – StopReadingThisUsername
    Aug 17 at 7:01















asked Aug 17 at 4:35 by StopReadingThisUsername · edited Aug 17 at 6:40 by Cornman











2 Answers

















The thing is, $x^3$ goes to $0$ slower than $x^4$ and so on. So you keep the lowest order.
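In symbols (my own restatement of this point): once $0<|x|<1$, higher powers are smaller in magnitude, so the lowest-order term controls the bound:

```latex
0 < |x| < 1 \implies |x|^5 < |x|^3,
\quad\text{hence}\quad
\left|-\frac{x^3}{6} + \frac{x^5}{120} - \cdots\right| \le C\,|x|^3
\ \text{ for some constant } C.
```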






– Trebor, answered Aug 17 at 5:01
I like to think of big-$\mathcal{O}$ via its limit definition:

$f=\mathcal{O}(g)$ at $a \iff \limsup_{x\to a} \left|\frac{f(x)}{g(x)}\right|<\infty$.

Or, in words: "near" the point $a$, the function $g$ grows at least as fast as $f$.

So near $0$ we have $x^n=\mathcal{O}(x^{n-h})$ for all $n$ and positive $h$ (and multiplying by a constant doesn't matter), because $$\limsup_{x\to 0} \left|\frac{x^n}{x^{n-h}}\right|=\lim_{x\to 0} \left|\frac{x^n}{x^{n-h}}\right|=\lim_{x\to 0}|x^h|=0<\infty$$
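As a quick numerical sanity check of this (my own sketch, not part of the answer): the ratio $|g(x)/x^3|$ should stay bounded (approaching $1/6$) as $x \to 0$, while $h(x)$ approaches the constant $1000$:

```python
import math

def g(x):
    # remainder after subtracting the first Taylor term of sin(x)
    return math.sin(x) - x

def h(x):
    return 2 * x**2 + 27 * x + 1000

for x in [0.1, 0.01, 0.001]:
    # |g(x)/x^3| tends to 1/6, so g(x) = O(x^3) as x -> 0;
    # h(x) tends to 1000, so h(x) = O(1) as x -> 0.
    print(f"x={x}: |g/x^3|={abs(g(x) / x**3):.6f}, h(x)={h(x):.3f}")
```

The bounded ratio is exactly what the limit definition above asks for, with $g$ compared against $x^3$ and $h$ compared against $1$.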






– Holo, answered Aug 17 at 4:54 (edited Aug 17 at 5:00)