About $\lim\left(1+\frac xn\right)^n$

I was wondering if it is possible to get a link to a rigorous proof that
$$\lim_{n\to\infty}\left(1+\frac xn\right)^n=\exp x$$





























  • (6) Well often this is taken as the definition of $\exp(x)$, so I suppose it depends on your definition.
    – Three, Apr 11 '13 at 22:43

  • (3) @LordSoth Consider $x\mapsto 0$.
    – Git Gud, Apr 11 '13 at 22:46

  • (1) @LordSoth, actually that's false. $\exp(x)$ was originally discovered by a Bernoulli as the limit of compound interest -- in fact, exactly as the OP has written it. Only later was the calculus studied: en.wikipedia.org/wiki/Exponential_function
    – Three, Apr 11 '13 at 22:56

  • (1) @Three I suggest you read www-history.mcs.st-and.ac.uk/HistTopics/e.html
    – Lord Soth, Apr 11 '13 at 22:59

  • (3) How do you define $\exp$? This is really a matter of definition. What tools do you have available? Can you use continuity of $\exp$? Can you use $\log$? &c. Whenever you ask this kind of question, you must always state what your definitions and available tools are. Else we're just guessing what you want.
    – Pedro Tamaroff♦, Apr 11 '13 at 23:56









limits exponential-function faq














asked Apr 11 '13 at 22:40 by Mai09el
edited Nov 11 '17 at 14:04 by Jack







9 Answers























19 votes













From the very definition (one of many, I know):

$$e:=\lim_{n\to\infty}\left(1+\frac1n\right)^n$$

we can try the following, depending on what you have read so far in this subject:

(1) Deduce that

$$e=\lim_{n\to\infty}\left(1+\frac1{f(n)}\right)^{f(n)}\;,\;\;\text{as long as}\;\;f(n)\xrightarrow[n\to\infty]{}\infty$$

and then from here ($x\neq0$, but this is only a light technicality)

$$\left(1+\frac xn\right)^n=\left[\,\left(1+\frac1{n/x}\right)^{n/x}\,\right]^x\xrightarrow[n\to\infty]{}e^x$$

(2) For $x>0$, substitute $mx=n$. Note that $n\to\infty\implies m\to\infty$, and

$$\left(1+\frac xn\right)^n=\left(\left(1+\frac1m\right)^m\right)^x\xrightarrow[n\to\infty\iff m\to\infty]{}e^x$$

I'll leave it to you to work out the case $x<0$ (hint: arithmetic of limits and "going" to denominators).
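As a quick numerical sanity check of the substitution above (my addition, not part of the answer), one can watch $(1+1/f(n))^{f(n)}$ with $f(n)=n/x$ approach $e$, and its $x$-th power approach $e^x$:

```python
import math

# Illustration only: with f(n) = n/x, (1 + 1/f(n))^f(n) -> e,
# so raising to the x-th power recovers e^x.
x = 2.0
for n in (10, 100, 10_000):
    f = n / x
    base = (1 + 1 / f) ** f
    print(n, base, base ** x, math.exp(x))
```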






answered Apr 11 '13 at 23:23 by DonAntonio



























    10 votes













    I would like to cite here an awesome German mathematician, Konrad Königsberger. He writes in his book "Analysis I" as follows:


    Fundamental Lemma. For every sequence of complex numbers $w_n$ with limit $w$ it is true that $$\lim_{n\to\infty}\Bigl(1+\frac{w_n}{n}\Bigr)^n=\sum_{k=0}^\infty\frac{w^k}{k!}.$$ Proof. For every $\varepsilon>0$ and sufficiently large index $K$ we have the following estimates: $$\sum_{k=K}^\infty\frac{(|w|+1)^k}{k!}<\frac\varepsilon3\quad\mbox{and}\quad |w_n|\le|w|+1.$$ Therefore if $n\ge K$ then $$\left|\Bigl(1+\frac{w_n}{n}\Bigr)^n-\exp w\right|\le\sum_{k=0}^{K-1}\left|\binom nk\frac{w_n^k}{n^k}-\frac{w^k}{k!}\right|+\sum_{k=K}^{n}\binom nk\frac{|w_n|^k}{n^k}+\sum_{k=K}^\infty\frac{|w|^k}{k!}.$$ The third sum is smaller than $\varepsilon/3$ based on our assumptions. We can find an upper bound for the middle one using $$\binom nk\frac1{n^k}=\frac1{k!}\prod_{i=1}^{k-1}\Bigl(1-\frac in\Bigr)\le\frac1{k!}.$$ Combining this with $|w_n|\le|w|+1$, $$\sum_{k=K}^{n}\binom nk\frac{|w_n|^k}{n^k}<\sum_{k=K}^{n}\frac{(|w|+1)^k}{k!}<\frac\varepsilon3.$$ Finally, the first sum converges to $0$ due to $w_n\to w$ and $\binom nk n^{-k}\to\frac1{k!}$. We can choose $N>K$ such that it is smaller than $\varepsilon/3$ as soon as $n>N$.


    Really brilliant.
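A small numerical illustration of the lemma (my addition, not Königsberger's): take an example complex sequence $w_n\to w$ and watch $(1+w_n/n)^n$ approach $\exp w$.

```python
import cmath

# Illustration only: for a complex sequence w_n -> w,
# (1 + w_n/n)^n -> exp(w), as the Fundamental Lemma states.
w = 1 + 2j
n = 10_000
w_n = w + 1 / n                 # an example sequence converging to w
approx = (1 + w_n / n) ** n
print(abs(approx - cmath.exp(w)))
```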






    answered Aug 20 '16 at 20:07 by Santiago
















    • (2) I examined the proof technique for $w_n=w$ (no sequence) first, then went full bore. Agree - brilliant! I can use it to show the exp power series takes addition to multiplication. Did not have to get in the weeds with rearrangements, absolute convergence/commutativity, etc.
      – CopyPasteIt, Jul 9 '17 at 23:10

    • +1. This appears to be the only complete and rigorous answer (so far) that considers all complex $w$. And provides a reference (as the proposer requested). And actually has more than what was asked. A masterful exposition by K.K.
      – DanielWainfleet, Aug 30 at 15:49

















    9 votes













    Firstly, let us give a definition of the exponential function, so that we know it has various properties:

    $$\exp(x):=\sum_{n=0}^\infty\frac{x^n}{n!}$$

    so that we can prove (as $\exp$ is a power series):

    • The exponential function has radius of convergence $\infty$, and is thus defined on all of $\mathbb R$.

    • As a power series is infinitely differentiable inside its circle of convergence, the exponential function is infinitely differentiable on all of $\mathbb R$.

    • We can then prove that the function is strictly increasing, and thus by the inverse function theorem (http://en.wikipedia.org/wiki/Inverse_function_theorem) we can define what we know as the "log" function.

    Knowing all of this, here is hopefully a sufficiently rigorous proof (at least for positive $a$):

    As $\log(x)$ is continuous and differentiable on $(0,\infty)$, we have that $f(x)=\log(1+x)$ is continuous and differentiable on $[0,\frac an]$, so by the mean value theorem we know there exists a $c\in[0,\frac an]$ with

    $$f'(c)=\frac{\log(1+\frac an)-\log(1)}{\frac an-0}$$
    $$\implies\log\left[\left(1+\frac an\right)^n\right]=\frac{a}{1+c}$$
    $$\implies\left(1+\frac an\right)^n=\exp\left(\frac{a}{1+c}\right)$$

    for some $c\in[0,\frac an]$ (here $f'(c)=\frac1{1+c}$, and we multiplied through by $n$). As we then want to take the limit as $n\to\infty$, we get that:

    • As $c\in[0,\frac an]$ and $\frac an\to 0$ as $n\to\infty$, by the squeeze theorem we get that $c\to 0$ as $n\to\infty$.

    • As $c\to 0$ as $n\to\infty$, $\frac{a}{1+c}\to a$ as $n\to\infty$.

    • As the exponential function is continuous on $\mathbb R$, the limit can pass inside the function, so since $\frac{a}{1+c}\to a$ as $n\to\infty$,

    $$\exp\left(\frac{a}{1+c}\right)\to\exp(a)$$
    as $n\to\infty$. Thus we can conclude that

    $$\lim_{n\to\infty}\left(1+\frac an\right)^n=e^a$$

    (Of course, this is ignoring that one needs to prove that $\exp(a)=e^a$, but this is hardly vital for this question.)
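The mean-value-theorem step above can be watched numerically (my addition, illustrative only): solving $n\log(1+\frac an)=\frac a{1+c}$ for $c$ shows $c\in[0,\frac an]$ shrinking to $0$.

```python
import math

# Illustration only: the MVT point c satisfies n*log(1 + a/n) = a/(1+c);
# solve for c and watch it tend to 0, while (1+a/n)^n tends to e^a.
a = 2.0
for n in (10, 100, 10_000):
    c = a / (n * math.log1p(a / n)) - 1
    print(n, c, (1 + a / n) ** n)
```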




























    • If we're just about to define the exponential function (or at least show that it equals something), it seems to me the assumption of its continuity is highly suspicious...
      – DonAntonio, Apr 11 '13 at 23:36

    • This is true - although I can't see how this proof is anything more than showing that the various definitions of the exponential function are equivalent, and thus I would presume continuity would have been proved before trying to prove statements such as this one (for example, in our lectures we defined it in terms of a power series, which means we can prove it is continuous fairly straightforwardly).
      – Andrew D, Apr 11 '13 at 23:39

    • I agree with that, @Andrew D, but then perhaps mentioning some other definition from which continuity follows and then using that... perhaps too long a detour for a beginner, but absolutely possible indeed.
      – DonAntonio, Apr 11 '13 at 23:42

    • @DonAntonio The log's continuity assumption is just fine, though. Since $\exp$ is its inverse, it is continuous.
      – Pedro Tamaroff♦, Apr 11 '13 at 23:50

    • Yeah, thankfully that is covered by the inverse function theorem (which I've now linked/discussed above, along with some other things).
      – Andrew D, Apr 11 '13 at 23:51

















    3 votes













    Another answer, assuming $x>0$:

    Let $f(x)=\ln(x)$. Then we know that $f'(x)=1/x$. Also, by the definition of the derivative, we can write
    $$
    \begin{align}
    f'(x)&=\lim_{h\to 0}\frac{f(x+h)-f(x)}h\\
    &=\lim_{h\to 0}\frac{\ln(x+h)-\ln(x)}h\\
    &=\lim_{h\to 0}\frac1h\ln\frac{x+h}x\\
    &=\lim_{h\to 0}\ln\left(\frac{x+h}x\right)^{\frac1h}\\
    &=\lim_{h\to 0}\ln\left(1+\frac hx\right)^{\frac1h}
    \end{align}
    $$
    Then, using the fact that $\ln(x)$ is a continuous function for all $x$ in its domain, we can exchange the $\lim$ and $\ln$:
    $$
    f'(x)=\ln\lim_{h\to 0}\left(1+\frac hx\right)^{\frac1h}
    $$
    Now, let $m=1/h$. Then $m\to\infty$ as $h\to 0^+$, and
    $$
    f'(x)=\ln\lim_{m\to\infty}\left(1+\frac1{mx}\right)^m
    $$
    Now, assuming $x>0$, define $n=mx^2$, and so $n\to\infty$ as $m\to\infty$. Then we can write
    $$
    f'(x)=\ln\lim_{n\to\infty}\left[\left(1+\frac xn\right)^n\right]^{1/x^2}
    $$
    and from before, we still have $f'(x)=1/x$, so
    $$
    \ln\lim_{n\to\infty}\left[\left(1+\frac xn\right)^n\right]^{1/x^2}=\frac1x
    $$
    Exponentiating both sides, we find
    $$
    \lim_{n\to\infty}\left[\left(1+\frac xn\right)^n\right]^{1/x^2}=e^{1/x}
    $$
    Finally, raising both sides to the power $x^2$, we find
    $$
    \lim_{n\to\infty}\left(1+\frac xn\right)^n=e^x
    $$
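The intermediate identity $\left[\left(1+\frac xn\right)^n\right]^{1/x^2}\to e^{1/x}$ can be checked numerically (my addition, illustrative only):

```python
import math

# Illustration only: [(1 + x/n)^n]^(1/x^2) -> e^(1/x) for fixed x > 0.
x = 2.0
n = 1_000_000
lhs = ((1 + x / n) ** n) ** (1 / x**2)
print(lhs, math.exp(1 / x))
```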

































      3 votes














      Aaah... The sweet sound of silent revenge downvotes... Always a pleasure!




      Consider the functions $u$ and $v$ defined for every $|t|<\frac12$ by
      $$
      u(t)=t-\log(1+t),\qquad v(t)=t-t^2-\log(1+t).
      $$
      The derivative of $u$ is $u'(t)=\frac t{1+t}$, which has the sign of $t$, hence $u(t)\geqslant0$. The derivative of $v$ is $v'(t)=1-2t-\frac1{1+t}$, which has the sign of $(1+t)(1-2t)-1=-t(1+2t)$, which has the sign of $-t$ on the domain $|t|<\frac12$, hence $v(t)\leqslant0$.
      Thus:


      For every $|t|<\frac12$,
      $$
      t-t^2\leqslant\log(1+t)\leqslant t.
      $$


      The function $z\mapsto\exp(nz)$ is nondecreasing on the same domain hence
      $$
      \exp\left(nt-nt^2\right)\leqslant(1+t)^n\leqslant\exp\left(nt\right).
      $$
      In particular, using this for $t=x/n$, one gets:


      For every $|x|<\frac12n$,
      $$
      \exp\left(x-\frac{x^2}n\right)\leqslant\left(1+\frac xn\right)^n\leqslant\mathrm e^x.
      $$


      Finally, $x^2/n\to 0$ when $n\to\infty$ and the exponential is continuous at $0$, hence we are done.

      Facts/Definitions used:

      • The logarithm has derivative $t\mapsto1/t$.

      • The exponential is the inverse of the logarithm.



























      • We need to evangelize the use of $\leqslant$ and $\geqslant$ in MSE.
        – Pedro Tamaroff♦, Aug 10 '13 at 4:11

      • I used this in an application to lower bound $(1+x/n)^n$, thank you.
        – JP McCarthy, Aug 16 '16 at 11:55

      • Didier, I like this approach. So (+1). I was wondering if you've seen a way to establish the same lower bound for $\left(1+\frac xn\right)^n$ by using the limit definition of the exponential function and without appealing to calculus. The upper bound is trivial for $x>-n$.
        – Mark Viola, Jan 5 '17 at 19:39

      • @Dr.MV This reduces to showing $1+t\geqslant\exp(t-t^2)$, that is, $\frac1{1+t}\leqslant\exp(-t+t^2)$. What you call the trivial upper bound yields $\frac1{1+t}=1-\frac t{1+t}\leqslant\exp\left(-\frac t{1+t}\right)$, hence if $\frac t{1+t}\geqslant t-t^2$, we are done. This is asking that $t\geqslant(t-t^2)(1+t)=t(1-t^2)$, hence indeed we are done for every $t$ in $(0,1)$. This fails for $t$ negative but similar arguments might work.
        – Did, Jan 8 '17 at 9:17


















      1 vote













      $(1+x/n)^n=\sum_{k=0}^n\binom nk\frac{x^k}{n^k}$

      Now just prove that $\binom nk\frac{x^k}{n^k}$ approaches $\frac{x^k}{k!}$ as $n$ approaches infinity, and you will have proven that your limit matches the Taylor series for $\exp(x)$.
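The termwise convergence claimed here is quick to check numerically for a fixed $k$ (my addition, illustrative only; as the comments note, this alone does not finish the proof):

```python
import math

# Illustration only: C(n,k)/n^k -> 1/k! for fixed k as n grows.
k = 5
for n in (10, 100, 10_000):
    term = math.comb(n, k) / n**k
    print(n, term, 1 / math.factorial(k))
```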
























      • (5) This is not enough; there are infinitely many terms, so you need to show that you can swap two limits here.
        – Qiaochu Yuan, Apr 11 '13 at 23:17

      • (1) What you want to do is work with $\limsup$ and $\liminf$ here, and show $e^x\leq\liminf$ and $e^x\geq\limsup$.
        – Pedro Tamaroff♦, Apr 11 '13 at 23:53

      • How would you show that you can swap the two limits?
        – amarney, Mar 26 '17 at 22:54

















      1 vote













      For any fixed value of $x$, define

      $$f(u)=\frac{\ln(1+ux)}u$$

      By L'Hopital's Rule,

      $$\lim_{u\to0^+}f(u)=\lim_{u\to0^+}\frac{x/(1+ux)}1=x$$

      Now exponentiate $f$:

      $$e^{f(u)}=(1+ux)^{1/u}$$

      By continuity of the exponential function, we have

      $$\lim_{u\to0^+}(1+ux)^{1/u}=\lim_{u\to0^+}e^{f(u)}=e^{\lim_{u\to0^+}f(u)}=e^x$$

      All these limits have been shown to exist for the (positive) real variable $u$ tending to $0$, hence they must exist, and be the same, for the sequence of reciprocals of integers, $u=1/n$, as $n$ tends to infinity, and the result follows:

      $$\lim_{n\to\infty}\left(1+\frac xn\right)^n=e^x$$
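The continuous-variable limit used here can be watched numerically as $u\to0^+$ (my addition, illustrative only):

```python
import math

# Illustration only: (1 + u*x)^(1/u) -> e^x as u -> 0+.
x = 3.0
for u in (0.1, 0.01, 0.0001):
    print(u, (1 + u * x) ** (1 / u), math.exp(x))
```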

































        0 votes













        This is one of the ways in which it is defined. The equivalence of the definitions can be proved easily, I guess.
        If for example you take the exponential function to be the inverse of the logarithm:

        $\log(\lim_n(1+\frac xn)^n)=\lim_n n\log(1+\frac xn)=\lim_n n\cdot\left[\frac xn-\frac{x^2}{2n^2}+\dots\right]=x$

        EDIT: The logarithm is defined as usual: $\log x=\int_1^x\frac{dt}t$. The first identity follows from the continuity of the logarithm, the second is just an application of one of the properties of the logarithm ($\log a^b=b\log a$), while for the third it suffices to have the Taylor expansion of $\log(1+x)$.
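The middle step, $n\log(1+\frac xn)\to x$, can be watched numerically (my addition, illustrative only):

```python
import math

# Illustration only: n*log(1 + x/n) -> x, hence (1 + x/n)^n -> e^x
# once exp is applied to both sides.
x = 1.0
for n in (10, 1000, 100_000):
    print(n, n * math.log1p(x / n))
```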




























        • The very first equality requires, me believes, a justification that I cannot see as very easy unless we already assume quite a bit (say, continuity...). After that things get even tougher as we need power series and then also, apparently, differentiation.
          – DonAntonio, Apr 11 '13 at 23:40

        • (2) The logarithm is defined as $\int_1^x\frac{dt}t$, therefore, if we have integration we can also have continuity and differentiation, I suppose.
          – user67133, Apr 11 '13 at 23:45

        • Perhaps so, and also perhaps mentioning this could clear things up a little, since we don't know, apparently, what the OP's background is.
          – DonAntonio, Apr 11 '13 at 23:47

        • (1) I cannot but totally agree. Thank you for your suggestions, I am going to edit the post to make it clearer!
          – user67133, Apr 12 '13 at 0:07

















        0 votes













        There is at most one function $g$ on $\mathbb R$ such that
        $$g'(x)=g(x)\text{ for all }x\text{ in }\mathbb R\quad\text{and}\quad g(0)=1.$$
        If you let $f_n(x)=(1+x/n)^n$ and you can demonstrate that it converges compactly to some function $f$, you can demonstrate that $f'(x)=f(x)$ and $f(0)=1$. Likewise, if you take $f_n(x)=\sum_{k=0}^n x^k/k!$ and demonstrate this sequence converges compactly, you can show that its limit satisfies the same conditions. Thus it doesn't matter what your definition is. The uniqueness criterion is what you should probably have in mind when you think of "the exponential".
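Consistent with the uniqueness argument, both defining sequences head to the same value at any fixed point (my addition, illustrative only):

```python
import math

# Illustration only: the product form (1 + x/n)^n and the partial sums
# of sum x^k/k! both approach exp(x) at a fixed x.
x = 1.0
n = 1_000
product_form = (1 + x / n) ** n

series_form, term = 1.0, 1.0
for k in range(1, 50):          # partial sum of sum_{k} x^k/k!
    term *= x / k
    series_form += term

print(product_form, series_form, math.exp(x))
```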

























          protected by John Ma Nov 11 '17 at 4:03

















          9 Answers
          9






          active

          oldest

          votes








          9 Answers
          9






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes








          up vote
          19
          down vote













          From the very definition (one of many, I know):



          $$e:=lim_ntoinftyleft(1+frac1nright)^n$$



          we can try the following, depending on what you have read so far in this subject:



          (1) Deduce that



          $$e=lim_ntoinftyleft(1+frac1f(n)right)^f(n);,;;textas long as;;f(n)xrightarrow[ntoinfty]infty$$



          and then from here ($,xneq0,$ , but this is only a light technicality)



          $$left(1+fracxnright)^n=left[;left(1+frac1fracnxright)^fracnx;right]^xxrightarrow[ntoinfty]e^x$$



          2) For $,x>0,$ , substitute $,mx=n,$ . Note that $,ntoinftyimplies mtoinfty,$ , and



          $$left(1+fracxnright)^n=left(left(1+frac1mright)^mright)^xxrightarrow[ntoinftyiff mtoinfty]e^x$$



          I'll leave it to you to work out the case $,x<0,$ (hint: arithmetic of limits and "going" to denominators)






          share|cite|improve this answer
























            up vote
            19
            down vote













            From the very definition (one of many, I know):



            $$e:=lim_ntoinftyleft(1+frac1nright)^n$$



            we can try the following, depending on what you have read so far in this subject:



            (1) Deduce that



            $$e=lim_ntoinftyleft(1+frac1f(n)right)^f(n);,;;textas long as;;f(n)xrightarrow[ntoinfty]infty$$



            and then from here ($,xneq0,$ , but this is only a light technicality)



            $$left(1+fracxnright)^n=left[;left(1+frac1fracnxright)^fracnx;right]^xxrightarrow[ntoinfty]e^x$$



            2) For $,x>0,$ , substitute $,mx=n,$ . Note that $,ntoinftyimplies mtoinfty,$ , and



            $$left(1+fracxnright)^n=left(left(1+frac1mright)^mright)^xxrightarrow[ntoinftyiff mtoinfty]e^x$$



            I'll leave it to you to work out the case $,x<0,$ (hint: arithmetic of limits and "going" to denominators)






            share|cite|improve this answer






















              up vote
              19
              down vote










              up vote
              19
              down vote









              From the very definition (one of many, I know):



              $$e:=lim_ntoinftyleft(1+frac1nright)^n$$



              we can try the following, depending on what you have read so far in this subject:



              (1) Deduce that



              $$e=lim_ntoinftyleft(1+frac1f(n)right)^f(n);,;;textas long as;;f(n)xrightarrow[ntoinfty]infty$$



              and then from here ($,xneq0,$ , but this is only a light technicality)



              $$left(1+fracxnright)^n=left[;left(1+frac1fracnxright)^fracnx;right]^xxrightarrow[ntoinfty]e^x$$



              2) For $,x>0,$ , substitute $,mx=n,$ . Note that $,ntoinftyimplies mtoinfty,$ , and



              $$left(1+fracxnright)^n=left(left(1+frac1mright)^mright)^xxrightarrow[ntoinftyiff mtoinfty]e^x$$



              I'll leave it to you to work out the case $,x<0,$ (hint: arithmetic of limits and "going" to denominators)






              share|cite|improve this answer












              From the very definition (one of many, I know):



              $$e:=lim_ntoinftyleft(1+frac1nright)^n$$



              we can try the following, depending on what you have read so far in this subject:



              (1) Deduce that



              $$e=lim_ntoinftyleft(1+frac1f(n)right)^f(n);,;;textas long as;;f(n)xrightarrow[ntoinfty]infty$$



              and then from here ($,xneq0,$ , but this is only a light technicality)



              $$left(1+fracxnright)^n=left[;left(1+frac1fracnxright)^fracnx;right]^xxrightarrow[ntoinfty]e^x$$



              2) For $,x>0,$ , substitute $,mx=n,$ . Note that $,ntoinftyimplies mtoinfty,$ , and



              $$left(1+fracxnright)^n=left(left(1+frac1mright)^mright)^xxrightarrow[ntoinftyiff mtoinfty]e^x$$



              I'll leave it to you to work out the case $,x<0,$ (hint: arithmetic of limits and "going" to denominators)







              share|cite|improve this answer












              share|cite|improve this answer



              share|cite|improve this answer










              answered Apr 11 '13 at 23:23









              DonAntonio

              173k1486220




              173k1486220




















                  up vote
                  10
                  down vote













                  I would like to cite here an awesome German mathematician, Konrad Königsberger. He writes in his book ,,Analysis I'' as follows:




                  Fundamentallemma. For every sequence of complex numbers $w_n$ with a limit $w$ it is true that $$lim_n to infty Bigl(1 + fracw_nnBigr)^n = sum_k=0^infty fracw^kk!.$$ Proof. For every $varepsilon > 0$ and sufficiently large index $K$ we have the following estimations: $$sum_k=K^infty frac(k! < frac varepsilon 3 quadmboxandquad |w_n| le |w|+1.$$Therefore if $n ge K$ then $$left|Bigl(1 + fracw_nnBig)^n - exp w right| le sum_k=0^K-1 left|n choose kfracw_n^kn^k - fracw^kk!right| + sum_k=K^nnchoose k fracw_nn^k + sum_k=K^infty frac^kk!.$$ The third sum is smaller than $varepsilon / 3$ based on our assumptions. We can find an upper bound for the middle one using $$n choose k frac 1 n^k = frac1k! prod_i = 1^k-1 Bigl(1 - frac i n Bigr) le frac 1 k!.$$ Combining this with $|w_n| le |w| + 1$, $$sum_k=K^n n choose k fracw_nn^k < sum_k=K^n frac(k! < frac varepsilon 3$$ Finally, the first sum converges to $0$ due to $w_n to w$ and $n choose k n^-k to frac 1 k!$. We can choose $N > K$ such that it's smaller than $varepsilon / 3$ as soon as $n > N$.




                  Really brilliant.






                  share|cite|improve this answer
















                  • 2




                    I examined proof technique for $w_n = w$ (no sequence) then went full bore. Agree - brilliant! I can use it to show the exp power series takes addition to multiplication. Did not have to get in the weeds with rearrangements, absolute convergence/ commutativity, etc.
                    – CopyPasteIt
                    Jul 9 '17 at 23:10











                  • +1.... This appears to be the only complete and rigorous answer ( so far) that considers all complex $w$. And provides a reference (as the proposer requested). And actually has more than what was asked... A masterful exposition by K.K.
                    – DanielWainfleet
                    Aug 30 at 15:49














                  up vote
                  10
                  down vote













                  I would like to cite here an awesome German mathematician, Konrad Königsberger. He writes in his book ,,Analysis I'' as follows:




                  Fundamentallemma. For every sequence of complex numbers $w_n$ with a limit $w$ it is true that $$lim_n to infty Bigl(1 + fracw_nnBigr)^n = sum_k=0^infty fracw^kk!.$$ Proof. For every $varepsilon > 0$ and sufficiently large index $K$ we have the following estimations: $$sum_k=K^infty frac(k! < frac varepsilon 3 quadmboxandquad |w_n| le |w|+1.$$Therefore if $n ge K$ then $$left|Bigl(1 + fracw_nnBig)^n - exp w right| le sum_k=0^K-1 left|n choose kfracw_n^kn^k - fracw^kk!right| + sum_k=K^nnchoose k fracw_nn^k + sum_k=K^infty frac^kk!.$$ The third sum is smaller than $varepsilon / 3$ based on our assumptions. We can find an upper bound for the middle one using $$n choose k frac 1 n^k = frac1k! prod_i = 1^k-1 Bigl(1 - frac i n Bigr) le frac 1 k!.$$ Combining this with $|w_n| le |w| + 1$, $$sum_k=K^n n choose k fracw_nn^k < sum_k=K^n frac(k! < frac varepsilon 3$$ Finally, the first sum converges to $0$ due to $w_n to w$ and $n choose k n^-k to frac 1 k!$. We can choose $N > K$ such that it's smaller than $varepsilon / 3$ as soon as $n > N$.




                  Really brilliant.






                  share|cite|improve this answer
















                  • 2




                    I examined proof technique for $w_n = w$ (no sequence) then went full bore. Agree - brilliant! I can use it to show the exp power series takes addition to multiplication. Did not have to get in the weeds with rearrangements, absolute convergence/ commutativity, etc.
                    – CopyPasteIt
                    Jul 9 '17 at 23:10











                  • +1.... This appears to be the only complete and rigorous answer ( so far) that considers all complex $w$. And provides a reference (as the proposer requested). And actually has more than what was asked... A masterful exposition by K.K.
                    – DanielWainfleet
                    Aug 30 at 15:49












                  up vote
                  10
                  down vote










                  up vote
                  10
                  down vote









                  I would like to cite here an awesome German mathematician, Konrad Königsberger. He writes in his book ,,Analysis I'' as follows:




                  Fundamentallemma. For every sequence of complex numbers $w_n$ with a limit $w$ it is true that $$lim_n to infty Bigl(1 + fracw_nnBigr)^n = sum_k=0^infty fracw^kk!.$$ Proof. For every $varepsilon > 0$ and sufficiently large index $K$ we have the following estimations: $$sum_k=K^infty frac(k! < frac varepsilon 3 quadmboxandquad |w_n| le |w|+1.$$Therefore if $n ge K$ then $$left|Bigl(1 + fracw_nnBig)^n - exp w right| le sum_k=0^K-1 left|n choose kfracw_n^kn^k - fracw^kk!right| + sum_k=K^nnchoose k fracw_nn^k + sum_k=K^infty frac^kk!.$$ The third sum is smaller than $varepsilon / 3$ based on our assumptions. We can find an upper bound for the middle one using $$n choose k frac 1 n^k = frac1k! prod_i = 1^k-1 Bigl(1 - frac i n Bigr) le frac 1 k!.$$ Combining this with $|w_n| le |w| + 1$, $$sum_k=K^n n choose k fracw_nn^k < sum_k=K^n frac(k! < frac varepsilon 3$$ Finally, the first sum converges to $0$ due to $w_n to w$ and $n choose k n^-k to frac 1 k!$. We can choose $N > K$ such that it's smaller than $varepsilon / 3$ as soon as $n > N$.




                  Really brilliant.
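As a quick numerical illustration of the lemma (this sketch is mine, not from Königsberger's text), one can check that the compound expression approaches $\exp w$ even when the complex parameter varies with $n$; the sequence $w_n = w + 1/n$ below is just one arbitrary example converging to $w$:

```python
import cmath

w = 1 + 2j                      # a fixed complex number
for n in (10, 100, 10_000):
    w_n = w + 1 / n             # an example sequence with w_n -> w
    approx = (1 + w_n / n) ** n
    print(n, abs(approx - cmath.exp(w)))   # error shrinks as n grows
```

The printed errors decrease roughly like $1/n$, consistent with the proof's bound on the first sum.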






                  answered Aug 20 '16 at 20:07









                  Santiago













                  up vote
                  9
                  down vote













Firstly, let us give a definition to the exponential function, so we know the function has various properties:

$$ \exp(x) := \sum_{n=0}^\infty \frac{x^n}{n!} $$

so that we can prove (as $\exp$ is a power series) that:

• The exponential function has radius of convergence $\infty$, and is thus defined on all of $\mathbb{R}$

• As a power series is infinitely differentiable inside its circle of convergence, the exponential function is infinitely differentiable on all of $\mathbb{R}$

• We can then prove that the function is strictly increasing, and thus by the inverse function theorem (http://en.wikipedia.org/wiki/Inverse_function_theorem) we can define what we know as the "log" function

Knowing all of this, here is hopefully a sufficiently rigorous proof (at least for positive $a$):



As $\log(x)$ is continuous and differentiable on $(0,\infty)$, we have that $\log(1+x)$ is continuous and differentiable on $[0,\frac{a}{n}]$, so by the mean value theorem we know there exists a $c \in [0,\frac{a}{n}]$ with

$$ f'(c) = \frac{\log(1+\frac{a}{n}) - \log(1)}{\frac{a}{n} - 0} $$
$$ \Longrightarrow \log\left[\left(1+\frac{a}{n}\right)^n\right] = \frac{a}{1+c} $$
$$ \Longrightarrow \left(1+\frac{a}{n}\right)^n = \exp\left(\frac{a}{1+c}\right) $$

for some $c \in [0,\frac{a}{n}]$. As we then want to take the limit as $n \rightarrow \infty$, we get that:

• As $c \in [0,\frac{a}{n}]$ and $\frac{a}{n} \rightarrow 0$ as $n \rightarrow \infty$, by the squeeze theorem we get that $c \rightarrow 0$ as $n \rightarrow \infty$

• As $c \rightarrow 0$ as $n \rightarrow \infty$, $\frac{a}{1+c} \rightarrow a$ as $n \rightarrow \infty$

• As the exponential function is continuous on $\mathbb{R}$, the limit can pass inside the function, so since $\frac{a}{1+c} \rightarrow a$ as $n \rightarrow \infty$,

$$ \exp\left(\frac{a}{1+c}\right) \rightarrow \exp(a) $$
as $n \rightarrow \infty$. Thus we can conclude that

$$ \lim_{n \to \infty} \left(1+\frac{a}{n}\right)^n = e^a $$

(Of course, this is ignoring that one needs to prove that $\exp(a)=e^a$, but this is hardly vital for this question.)
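The conclusion is easy to see numerically; the following sketch (illustrative only, with $a = 2$ chosen arbitrarily) shows $(1+a/n)^n$ closing in on $\exp(a)$ as $n$ grows:

```python
import math

a = 2.0
for n in (10, 1_000, 100_000):
    # compound-interest expression from the answer
    print(n, (1 + a / n) ** n)
print("exp(a) =", math.exp(a))
```

The error behaves like $e^a \cdot a^2/(2n)$, so each thousandfold increase in $n$ buys roughly three more correct digits.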




























                  • If we're just about to define the exponential function (or at least show that it equals something), it seems to me the assumption of its continuity is highly suspicious...
                    – DonAntonio
                    Apr 11 '13 at 23:36










                  • This is true - although I can't see how this proof is anything more than showing that the various definitions of the exponential function are equivalent, and thus I would presume continuity would have been proved before trying to prove statements such as this one (for example, in our lectures we defined it in terms of a power series, which means that we can prove it is continuous fairly straightforwardly)
                    – Andrew D
                    Apr 11 '13 at 23:39










                  • I agree with that, @Andrew D, but then perhaps mentioning some other definition from which continuity follows and then use that in it...perhaps too long a detour for a beginner, but absolutely possible indeed.
                    – DonAntonio
                    Apr 11 '13 at 23:42











                  • @DonAntonio The log's continuity assumption is just fine, though. Since $exp$ is its inverse, it is continuous.
                    – Pedro Tamaroff♦
                    Apr 11 '13 at 23:50










                  • Yeah, thankfully that is covered by the inverse function theorem (which I've now linked/discussed above, along with some other things)
                    – Andrew D
                    Apr 11 '13 at 23:51














                  edited Apr 11 '13 at 23:56

























                  answered Apr 11 '13 at 23:29









                  Andrew D












                  up vote
                  3
                  down vote













Another answer, assuming $x>0$:

Let $f(x)=\ln(x)$. Then we know that $f'(x)=1/x$. Also, by the definition of derivative, we can write
$$
\begin{align}
f'(x)&=\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}\\
&=\lim_{h\to 0}\frac{\ln(x+h)-\ln(x)}{h}\\
&=\lim_{h\to 0}\frac{1}{h}\ln\frac{x+h}{x}\\
&=\lim_{h\to 0}\ln\left(\frac{x+h}{x}\right)^{\frac{1}{h}}\\
&=\lim_{h\to 0}\ln\left(1+\frac{h}{x}\right)^{\frac{1}{h}}
\end{align}
$$
Then, using the fact that $\ln(x)$ is a continuous function for all $x$ in its domain, we can exchange the $\lim$ and $\ln$:
$$
f'(x)=\ln\lim_{h\to 0}\left(1+\frac{h}{x}\right)^{\frac{1}{h}}
$$
Now, let $m=1/h$. Then $m\to\infty$ as $h\to 0^+$, and
$$
f'(x)=\ln\lim_{m\to\infty}\left(1+\frac{1}{mx}\right)^{m}
$$
Now, assuming $x>0$, define $n=mx^2$, and so $n\to\infty$ as $m\to\infty$. Then we can write
$$
f'(x)=\ln\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^{n}\right]^{1/x^2}
$$
and from before, we still have $f'(x)=1/x$, so
$$
\ln\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^{n}\right]^{1/x^2}=\frac{1}{x}
$$
Exponentiating both sides, we find
$$
\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^{n}\right]^{1/x^2}=e^{1/x}
$$
Finally, raising both sides to the power $x^2$, we find
$$
\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^{n}=e^{x}
$$
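The key step above is that the difference quotient of $\ln$ at $x$, rewritten as $\ln\left(1+\frac{h}{x}\right)^{1/h}$, tends to $1/x$. A small numerical sketch (mine, not part of the answer; $x=3$ is an arbitrary choice) makes this visible:

```python
import math

x = 3.0
for h in (1e-1, 1e-3, 1e-6):
    # ln((1 + h/x)^(1/h)) is exactly the difference quotient of ln at x
    quotient = math.log((1 + h / x) ** (1 / h))
    print(h, quotient)          # tends to 1/x = 0.333...
```

The deviation from $1/x$ shrinks linearly in $h$, matching the second-order term $-h/(2x^2)$ of the logarithm's expansion.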






























                      answered Sep 25 '15 at 1:22









                      Mike Bell





















                          up vote
                          3
                          down vote














                          Aaah... The sweet sound of silent revenge downvotes... Always a pleasure!




                          Consider the functions $u$ and $v$ defined for every $|t|\lt\frac12$ by
                          $$
                          u(t)=t-\log(1+t),\qquad v(t)=t-t^2-\log(1+t).
                          $$
                          The derivative of $u$ is $u'(t)=\frac{t}{1+t}$, which has the sign of $t$, hence $u(t)\geqslant0$. The derivative of $v$ is $v'(t)=1-2t-\frac{1}{1+t}$, which has the sign of $(1+t)(1-2t)-1=-t(1+2t)$, which in turn has the sign of $-t$ on the domain $|t|\lt\frac12$, hence $v(t)\leqslant0$.
                          Thus:

                          For every $|t|\lt\frac12$,
                          $$
                          t-t^2\leqslant\log(1+t)\leqslant t.
                          $$

                          The function $z\mapsto\exp(nz)$ is nondecreasing on the same domain, hence
                          $$
                          \exp\left(nt-nt^2\right)\leqslant(1+t)^n\leqslant\exp\left(nt\right).
                          $$
                          In particular, using this for $t=x/n$, one gets:

                          For every $|x|<\frac12n$,
                          $$
                          \exp\left(x-\frac{x^2}{n}\right)\leqslant\left(1+\frac{x}{n}\right)^n\leqslant\mathrm e^x.
                          $$

                          Finally, $x^2/n\to 0$ when $n\to\infty$ and the exponential is continuous at $0$, hence we are done.

                          Facts/Definitions used:

                          • The logarithm has derivative $t\mapsto1/t$.

                          • The exponential is the inverse of the logarithm.
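The sandwich inequality above can be spot-checked numerically (a Python sketch; the chosen $n$ and sample points are illustrative):

```python
import math

# Verify exp(x - x^2/n) <= (1 + x/n)^n <= exp(x) whenever |x| < n/2,
# i.e. |t| = |x/n| < 1/2 as required by the derivation.
n = 100
for x in (-10.0, -1.0, 0.5, 7.0, 49.0):
    assert abs(x) < n / 2  # stay inside the proven domain
    middle = (1 + x / n) ** n
    assert math.exp(x - x * x / n) <= middle <= math.exp(x)
```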



























                          • We need to evangelize the use of $\leqslant$ and $\geqslant$ in MSE.
                            – Pedro Tamaroff♦
                            Aug 10 '13 at 4:11











                          • I used this in an application to lower bound $(1+x/n)^n$, thank you.
                            – JP McCarthy
                            Aug 16 '16 at 11:55










                          • Didier, I like this approach. So (+1). I was wondering if you've seen a way to establish the same lower bound for $\left(1+\frac xn\right)^n$ by using the limit definition of the exponential function and without appealing to calculus. The upper bound is trivial for $x>-n$.
                            – Mark Viola
                            Jan 5 '17 at 19:39











                          • @Dr.MV This reduces to showing $1+t\geqslant \exp(t-t^2)$, that is, $\frac{1}{1+t}\leqslant\exp(-t+t^2)$. What you call the trivial upper bound yields $\frac{1}{1+t}=1-\frac{t}{1+t}\leqslant\exp\left(-\frac{t}{1+t}\right)$, hence if $\frac{t}{1+t}\geqslant t-t^2$, we are done. This is asking that $t\geqslant (t-t^2)(1+t)=t(1-t^2)$, hence indeed we are done for every $t$ in $(0,1)$. This fails for $t$ negative but similar arguments might work.
                            – Did
                            Jan 8 '17 at 9:17















                          edited Aug 30 at 7:11

























                          answered Apr 12 '13 at 13:48









                          Did

                          243k23209444















                          up vote
                          1
                          down vote













                          $(1+x/n)^n = \sum_{k=0}^n \binom{n}{k}\frac{x^k}{n^k}$

                          Now just prove that $\binom{n}{k}\frac{x^k}{n^k}$ approaches $\frac{x^k}{k!}$ as $n$ approaches infinity, and you will have proven that your limit matches the Taylor series for $\exp(x)$.
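The termwise convergence claimed here is easy to check numerically (a Python sketch; the values of $x$, $n$, and the tolerance are illustrative):

```python
import math

# For fixed k, C(n,k) * x^k / n^k tends to x^k / k! as n grows,
# since C(n,k)/n^k = (1/k!) * (1 - 1/n)(1 - 2/n)...(1 - (k-1)/n).
x, n = 2.0, 10**6
for k in range(6):
    term = math.comb(n, k) * x**k / n**k
    assert abs(term - x**k / math.factorial(k)) < 1e-4
```

(As the comments note, this alone does not finish the proof: the number of terms grows with $n$, so exchanging the two limits needs a separate justification.)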
























                          • 5




                            This is not enough; there are infinitely many terms, so you need to show that you can swap two limits here.
                            – Qiaochu Yuan
                            Apr 11 '13 at 23:17






                          • 1




                            What you want to do is work with $\limsup$ and $\liminf$ here, and show $e^x\leq\liminf$ and $e^x\geq \limsup$
                            – Pedro Tamaroff♦
                            Apr 11 '13 at 23:53










                          • How would you show that you can swap the two limits?
                            – amarney
                            Mar 26 '17 at 22:54














                          edited Apr 12 '13 at 0:12

























                          answered Apr 11 '13 at 23:07









                          Three

                          5472511














                          up vote
                          1
                          down vote













                          For any fixed value of $x$, define

                          $$f(u)={\ln(1+ux)\over u}$$

                          By L'Hopital's Rule,

                          $$\lim_{u\rightarrow0^+}f(u)=\lim_{u\rightarrow0^+}{x/(1+ux)\over1}=x$$

                          Now exponentiate $f$:

                          $$e^{f(u)}=(1+ux)^{1/u}$$

                          By continuity of the exponential function, we have

                          $$\lim_{u\rightarrow0^+}(1+ux)^{1/u}=\lim_{u\rightarrow0^+}e^{f(u)}=e^{\lim_{u\rightarrow0^+}f(u)}=e^x$$

                          All these limits have been shown to exist for the (positive) real variable $u$ tending to $0$, hence they must exist, and be the same, for the sequence of reciprocals of integers, $u=1/n$, as $n$ tends to infinity, and the result follows:

                          $$\lim_{n\rightarrow\infty}\left(1+{x\over n}\right)^n = e^x$$
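A numerical illustration of the continuous-variable limit used here (a Python sketch; the sample $x$, the sequence of $u$ values, and the tolerance are illustrative):

```python
import math

# As u -> 0+, (1 + u*x)^(1/u) approaches e^x; the error shrinks
# roughly like x^2 * u / 2 times e^x.
x = 1.5
for u in (1e-3, 1e-5, 1e-7):
    assert abs((1 + u * x) ** (1 / u) - math.exp(x)) < 1e-2
```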






























                              answered Aug 10 '13 at 3:23









                              Barry Cipra

                              56.9k652120
























                                  up vote
                                  0
                                  down vote













                                  This is one of the ways in which it is defined. The equivalence of the definitions can be proved easily, I guess.
                                  If for example you take the exponential function to be the inverse of the logarithm:

                                  $\log(\lim_n(1 + \frac xn)^n) = \lim_n n \log(1 + \frac xn) = \lim_n n \cdot\left[\frac xn - \frac{x^2}{2n^2} + \dots\right] = x$

                                  EDIT: The logarithm is defined as usual: $\log x = \int_1^x \frac{dt}{t}$. The first identity follows from the continuity of the logarithm, the second is just an application of one of the properties of the logarithm ($\log a^b = b \log a$), while to obtain the third it suffices to have the Taylor expansion of $\log(1+x)$.
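The middle step, $n\log(1+x/n)\to x$, is easy to verify numerically (a Python sketch; the sample $x$, values of $n$, and tolerance are illustrative):

```python
import math

# n * log(1 + x/n) -> x; the Taylor expansion shows the error
# is about x^2 / (2n), so it halves in order as n grows tenfold.
x = 3.0
for n in (10**3, 10**6):
    assert abs(n * math.log(1 + x / n) - x) < 0.01
```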




























                                  • The very first equality requires, me believes, a justification that I cannot see as very easy unless we already assume quite a bit (say, continuity...). After that things get even tougher as we need power series and then also, apparently, differentiation.
                                    – DonAntonio
                                    Apr 11 '13 at 23:40






                                  • 2




                                    The logarithm is defined as $\int_1^x \frac{dt}{t}$, therefore, if we have integration we can also have continuity and differentiation, I suppose.
                                    – user67133
                                    Apr 11 '13 at 23:45











                                  • Perhaps so and also perhaps mentioning this could clear things out a little, since we don't know, apparently, what the OP's background is.
                                    – DonAntonio
                                    Apr 11 '13 at 23:47






                                  • 1




                                    I cannot but totally agree. Thank you for your suggestions, I am going to edit the post to make it clearer!
                                    – user67133
                                    Apr 12 '13 at 0:07














                                  edited Apr 12 '13 at 0:16

























                                  answered Apr 11 '13 at 23:08







                                  user67133


















                                  • The very first equality requires, me believes, a justification that I cannot see as very easy unless we already assume quite a bit (say, continuity...). After that things get even tougher as we need power series and then also, apparently, differentiation.
                                    – DonAntonio
                                    Apr 11 '13 at 23:40






                                  • 2




                                    The logarithm is defined as $int_1^x fracdtt$, therefore, if we have integration we can also have continuity and differentiation, I suppose.
                                    – user67133
                                    Apr 11 '13 at 23:45











                                  • Perhaps so and also perhaps mentioning this could clear things out a little, since we don't know, apparently, what the OP's background is.
                                    – DonAntonio
                                    Apr 11 '13 at 23:47






                                  • 1




                                    I cannot but totally agree. Thank you for your suggestions, I am going to edit the post to make it clearer!
                                    – user67133
                                    Apr 12 '13 at 0:07
















                                  • The very first equality requires, me believes, a justification that I cannot see as very easy unless we already assume quite a bit (say, continuity...). After that things get even tougher as we need power series and then also, apparently, differentiation.
                                    – DonAntonio
                                    Apr 11 '13 at 23:40






                                  • 2




                                    The logarithm is defined as $int_1^x fracdtt$, therefore, if we have integration we can also have continuity and differentiation, I suppose.
                                    – user67133
                                    Apr 11 '13 at 23:45











                                  • Perhaps so and also perhaps mentioning this could clear things out a little, since we don't know, apparently, what the OP's background is.
                                    – DonAntonio
                                    Apr 11 '13 at 23:47






                                  • 1




                                    I cannot but totally agree. Thank you for your suggestions, I am going to edit the post to make it clearer!
                                    – user67133
                                    Apr 12 '13 at 0:07















                                  The very first equality requires, me believes, a justification that I cannot see as very easy unless we already assume quite a bit (say, continuity...). After that things get even tougher as we need power series and then also, apparently, differentiation.
                                  – DonAntonio
                                  Apr 11 '13 at 23:40




                                  The very first equality requires, me believes, a justification that I cannot see as very easy unless we already assume quite a bit (say, continuity...). After that things get even tougher as we need power series and then also, apparently, differentiation.
                                  – DonAntonio
                                  Apr 11 '13 at 23:40




                                  2




                                  2




                                  The logarithm is defined as $int_1^x fracdtt$, therefore, if we have integration we can also have continuity and differentiation, I suppose.
                                  – user67133
                                  Apr 11 '13 at 23:45





                                  The logarithm is defined as $int_1^x fracdtt$, therefore, if we have integration we can also have continuity and differentiation, I suppose.
                                  – user67133
                                  Apr 11 '13 at 23:45













                                  Perhaps so and also perhaps mentioning this could clear things out a little, since we don't know, apparently, what the OP's background is.
                                  – DonAntonio
                                  Apr 11 '13 at 23:47




                                  Perhaps so and also perhaps mentioning this could clear things out a little, since we don't know, apparently, what the OP's background is.
                                  – DonAntonio
                                  Apr 11 '13 at 23:47




                                  1




                                  1




                                  I cannot but totally agree. Thank you for your suggestions, I am going to edit the post to make it clearer!
                                  – user67133
                                  Apr 12 '13 at 0:07




                                  I cannot but totally agree. Thank you for your suggestions, I am going to edit the post to make it clearer!
                                  – user67133
                                  Apr 12 '13 at 0:07























                                  There is at most one function $g$ on $\mathbb{R}$ such that
                                  $$g'(x)=g(x)\ \text{ for all } x \text{ in } \mathbb{R} \quad\text{and}\quad g(0)=1\,.$$
                                  If you let $f_n(x)=(1+x/n)^n$ and can show that it converges compactly to some function $f$, then you can show that $f'(x)=f(x)$ and $f(0)=1$. Likewise, if you take $f_n(x)=\sum_{k=0}^n x^k/k!$ and show that this sequence converges compactly, you can show that its limit satisfies the same conditions. Thus it doesn't matter which definition you use. The uniqueness criterion is what you should probably have in mind when you think of "the exponential".
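
A minimal numerical illustration of this equivalence (assuming nothing beyond floating-point arithmetic): both defining sequences, evaluated at the same point, head toward the same value, though the Taylor partial sums converge much faster than the compound-interest sequence.

```python
import math

# Evaluate both defining sequences at x = 1 and compare with math.exp(1).
# The Taylor partial sum reaches machine precision quickly, while the
# compound-interest sequence's error decays only like 1/n.
x, n = 1.0, 50
compound = (1 + x / n) ** n
taylor = sum(x ** k / math.factorial(k) for k in range(n + 1))
print(compound, taylor, math.exp(x))
```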






                                      answered Dec 15 '17 at 15:21
                                      Robert Wolfe
                                          protected by John Ma Nov 11 '17 at 4:03


