Does the gradient theorem generalize to vector-valued potentials?

The gradient theorem states that if $F$ is a differentiable scalar-valued function on an $n$-dimensional vector space, then the gradient of $F$ is a conservative vector field. Line integrals through this field are path-independent, meaning that any line integral through $F'$ that starts at point $a$ and ends at point $b$ evaluates to the scalar $F(b) - F(a)$.



Does this result generalize to cases where we start with a vector field? Say $G$ is a vector field and $G'$ is its Jacobian matrix: is the matrix-valued field defined by $G'$ conservative in roughly the same sense? Does any line integral through that field starting at $a$ and ending at $b$ now evaluate to the vector $G(b) - G(a)$?
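To make this precise, the line integral I have in mind applies the Jacobian to the tangent vector and integrates the resulting vectors (here $\gamma$ is any piecewise-smooth curve with $\gamma(0) = a$ and $\gamma(1) = b$):

$$\int_\gamma G' \, dr \;=\; \int_0^1 G'(\gamma(t)) \, \gamma'(t) \, dt,$$

where $G'(\gamma(t)) \, \gamma'(t)$ is an ordinary matrix-vector product, so the integrand, and hence the integral, is a vector in the codomain of $G$.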



I feel this must be true because you can just argue component-wise, treating each component of the output of $G$ as a scalar function and considering the vector field defined by the corresponding row of $G'$. But I'm not certain that actually works out, or whether there are additional conditions that must hold.
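Spelled out, the component-wise argument I have in mind goes like this, assuming each component of $G = (G_1, \dots, G_m)$ is continuously differentiable. The $i$-th row of the Jacobian $G'$ is the gradient $\nabla G_i$, so the scalar gradient theorem applies row by row:

$$\left[ \int_\gamma G' \, dr \right]_i \;=\; \int_\gamma \nabla G_i \cdot dr \;=\; G_i(b) - G_i(a), \qquad i = 1, \dots, m,$$

and stacking the $m$ components would give the vector $G(b) - G(a)$, independent of the path.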



For example, must $G$ itself be a conservative vector field? Or, to put it differently, must $G'$ be a Hessian matrix?
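As a numerical sanity check, here is a small script (my own hypothetical example, with $G$, the paths, and all names chosen for illustration) that integrates the Jacobian of a sample $G$ along two different paths from $a$ to $b$ and compares the results with $G(b) - G(a)$. The sample $G$ is deliberately not conservative (its Jacobian is not symmetric), to probe whether that matters.

```python
# Numerically integrate J(gamma(t)) @ gamma'(t) over t in [0, 1] for two
# different paths from a to b, and compare with G(b) - G(a).
# G(x, y) = (x^2 + y, x*y) is NOT conservative: dG1/dy = 1 != y = dG2/dx.

def G(x, y):
    return (x * x + y, x * y)

def jacobian(x, y):
    # Rows are the gradients of the components of G.
    return ((2 * x, 1.0),  # gradient of x^2 + y
            (y, x))        # gradient of x*y

def line_integral(path, n=100_000):
    """Midpoint-rule integral of J(gamma(t)) @ gamma'(t) dt over [0, 1],
    with gamma'(t) approximated by a forward finite difference."""
    total = [0.0, 0.0]
    dt = 1.0 / n
    h = 1e-7
    for k in range(n):
        t = (k + 0.5) * dt
        x, y = path(t)
        x1, y1 = path(t + h)
        dx, dy = (x1 - x) / h, (y1 - y) / h
        J = jacobian(x, y)
        total[0] += (J[0][0] * dx + J[0][1] * dy) * dt
        total[1] += (J[1][0] * dx + J[1][1] * dy) * dt
    return total

a, b = (0.0, 0.0), (1.0, 2.0)
straight = lambda t: (t, 2 * t)          # straight line from a to b
curved = lambda t: (t ** 2, 2 * t ** 3)  # a different path from a to b
print(line_integral(straight))  # both should approximate
print(line_integral(curved))    # G(b) - G(a) = (3.0, 2.0)
```

Both integrals come out (numerically) equal to $G(b) - G(a) = (3, 2)$ even though this $G$ is not conservative, which is what prompted my final question above.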







      edited Aug 27 at 13:57

























      asked Aug 26 at 9:15









      senderle
