Complex Least Squares With Magnitude Equality Constraints

For $\mathbf{x} \in \mathbb{C}^N$, I'd like to solve the following problem:

$$
\mathbf{x}^\ast = \arg \min_{\mathbf{x}} \Vert \mathbf{A}\mathbf{x} - \mathbf{b} \Vert_2 \quad \mathrm{s.t.} \quad \vert x_i \vert = a_i, \quad i = 0, \dots, N-1,
$$

where $a_i \in \mathbb{R}$. The above is a least-squares problem in which the magnitude of each element of $\mathbf{x}$ is fixed and only its phase may vary.



Can anyone point me in the direction of how to solve this? I have tried adding the equality constraints as a penalty term to the cost function, but had no success. Though I have not found anything yet, I am hoping this is a well-studied problem with a known solution.
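A minimal Python/NumPy sketch of such a penalized formulation is shown below; the problem sizes, the random data, and the penalty weight are assumptions chosen purely for illustration.

    # Illustrative setup for the constrained problem and the penalized
    # surrogate f(x) = 0.5*||A x - b||^2 + 0.5*lam*|| |x| - a ||^2, x complex.
    import numpy as np

    rng = np.random.default_rng(0)
    M, N = 8, 4                                  # assumed sizes for the example
    A = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))
    b = rng.standard_normal(M) + 1j * rng.standard_normal(M)
    a = np.abs(rng.standard_normal(N))           # prescribed magnitudes a_i

    def penalized_objective(x, lam):
        misfit = 0.5 * np.linalg.norm(A @ x - b) ** 2
        violation = 0.5 * lam * np.linalg.norm(np.abs(x) - a) ** 2
        return misfit + violation

    # Any x of the form a_i * exp(1j * phi_i) satisfies the constraints exactly,
    # so only the phases phi are free to reduce the data-fit term.
    phi = rng.uniform(0.0, 2.0 * np.pi, N)
    x_feasible = a * np.exp(1j * phi)
    print(penalized_objective(x_feasible, lam=10.0))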



Thanks for any help you can provide.







edited May 12 '16 at 21:24

asked May 12 '16 at 3:21

AnonSubmitter85

  • Interesting problem. No idea how to solve it. :-) It's non-convex and in a particularly difficult manner. (For instance, if $x$ were real, one could search the $2^N$ combinations exhaustively at least.)
    – Michael Grant
    May 12 '16 at 21:30
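For the real-valued case mentioned in the comment, the constraint $\vert x_i \vert = a_i$ leaves only the sign of each entry free, so the $2^N$ sign patterns can be enumerated when $N$ is small. A minimal sketch, with random data chosen only for the example:

    # Exhaustive search over sign patterns for the real-valued variant.
    import itertools
    import numpy as np

    rng = np.random.default_rng(1)
    M, N = 6, 4                       # keep N small: the search costs O(2^N)
    A = rng.standard_normal((M, N))
    b = rng.standard_normal(M)
    a = np.abs(rng.standard_normal(N))

    best_x, best_val = None, np.inf
    for signs in itertools.product((-1.0, 1.0), repeat=N):
        x = np.asarray(signs) * a     # a feasible real candidate, |x_i| = a_i
        val = np.linalg.norm(A @ x - b)
        if val < best_val:
            best_x, best_val = x, val

    print(best_x, best_val)

The exponential cost is exactly why this kind of brute force only works as a reference for small problems.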
















1 Answer
Really nice and interesting question.

I tried, at first, solving it over the real domain.

A brute-force solution is my reference, and I tried working with the following cost function:



$$ \arg \min_{x} f \left( x \right) = \arg \min_{x} \frac{1}{2} \left\| A x - b \right\|_2^2 + \frac{\lambda}{2} \left\| \operatorname{abs} \left( x \right) - a \right\|_2^2 $$



where $\operatorname{abs} \left( x \right)$ is applied element-wise.



The derivative is given by:



$$ \frac{d}{d x} f \left( x \right) = A^T \left( A x - b \right) + \lambda \operatorname{sign} \left( x \right) \left( \operatorname{abs} \left( x \right) - a \right) $$
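This expression can be checked numerically with central differences, away from points where some $x_i = 0$ (there $\operatorname{abs}$ is not differentiable); the data below are illustrative assumptions, and `f` and `grad_f` are local helpers for this sketch.

    # Finite-difference check of the stated gradient (the last term is
    # element-wise: lam * sign(x_i) * (|x_i| - a_i)).
    import numpy as np

    rng = np.random.default_rng(3)
    M, N = 10, 4
    A = rng.standard_normal((M, N))
    b = rng.standard_normal(M)
    a = np.abs(rng.standard_normal(N))
    lam = 5.0

    def f(x):
        return 0.5 * np.linalg.norm(A @ x - b) ** 2 \
             + 0.5 * lam * np.linalg.norm(np.abs(x) - a) ** 2

    def grad_f(x):
        return A.T @ (A @ x - b) + lam * np.sign(x) * (np.abs(x) - a)

    x0 = rng.standard_normal(N)
    eps = 1e-6
    num = np.array([(f(x0 + eps * e) - f(x0 - eps * e)) / (2 * eps)
                    for e in np.eye(N)])
    print(np.max(np.abs(num - grad_f(x0))))   # should be very small (~1e-8)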



I tried gradient descent, raising the value of $\lambda$ at each iteration.

It did not work too badly, but even the sign of the solution was not consistent with the optimal solution.



My intermediate code is given here.
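A minimal sketch of this scheme (gradient descent on the penalized real-domain cost, with $\lambda$ raised every iteration) is given below; the step-size rule, the $\lambda$ schedule, and the test data are assumptions made for illustration, not the linked code.

    # Penalty-method gradient descent: lam grows each iteration, and the step
    # shrinks accordingly so the penalty term does not destabilize the update.
    import numpy as np

    rng = np.random.default_rng(2)
    M, N = 20, 5
    A = rng.standard_normal((M, N))
    b = rng.standard_normal(M)
    a = np.abs(rng.standard_normal(N))

    L_A = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the data-fit gradient
    x = rng.standard_normal(N)        # initial guess
    lam = 1.0
    for _ in range(500):
        grad = A.T @ (A @ x - b) + lam * np.sign(x) * (np.abs(x) - a)
        x = x - grad / (L_A + lam)    # step of size 1 / (L_A + lam)
        lam *= 1.02                   # raise the penalty weight

    print(np.abs(x) - a)              # violation of the magnitude constraints
    print(np.linalg.norm(A @ x - b))  # residual of the least-squares term

Shrinking the step as $\lambda$ grows is one way to keep the iteration stable; a fixed step with a capped $\lambda$ is another reasonable choice.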



I will try another two approaches:

  1. An Iterative Least Squares-like approach.

  2. An Orthogonal Matching Pursuit (OMP)-like approach.





edited Aug 16 at 7:35

answered Sep 9 '17 at 9:37

Royi