What is the difference between projected gradient descent and ordinary gradient descent?
I just read about projected gradient descent, but I did not see the intuition for using the projected version instead of ordinary gradient descent. Would you tell me the reason for it, and the situations where projected gradient descent is preferable? What does the projection contribute?
optimization numerical-optimization gradient-descent
asked Nov 17 '13 at 22:47 by erogol, edited Jul 23 '16 at 3:01 by Rodrigo de Azevedo
2 Answers
Accepted answer (43 votes)
answered Nov 19 '13 at 1:45 by p.s., edited Nov 20 '13 at 0:48
At a basic level, projected gradient descent is just a more general method for solving a more general problem.
Gradient descent minimizes a function by moving in the negative gradient direction at each step. There is no constraint on the variable.
$$
\text{Problem 1:} \quad \min_x f(x)
$$
$$
x_{k+1} = x_k - t_k \nabla f(x_k)
$$
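For concreteness, here is a minimal sketch of this update in Python; the quadratic objective, step size, and iteration count are illustrative assumptions, not part of the original answer:

```
import numpy as np

# Plain gradient descent: repeat x_{k+1} = x_k - t_k * grad f(x_k).
# Illustrated on the toy objective f(x) = 0.5 * ||x - b||^2, whose
# gradient is x - b (an assumed example, chosen for simplicity).

def gradient_descent(grad_f, x0, step=0.1, num_iters=100):
    x = x0
    for _ in range(num_iters):
        x = x - step * grad_f(x)   # unconstrained update
    return x

b = np.array([3.0, -1.0])
grad_f = lambda x: x - b

print(gradient_descent(grad_f, x0=np.zeros(2)))  # converges to b = [3, -1]
```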
On the other hand, projected gradient descent minimizes a function subject to a constraint. At each step we move in the direction of the negative gradient, and then "project" onto the feasible set.
$$
\text{Problem 2:} \quad \min_x f(x) \text{ subject to } x \in C
$$
$$
y_{k+1} = x_k - t_k \nabla f(x_k) \\
x_{k+1} = \arg\min_{x \in C} \|y_{k+1} - x\|
$$
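The projection is itself an optimization problem, but for many sets it has a closed form. Here is a minimal sketch assuming $C$ is the box $[0,1]^n$, where the Euclidean projection is just coordinate-wise clipping; the objective and step size are the same illustrative assumptions as above:

```
import numpy as np

# Projected gradient descent: take a gradient step, then map the
# result to the nearest point of the feasible set C. For the box
# C = [0, 1]^n, that nearest point is obtained by clipping each
# coordinate (an assumed example of a set with an easy projection).

def project_onto_box(y, lo=0.0, hi=1.0):
    return np.clip(y, lo, hi)      # arg min_{x in C} ||y - x|| for a box

def projected_gradient_descent(grad_f, project, x0, step=0.1, num_iters=100):
    x = x0
    for _ in range(num_iters):
        y = x - step * grad_f(x)   # y_{k+1} = x_k - t_k * grad f(x_k)
        x = project(y)             # x_{k+1} = projection of y_{k+1} onto C
    return x

b = np.array([3.0, -1.0])
grad_f = lambda x: x - b           # gradient of f(x) = 0.5 * ||x - b||^2

print(projected_gradient_descent(grad_f, project_onto_box, x0=np.zeros(2)))
# converges to [1, 0]: b lies outside C, so the constrained solution
# is the point of [0, 1]^2 closest to b.
```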
pretty good answer thanks – erogol, Nov 19 '13 at 19:43
sexy answer, sir – Enlightened One, Jul 23 '16 at 3:02
simplicity is far under-appreciated. Thank you for this answer. – Kai, Jun 1 at 17:08
Answer (-1 votes)
answered Sep 3 at 10:30 by explorer, edited Sep 3 at 10:57 by amWhy
The previous answer is perfect. I would like to add one thing: imagine you have to optimize a convex function subject to a non-convex constraint. Gradient descent still helps, but at every iteration we must make sure the solution does not leave the constraint domain, so at each step we project the iterate back onto the constraint set and repeat until convergence is reached. In such situations this method is extremely useful.
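As a minimal sketch of this non-convex case, take the unit sphere $\|x\| = 1$: a non-convex set whose projection is simple normalization. The objective and parameters are illustrative assumptions, and with a non-convex constraint set there is no general convergence guarantee:

```
import numpy as np

# Projected gradient descent with a non-convex constraint set: the
# unit sphere ||x|| = 1. Projecting y onto the sphere is y / ||y||
# for y != 0 (for y = 0 every point of the sphere is equally close,
# so we pick one arbitrarily). Purely illustrative; convergence is
# not guaranteed in general for non-convex sets.

def project_onto_sphere(y):
    n = np.linalg.norm(y)
    if n == 0.0:
        e = np.zeros_like(y)
        e[0] = 1.0
        return e
    return y / n

b = np.array([3.0, -1.0])
grad_f = lambda x: x - b           # gradient of the convex f(x) = 0.5 * ||x - b||^2

x = np.array([0.0, 1.0])           # start on the constraint set
for _ in range(100):
    x = project_onto_sphere(x - 0.1 * grad_f(x))  # step, then project back

print(x)  # approaches b / ||b||, the closest point of the sphere to b
```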