Switching between ConjugateGradient, BFGS, LBFGS, Newton et al


Switching between ConjugateGradient, BFGS, LBFGS, Newton et al

Colin Beckingham
In a question on StackOverflow at http://stackoverflow.com/questions/32703119/logistic-regression-in-julia-using-optim-jl, a user asks about logistic regression and gets a couple of interesting answers: the first (currently) uses currying and closures, and the second uses automatic differentiation. Both are good answers from a beginner's point of view, so I did some tweaking to see what would happen.

My goal was to see how the different methods would affect the result, so I tried switching between the methods that seemed familiar. I found that while the second solution would accept just about any method, using anything but ConjugateGradient in the first produced a frightening amount of red ink in the REPL. Where should I look for guidance on which method fits which approach, or is it just a matter of trial and error?

--
You received this message because you are subscribed to the Google Groups "julia-opt" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [hidden email].
Visit this group at https://groups.google.com/group/julia-opt.
For more options, visit https://groups.google.com/d/optout.

Re: Switching between ConjugateGradient, BFGS, LBFGS, Newton et al

John Myles White
I think these concerns are orthogonal.

For statistical models, you need closures because the true mathematical function is log_likelihood(data, parameters), but you want to optimize a function that depends only on the parameters and holds the data fixed.
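A minimal sketch of that pattern, with a toy dataset and hypothetical names (`make_negloglik`, `sigmoid` are not from the thread): the data `X`, `y` are captured by a closure, so the returned function depends only on the parameter vector `β`.

```julia
sigmoid(z) = 1 / (1 + exp(-z))

function make_negloglik(X, y)
    # X and y are captured here; the returned function is β-only,
    # which is the shape Optim's optimizers expect.
    return β -> begin
        p = sigmoid.(X * β)
        -sum(y .* log.(p) .+ (1 .- y) .* log.(1 .- p))
    end
end

X = [1.0 2.0; 1.0 3.0; 1.0 4.0; 1.0 5.0]  # toy design matrix (assumed)
y = [0.0, 1.0, 0.0, 1.0]                  # toy labels (assumed)
negloglik = make_negloglik(X, y)

negloglik(zeros(2))  # at β = 0 every p is 0.5, so this is 4 * log(2)
```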

To optimize that function efficiently you need its gradient, which you can compute using automatic differentiation if you don't have a reliable implementation of the analytic gradient.
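A sketch of what that buys you, assuming Optim.jl is installed and using a toy smooth objective in place of a real negative log-likelihood: with `autodiff = :forward`, Optim derives the gradient (and, for Newton, the Hessian) itself, so the same β-only objective can be handed to any of these methods unchanged.

```julia
using Optim

# Toy quadratic standing in for a negative log-likelihood (assumed);
# its minimizer is [1.0, -0.5].
f(β) = (β[1] - 1.0)^2 + 2.0 * (β[2] + 0.5)^2

for method in (ConjugateGradient(), BFGS(), LBFGS(), Newton())
    # autodiff = :forward supplies derivatives via ForwardDiff, so no
    # hand-written gradient or Hessian is needed for any of these.
    res = optimize(f, zeros(2), method; autodiff = :forward)
    println(nameof(typeof(method)), " => ", Optim.minimizer(res))
end
```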

 -- John

On Saturday, September 10, 2016 at 11:50:02 AM UTC-7, Colin Beckingham wrote:
In a question on StackOverflow at http://stackoverflow.com/questions/32703119/logistic-regression-in-julia-using-optim-jl, a user asks about logistic regression and gets a couple of interesting answers: the first (currently) uses currying and closures, and the second uses automatic differentiation. Both are good answers from a beginner's point of view, so I did some tweaking to see what would happen.

My goal was to see how the different methods would affect the result, so I tried switching between the methods that seemed familiar. I found that while the second solution would accept just about any method, using anything but ConjugateGradient in the first produced a frightening amount of red ink in the REPL. Where should I look for guidance on which method fits which approach, or is it just a matter of trial and error?
