Can someone point me to the best place to do multinomial logistic regression in Julia? I've tried with https://github.com/lindahua/Regression.jl but I am not getting the right results according to my simulation, I'm not sure how to interpret the output from that package (documentation is rather lacking), and I am not getting any responses on their issues (https://github.com/lindahua/Regression.jl/issues/14).
-- You received this message because you are subscribed to the Google Groups "julia-stats" group. To unsubscribe from this group and stop receiving emails from it, send an email to [hidden email]. For more options, visit https://groups.google.com/d/optout.
I think that Dahua doesn't have as much time for Julia package development as he used to have. I'm not aware of other implementations but there could easily be some that I don't know of. For now, it will probably be easiest to roll your own version or fork Dahua's package. On Fri, Mar 25, 2016 at 12:51 PM, Benjamin Deonovic <[hidden email]> wrote:
Hi Benjamin,
You can use the scikit-learn version via ScikitLearn.jl if you don't mind the (moderate) cost of sending your data to Python.
It was released recently; don't hesitate to file an issue if you run into a problem with it. Cédric
On Friday, March 25, 2016 at 12:58:43 PM UTC-4, Andreas Noack wrote:
I got the following error:
On Sunday, March 27, 2016 at 12:34:13 PM UTC-5, Cedric St-Jean wrote:
I figured out I just needed to change the solver from "liblinear" to "lbfgs".
On Wednesday, April 6, 2016 at 4:30:27 PM UTC-5, Benjamin Deonovic wrote:
How do I extract the coefficient values after I have fit the model?
On Sunday, March 27, 2016 at 12:34:13 PM UTC-5, Cedric St-Jean wrote:
All Python attributes are accessible with `model[:attribute_name]`; `model[:coef_]` and `model[:intercept_]` should work.
On Wednesday, April 6, 2016 at 5:40:31 PM UTC-4, Benjamin Deonovic wrote:
Great, thanks! I'm having trouble reproducing some R results for multinomial regression. Could you take a look and see whether I'm doing things right with the ScikitLearn.jl package? http://stackoverflow.com/questions/36463072/julia-multinomial-regression-with-time-series-lagged-values
On Wednesday, April 6, 2016 at 5:06:09 PM UTC-5, Cedric St-Jean wrote:
The ScikitLearn.jl call looks OK. Some ideas:
- scikit-learn's regression is regularized by default. You can disable it with LogisticRegression(C=1e9).
- You convert y to Vector{Int64}; you might want to try Vector{Float64}.
- Isn't multinomial logistic regression non-identifiable under certain parametrizations? I would try computing the output (i.e. predict(model, X)) in both models and see if they differ.

Otherwise, trying a third package might be useful in figuring out which one is wrong. Cédric
On Wednesday, April 6, 2016 at 6:28:57 PM UTC-4, Benjamin Deonovic wrote:
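[Editor's note: a stdlib-only Python sketch, not from the thread, illustrating the first point above. The model names and data are made up; the point is that an L2 penalty (strength 1/C in scikit-learn's convention) pulls the fitted coefficient toward zero, which is one reason scikit-learn's defaults won't reproduce an unpenalized R fit.]

```python
import math

# Toy 1-D logistic regression: minimize the negative log-likelihood
# plus an L2 penalty. scikit-learn's penalty strength is 1/C, so a
# huge C is effectively no regularization.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [0, 1, 0, 1]  # not linearly separable, so the unpenalized optimum is finite

def loss(w, C):
    nll = sum(math.log(1.0 + math.exp(-(2 * y - 1) * w * x))
              for x, y in zip(xs, ys))
    return nll + w * w / (2.0 * C)

def fit(C):
    # Crude grid search over w; enough to show the shrinkage effect.
    grid = [i / 1000.0 for i in range(-5000, 5001)]
    return min(grid, key=lambda w: loss(w, C))

w_unpenalized = fit(C=1e9)  # like LogisticRegression(C=1e9)
w_penalized = fit(C=0.1)    # small C => strong penalty
print(abs(w_penalized) < abs(w_unpenalized))  # True: regularization shrinks w
```

With matching (un)regularization on both sides, the coefficient estimates become comparable across packages.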
Note that not all of the vectors of coefficients are uniquely identifiable. This is due to the fact that all probabilities must sum to 1, making one of them completely determined once all the rest are known. As a result there are only K − 1 separately specifiable probabilities, and hence K − 1 separately identifiable vectors of coefficients. One way to see this is to note that if we add a constant vector to all of the coefficient vectors, the resulting probabilities are identical, since the added term cancels from the numerator and denominator. (From Wikipedia.) So it's expected that the coefficients are different, even if the predictions are the same. What do you want to use the coefficients for?
On Wednesday, April 6, 2016 at 7:09:09 PM UTC-4, Cedric St-Jean wrote:
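[Editor's note: a short stdlib-only Python sketch, not from the thread, demonstrating the invariance described above: adding one constant vector to every class's coefficient vector leaves the softmax probabilities unchanged.]

```python
import math

def softmax_probs(betas, x):
    # P(Y = k) for a multinomial logit with one coefficient vector per class
    scores = [sum(b * v for b, v in zip(beta, x)) for beta in betas]
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

x = [1.0, -0.5, 2.0]
betas = [[0.2, 1.0, -0.3], [0.5, -0.7, 0.1], [-0.4, 0.3, 0.9]]
c = [10.0, -3.0, 0.5]  # arbitrary constant vector added to every class
shifted = [[b + ci for b, ci in zip(beta, c)] for beta in betas]

p1 = softmax_probs(betas, x)
p2 = softmax_probs(shifted, x)
print(all(abs(a - b) < 1e-9 for a, b in zip(p1, p2)))  # True
```

The shift adds the same c·x to every score, so it cancels in the softmax ratio; this is why two correct fits can report very different coefficients yet identical predictions.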
Thanks, I don't need the coefficients but the estimated probabilities, (in Wikipedia's notation) $P(Y_i = k) = \exp(X_i \beta_k)/(1 + \sum_u \exp(X_i \beta_u))$. These should be identifiable, correct?
On Wednesday, April 6, 2016 at 6:19:38 PM UTC-5, Cedric St-Jean wrote:
Yeah. Use `predict_proba(model, X)` to get them. On Thu, Apr 7, 2016 at 3:00 PM, Benjamin Deonovic <[hidden email]> wrote:
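[Editor's note: a stdlib-only Python sketch, not from the thread, checking the claim above. The reference-category formula quoted earlier, $P(Y_i = k) = \exp(X_i\beta_k)/(1 + \sum_u \exp(X_i\beta_u))$, gives exactly the probabilities of a full softmax in which the baseline class's coefficient vector is fixed at zero; the example data are invented.]

```python
import math

def reference_category_probs(x, betas):
    # P(Y = k) = exp(x.beta_k) / (1 + sum_u exp(x.beta_u)),
    # with class 0 as the baseline (its coefficient vector fixed at zero)
    scores = [sum(b * v for b, v in zip(beta, x)) for beta in betas]
    denom = 1.0 + sum(math.exp(s) for s in scores)
    return [1.0 / denom] + [math.exp(s) / denom for s in scores]

def softmax_probs(x, betas):
    # full softmax over all classes, baseline included with beta = 0
    scores = [0.0] + [sum(b * v for b, v in zip(beta, x)) for beta in betas]
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

x = [1.0, 2.0]
betas = [[0.3, -0.2], [-0.5, 0.4]]  # coefficients for non-baseline classes

p_ref = reference_category_probs(x, betas)
p_soft = softmax_probs(x, betas)
print(all(abs(a - b) < 1e-12 for a, b in zip(p_ref, p_soft)))  # True
```

So the probabilities are identifiable even though the coefficients are not, and `predict_proba` is the right thing to compare across packages.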
Thanks Cedric, you've been a great help. It looks like all of the predicted probabilities are the same across the different implementations (R, Regression.jl, and ScikitLearn.jl). Thanks for taking the time to help me with this.
On Thursday, April 7, 2016 at 2:06:24 PM UTC-5, Cedric St-Jean wrote: