testing linear hypothesis on coefficients and glm

6 messages

testing linear hypothesis on coefficients and glm

 Hi,

My searches for how to perform a linear hypothesis test after a `glm` call turned up nothing (I can't seem to find any documentation of GLM.jl other than the README.md on GitHub). Say I estimate

`glm(y ~ x1 + x2)`

Is there a canned command I can use to perform a (Wald) test of, e.g., H0: x1 = x2?

More generally, I am trying to figure out how to get a standard (social science) regression going, with testing, prediction, marginal effects, and robust and clustered standard errors, so pointers in that direction are very welcome.

Thanks a lot!
Edwin

--
You received this message because you are subscribed to the Google Groups "julia-stats" group. To unsubscribe from this group and stop receiving emails from it, send an email to [hidden email]. For more options, visit https://groups.google.com/d/optout.

testing linear hypothesis on coefficients and glm

 I think your best bet would be to PyCall into Python's statsmodels.
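For readers who take that route: in statsmodels the requested test is essentially a one-liner via `wald_test` with a restriction string. A minimal sketch with invented data (the data and coefficient values are made up for illustration; the restriction-string API is statsmodels'):

```python
# Sketch: Wald/F test of H0: x1 = x2 after an OLS fit in statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
# True model has equal coefficients on x1 and x2, so H0 holds by construction.
df["y"] = 1.0 + 2.0 * df["x1"] + 2.0 * df["x2"] + rng.normal(size=200)

res = smf.ols("y ~ x1 + x2", data=df).fit()

# The restriction is written as a string over the fitted parameter names.
wt = res.wald_test("x1 = x2", use_f=True)
print(wt.fvalue, wt.pvalue)
```

The same `res.wald_test(...)` call accepts a comma-separated string of several restrictions for a joint test.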

Re: testing linear hypothesis on coefficients and glm

 On Tuesday, April 12, 2016 at 11:08:09 PM UTC+2, Ariel Katz wrote:
> I think your best bet would be to pycall into python statsmodels.

Thanks, but of course that's not what I was hoping for. My first naive go at what I had in mind:

```julia
function h0test(reg, h0)
    rvars = map(Symbol, vcat(:Intercept, reg.mf.terms.terms))
    R     = differentiate(h0, rvars)
    R     = map(Float64, hcat(R...))
    b     = coef(reg)
    V     = inv(R * vcov(reg) * R')
    w     = b' * R' * V * R * b
    df    = rank(V)
    df_r  = df_residual(reg)
    Fstat = w[1] / df
    pval  = (1 - cdf(FDist(df, df_r), Fstat))
    @printf "H0: %s = 0\n" h0
    @printf "  F(%d, %d) = %0.2f\n" df df_r Fstat
    @printf "  Prob > F = %0.4f\n" pval
end
```

It seems to work (sort of):

```julia
julia> reg1 = lm(growth ~ tradeshare+rgdp60+yearsschool+rev_coups+assasinations, df)
DataFrames.DataFrameRegressionModel{GLM.LinearModel{GLM.LmResp{Array{Float64,1}},GLM.DensePredQR{Float64}},Float64}

Formula: growth ~ 1 + tradeshare + rgdp60 + yearsschool + rev_coups + assasinations

Coefficients:
                   Estimate   Std.Error  t value Pr(>|t|)
(Intercept)        0.489761      0.6896  0.71021   0.4804
tradeshare           1.5617    0.757947  2.06043   0.0438
rgdp60         -0.000469346 0.000148212 -3.16673   0.0024
yearsschool        0.574846    0.139338  4.12555   0.0001
rev_coups           -2.1575     1.11029 -1.94319   0.0568
assasinations      0.354078    0.477394 0.741689   0.4612

julia> h0test(reg1, "tradeshare+yearsschool")
H0: tradeshare+yearsschool = 0
  F(1, 59) = 8.23
  Prob > F = 0.0057

julia> h0test(reg1, "+tradeshare")
H0: +tradeshare = 0
  F(1, 59) = 4.25
  Prob > F = 0.0438
```
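For anyone following the algebra inside `h0test`: with restriction matrix R, coefficient vector b, and covariance V, the statistic is w = (Rb)'(R V R')⁻¹(Rb), F = w/q with q = rank(R). A standalone numpy sketch with hypothetical numbers (not the GLM.jl objects above):

```python
# The Wald/F algebra behind h0test, written out with plain arrays: H0: R b = 0.
import numpy as np
from scipy import stats

b = np.array([1.5, 0.6])           # hypothetical coefficient estimates
V = np.array([[0.25, 0.00],        # hypothetical coefficient covariance matrix
              [0.00, 0.04]])
R = np.array([[1.0, 1.0]])         # one restriction: b1 + b2 = 0
df_r = 59                          # residual degrees of freedom

Rb = R @ b
w = Rb @ np.linalg.solve(R @ V @ R.T, Rb)  # (Rb)' (R V R')^{-1} (Rb)
q = np.linalg.matrix_rank(R)
F = w / q
p = stats.f.sf(F, q, df_r)                 # Prob > F
```

Here `solve` replaces the explicit `inv(...)` of the Julia version; the result is identical but numerically better behaved.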

Re: testing linear hypothesis on coefficients and glm

 If you need robust/clustered variance-covariance matrices, look at CovarianceMatrices.jl.
 In reply to this post by Edwin Leuven

For closure's sake, the below is what I have at the moment (it only handles single-equation restrictions, but can be easily extended). I am a bit surprised that this functionality is not already available. Comments welcome of course; I'm new to Julia.

```julia
using Distributions
using Calculus

"Perform F-test for a linear null-hypothesis h0 using regression reg"
function lhtest(reg, h0)
    lhtest(reg, h0, vcov(reg))
end

"Perform F-test for a linear null-hypothesis h0 using regression reg and cov matrix vcov"
function lhtest(reg, h0, vcov::Array{Float64,2})
    rvars = map(Symbol, vcat(:Intercept, reg.mf.terms.terms))
    h     = parse(h0)
    h     = Expr(:call, :-, unblock(h.args[1]), unblock(h.args[2]))
    c     = get_const(h, rvars)
    R     = differentiate(deparse(h), rvars)
    R     = map(Float64, hcat(R...))
    b     = coef(reg)
    V     = inv(R * vcov * R')
    w     = (R * b + c)' * V * (R * b + c)
    df    = rank(V)
    df_r  = df_residual(reg)
    Fstat = w[1] / df
    pval  = (1 - cdf(FDist(df, df_r), Fstat))
    @printf "\nH0: %s = 0\n\n" h
    @printf "  F(%d, %d) = %0.2f\n" df df_r Fstat
    @printf "  Prob > F = %0.4f\n" pval
end

"Extract numerical constant from expression when symbols are set to 0"
function get_const(x::Expr, symbols)
    ex = copy(x)
    ex.args = map(z->get_const(z, symbols), ex.args)
    return eval(ex)
end

function get_const(x::Symbol, symbols)
    (x in symbols) && return 0
    (x in [:+; :-; :*; 0; symbols]) || error(@sprintf "%s is unknown\n" string(x))
    return x
end

get_const(x, symbols) = x

# the following from MacroTools.jl
isline(ex) = isexpr(ex, :line) || isa(ex, LineNumberNode)
isexpr(x::Expr, ts...) = x.head in ts
isexpr(x, ts...) = any(T->isa(T, Type) && isa(x, T), ts)
rmlines(x::Expr) = Expr(x.head, rmlines(x.args)...)
rmlines(xs) = filter(x->!isline(x), xs)

function unblock(ex)
    isexpr(ex, :block) || return ex
    exs = rmlines(ex).args
    length(exs) == 1 || return ex
    return unblock(exs[1])
end
```
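On the "can be easily extended" point: joint tests of several restrictions just mean stacking one row of R (and one entry of the constant vector) per restriction; the F statistic's numerator degrees of freedom become rank(R). A numpy sketch of that extension, using the conventional form H0: Rb = r rather than the Rb + c = 0 form of the Julia code above (all numbers hypothetical):

```python
# Sketch: joint F test of several linear restrictions, H0: R b = r.
import numpy as np
from scipy import stats

def lhtest(b, V, R, r, df_resid):
    """F test of R b = r; each row of R (with its entry of r) is one restriction."""
    b, r = np.asarray(b, float), np.asarray(r, float)
    R = np.atleast_2d(np.asarray(R, float))
    d = R @ b - r                              # discrepancy under H0
    w = d @ np.linalg.solve(R @ V @ R.T, d)    # Wald quadratic form
    q = np.linalg.matrix_rank(R)
    F = w / q
    return F, stats.f.sf(F, q, df_resid)

# Two restrictions tested jointly: b1 = b2 and b3 = 1.
R = [[0, 1, -1, 0],
     [0, 0,  0, 1]]
F, p = lhtest([0.0, 2.0, 2.0, 1.0], np.eye(4), R, [0.0, 1.0], 50)
```

With the coefficients chosen to satisfy both restrictions exactly, the discrepancy is zero, so F = 0 and the p-value is 1.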