Trouble with GLM in new Julia install


Trouble with GLM in new Julia install

Pedro L Vera
Hello,

I'm back trying Julia, and I'm not able to run a simple linear regression. I've installed 0.4.3 and 0.4.5 on a Debian Linux laptop.
 
I'm just re-familiarizing myself with the language and I could run linear regressions fine just a few months ago.

Now, I can't even follow the instructions in the GLM package. Any call of the form:

lm1 = fit(LinearModel, y ~ x, data)

goes into a seemingly endless loop that just eats up a lot of CPU and time.

I assume I must be doing something really stupid, but I can't figure it out.

All the following modules are installed:

using DataFrames
using GLM
using Distributions
using NumericExtensions
using Gadfly
using Cairo

Any ideas as to what's going on?

Thanks.

Pedro

--
You received this message because you are subscribed to the Google Groups "julia-stats" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [hidden email].
For more options, visit https://groups.google.com/d/optout.

Re: Trouble with GLM in new Julia install

Andreas Noack
Can you provide a specific example of a dataset that shows this behavior? Please also give the output from versioninfo() and Pkg.status().
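As an aside, both requested diagnostics can be collected in one shot from the command line; a sketch, assuming a Julia 0.4-era install where `versioninfo()` and `Pkg.status()` live in Base (the output file name is a placeholder):

```shell
# One-shot diagnostics collection (sketch). Julia may not be on PATH in
# every environment, so the actual invocation is left as a comment:
#   julia -e 'versioninfo(); Pkg.status()' > diagnostics.txt 2>&1
# Print the command so it can be copy-pasted:
echo "julia -e 'versioninfo(); Pkg.status()'"
```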

On Fri, Apr 15, 2016 at 9:16 PM, Pedro L Vera <[hidden email]> wrote:
[...]


Re: Trouble with GLM in new Julia install

Pedro L Vera
Sure.

julia> versioninfo()
Julia Version 0.4.3
Commit a2f713d (2016-01-12 21:37 UTC)
Platform Info:
   System: Linux (x86_64-linux-gnu)
   CPU: AMD A10-8700P Radeon R6, 10 Compute Cores 4C+6G
   WORD_SIZE: 64
   BLAS: libopenblas (NO_LAPACKE DYNAMIC_ARCH NO_AFFINITY Excavator)
   LAPACK: libopenblas
   LIBM: libopenlibm
   LLVM: libLLVM-3.7

julia> Pkg.status()
14 required packages:
  - Atom                          0.4.1
  - Cairo                         0.2.31
  - DataFrames                    0.7.0
  - Distributions                 0.8.10
  - GLM                           0.5.0
  - Gadfly                        0.4.2
  - Gaston                        0.0.0
  - IJulia                        1.1.9
  - MbedTLS                       0.2.1
  - NumericExtensions             0.6.2
  - Plotly                        0.0.3+             master
  - PyPlot                        2.1.1
  - RDatasets                     0.1.3
  - Winston                       0.11.13
64 additional packages:
  - ArrayViews                    0.6.4
  - BinDeps                       0.3.21
  - Blink                         0.3.4
  - BufferedStreams               0.0.3
  - Calculus                      0.1.14
  - CodeTools                     0.3.0
  - Codecs                        0.1.5
  - ColorTypes                    0.2.2
  - Colors                        0.6.3
  - Compat                        0.7.13
  - Compose                       0.4.2
  - Conda                         0.1.9
  - Contour                       0.1.0
  - DataArrays                    0.2.20
  - DataStructures                0.4.3
  - Dates                         0.4.4
  - Distances                     0.3.0
  - Docile                        0.5.23
  - DualNumbers                   0.2.2
  - FactCheck                     0.4.2
  - FixedPointNumbers             0.1.2
  - FixedSizeArrays               0.1.0
  - GZip                          0.2.18
  - Graphics                      0.1.3
  - Grid                          0.4.0
  - Hexagons                      0.0.4
  - Hiccup                        0.0.2
  - HttpCommon                    0.2.4
  - HttpParser                    0.1.1
  - HttpServer                    0.1.5
  - IniFile                       0.2.5
  - Iterators                     0.1.9
  - JSON                          0.5.0
  - JuliaParser                   0.6.4
  - KernelDensity                 0.1.2
  - LNR                           0.0.2
  - LaTeXStrings                  0.1.6
  - Lazy                          0.10.1
  - Libz                          0.0.2
  - Loess                         0.0.6
  - MacroTools                    0.3.0
  - Measures                      0.0.2
  - Media                         0.1.2
  - Mustache                      0.0.14
  - Mux                           0.2.0
  - NaNMath                       0.2.1
  - Nettle                        0.2.2
  - NumericFuns                   0.2.4
  - Optim                         0.4.4
  - PDMats                        0.4.1
  - PyCall                        1.4.0
  - Reexport                      0.0.3
  - Requests                      0.3.6
  - Requires                      0.2.2
  - SHA                           0.1.2
  - Showoff                       0.0.6
  - SortingAlgorithms             0.0.6
  - StatsBase                     0.8.0
  - StatsFuns                     0.2.0
  - Tk                            0.3.7
  - URIParser                     0.1.3
  - WebSockets                    0.1.2
  - WoodburyMatrices              0.1.5
  - ZMQ                           0.3.1

Example, straight from the GLM docs:

using GLM, RDatasets

form = dataset("datasets", "Formaldehyde")

# code above works fine

lm1 = fit(LinearModell, Optden ~ Carb, form)

# just hangs and eats up CPU

Thanks.

Pedro


Re: Trouble with GLM in new Julia install

Milan Bouchet-Valat
Le vendredi 15 avril 2016 à 22:09 -0400, Pedro L Vera a écrit :

> [...]
> Example. Straight from the GLM doc:
>
> using GLM, RDatasets
>
> form=dataset("datasets","Formaldehyde")
>
> #code above works fine
>
> lm1 = fit(LinearModell, Optden ~ Carb, form)
Just to be sure: LinearModell is a typo, right? It should
be LinearModel.

Assuming this is the case, could you run Julia inside gdb (i.e.
'gdb julia'), try to fit the model, and hit Ctrl+C while it hangs?
Then run 't a a bt' (thread apply all bt) in gdb and paste the output
here (or in a gist).
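For reference, the same all-threads backtrace can also be captured non-interactively with a gdb batch script; a sketch, assuming gdb is installed, where `<julia-pid>` is a placeholder for the hung process ID:

```shell
# Write a gdb batch script that dumps backtraces for every thread
# ('t a a bt' is gdb shorthand for 'thread apply all bt'):
cat > /tmp/bt.gdb <<'EOF'
thread apply all bt
detach
quit
EOF
# Attach to the hung process and run the script (placeholder PID):
#   gdb -p <julia-pid> -batch -x /tmp/bt.gdb > backtrace.txt 2>&1
cat /tmp/bt.gdb
```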


Regards

> #just hangs and eats up CPU
>
> Thanks.
>
> Pedro
>


Re: Trouble with GLM in new Julia install

Benjamin Deonovic
In reply to this post by Pedro L Vera
Did you use a binary or install from source? Perhaps your install didn't go well? 

On Friday, April 15, 2016 at 8:16:31 PM UTC-5, Pedro L Vera wrote:
[...]


Re: Trouble with GLM in new Julia install

Pedro L Vera
Thanks for the suggestions.


The current install is from a Debian package. I reinstalled it, just to make
sure, and the problem persists.

As for the other comment: yes, it was a typo in the email, not in the code.

Running Julia in gdb, it hangs as before; after I stopped it, I got the
following message:

(gdb) n
Single stepping until exit from function dtrmm_kernel_LN_EXCAVATOR,
which has no line number information.
0x00007ffdeae3d052 in dtrmm_LTLU ()

No idea what it means.

Thanks.

Pedro


Re: Trouble with GLM in new Julia install

Milan Bouchet-Valat
Le samedi 16 avril 2016 à 10:27 -0400, Pedro L Vera a écrit :

> [...]
>
> Running Julia in gdb, it hangs as before; after I stopped it, I got the
> following message:
>
> (gdb) n
> Single stepping until exit from function dtrmm_kernel_LN_EXCAVATOR,
> which has no line number information.
> 0x00007ffdeae3d052 in dtrmm_LTLU ()
>
> No idea what it means.
Have you tried running 't a a bt'?

Anyway, this looks like an issue with OpenBLAS (as I suspected). It
would be worth trying the generic Linux binaries for Julia (both 0.4
and the 0.5 nightlies).


Regards
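One common way to probe an OpenBLAS-threading hypothesis like this is to pin BLAS to a single thread before launching Julia; a sketch (the `OPENBLAS_NUM_THREADS` variable is standard OpenBLAS, but that this particular hang is a threading bug on the Excavator kernel is an assumption):

```shell
# If the hang is an OpenBLAS threading bug on this CPU, forcing
# single-threaded BLAS before starting Julia may sidestep it:
export OPENBLAS_NUM_THREADS=1
echo "OPENBLAS_NUM_THREADS=$OPENBLAS_NUM_THREADS"
# then launch Julia as usual, e.g.:  julia
```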


Re: Trouble with GLM in new Julia install

Pedro L Vera
In reply to this post by Milan Bouchet-Valat
Sorry, I forgot that part of your message. Here's the output.

(gdb) t a a bt

Thread 4 (Thread 0x7ffde76fe700 (LWP 11085)):
#0  0x00007ffff73df04f in pthread_cond_wait@@GLIBC_2.3.2 () from
/lib/x86_64-linux-gnu/libpthread.so.0
#1  0x00007ffdeaf7458b in ?? () from
/usr/bin/../lib/x86_64-linux-gnu/julia/libopenblas.so
#2  0x00007ffff73d9454 in start_thread () from
/lib/x86_64-linux-gnu/libpthread.so.0
#3  0x00007ffff40b2ecd in clone () from /lib/x86_64-linux-gnu/libc.so.6

Thread 3 (Thread 0x7ffde9eff700 (LWP 11084)):
#0  0x00007ffff73df04f in pthread_cond_wait@@GLIBC_2.3.2 () from
/lib/x86_64-linux-gnu/libpthread.so.0
#1  0x00007ffdeaf7458b in ?? () from
/usr/bin/../lib/x86_64-linux-gnu/julia/libopenblas.so
#2  0x00007ffff73d9454 in start_thread () from
/lib/x86_64-linux-gnu/libpthread.so.0
#3  0x00007ffff40b2ecd in clone () from /lib/x86_64-linux-gnu/libc.so.6

Thread 2 (Thread 0x7ffdea700700 (LWP 11083)):
#0  0x00007ffff73df04f in pthread_cond_wait@@GLIBC_2.3.2 () from
/lib/x86_64-linux-gnu/libpthread.so.0
#1  0x00007ffdeaf7458b in ?? () from
/usr/bin/../lib/x86_64-linux-gnu/julia/libopenblas.so
#2  0x00007ffff73d9454 in start_thread () from
/lib/x86_64-linux-gnu/libpthread.so.0
#3  0x00007ffff40b2ecd in clone () from /lib/x86_64-linux-gnu/libc.so.6

Thread 1 (Thread 0x7ffff7fcd740 (LWP 11074)):
#0  julia_getindex_557 () at array.jl:173
#1  0x00007ffff7a2a83b in jl_apply (nargs=5, args=0x7fffffff0240,
f=<optimized out>) at julia.h:1325
#2  jl_apply_generic (F=0x7ffdee6d1d70, args=0x7fffffff0240,
nargs=<optimized out>) at gf.c:1684
#3  0x00007ffff7a30da4 in jl_apply (nargs=<optimized out>,
args=0x7fffffff0240, f=0x7ffdee6d1d70)
     at julia.h:1325
#4  jl_f_apply (F=<optimized out>, args=<optimized out>, nargs=<optimized
out>) at builtins.c:497
#5  0x00007fffef2eb827 in julia_inlining_pass_446 (e=<optimized out>,
sv=<optimized out>,
     ast=<optimized out>) at inference.jl:2768
#6  0x00007fffef2eaf40 in julia_inlining_pass_446 (e=<optimized out>,
sv=<optimized out>,
     ast=<optimized out>) at inference.jl:2721
#7  0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff0c18,
f=<optimized out>) at julia.h:1325
#8  jl_apply_generic (F=0x7ffdee8a3f10, args=0x7fffffff0c18,
nargs=<optimized out>) at gf.c:1684
#9  0x00007fffef2eabfa in julia_inlining_pass_446 (e=<optimized out>,
sv=<optimized out>,
     ast=<optimized out>) at inference.jl:2679
#10 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff1168,
f=<optimized out>) at julia.h:1325
#11 jl_apply_generic (F=0x7ffdee8a3f10, args=0x7fffffff1168,
nargs=<optimized out>) at gf.c:1684
#12 0x00007fffef2dae6d in julia_typeinf_uncached_35 (cop=0x7ffdf271f100,
optimize=<optimized out>)
     at inference.jl:1701
#13 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#14 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffff13c0,
f=<optimized out>) at julia.h:1325
#15 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffff13c0,
nargs=<optimized out>) at gf.c:1684
#16 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xed83c01000000001,
needtree=<optimized out>)
     at inference.jl:1339
#17 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#18 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff14e0,
f=<optimized out>) at julia.h:1325
#19 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff14e0,
nargs=<optimized out>) at gf.c:1684
#20 0x00007fffef2e737d in julia_typeinf_392 (
     linfo=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     atypes=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     sparams=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     def=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>) at inference.jl:1289
#21 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff1958,
f=<optimized out>) at julia.h:1325
#22 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff1958,
nargs=<optimized out>) at gf.c:1684
#23 0x00007fffef2e5111 in julia_abstract_call_gf_329 (f=<optimized out>,
fargs=<optimized out>,
     argtype=<optimized out>, e=<optimized out>) at inference.jl:737
#24 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff1b28,
f=<optimized out>) at julia.h:1325
#25 jl_apply_generic (F=0x7ffdee8d9350, args=0x7fffffff1b28,
nargs=<optimized out>) at gf.c:1684
#26 0x00007fffef2e24e2 in julia_abstract_call_310 (f=<optimized out>,
fargs=<optimized out>,
     argtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:837
#27 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff1d28,
f=<optimized out>) at julia.h:1325
#28 jl_apply_generic (F=0x7ffdee8fc5d0, args=0x7fffffff1d28,
nargs=<optimized out>) at gf.c:1684
#29 0x00007fffef2e1030 in julia_abstract_eval_call_270 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:934
#30 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#31 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff1f60,
f=<optimized out>) at julia.h:1325
#32 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff1f60,
nargs=<optimized out>) at gf.c:1684
#33 0x00007fffef2df429 in julia_abstract_interpret_238 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:1110
#34 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff24a8,
f=<optimized out>) at julia.h:1325
#35 jl_apply_generic (F=0x7ffdee701890, args=0x7fffffff24a8,
nargs=<optimized out>) at gf.c:1684
#36 0x00007fffef2d9410 in julia_typeinf_uncached_35 (cop=0x26ab540,
optimize=<optimized out>)
     at inference.jl:1549
#37 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#38 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffff2700,
f=<optimized out>) at julia.h:1325
#39 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffff2700,
nargs=<optimized out>) at gf.c:1684
#40 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xed83c01000000001,
needtree=<optimized out>)
     at inference.jl:1339
#41 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#42 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff2820,
f=<optimized out>) at julia.h:1325
#43 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff2820,
nargs=<optimized out>) at gf.c:1684
#44 0x00007fffef2e737d in julia_typeinf_392 (
     linfo=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     atypes=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     sparams=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     def=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>) at inference.jl:1289
#45 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff2c88,
f=<optimized out>) at julia.h:1325
#46 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff2c88,
nargs=<optimized out>) at gf.c:1684
#47 0x00007fffef30193d in julia_abstract_call_gf_742 () at
inference.jl:737
#48 0x00007fffef3020cd in jlcall_abstract_call_gf_742 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#49 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff2e28,
f=<optimized out>) at julia.h:1325
#50 jl_apply_generic (F=0x7ffdee8d9350, args=0x7fffffff2e28,
nargs=<optimized out>) at gf.c:1684
#51 0x00007fffef2ff1a4 in julia_abstract_call_727 () at inference.jl:862
#52 0x00007fffef2fe315 in julia_abstract_apply_697 (af=<optimized out>,
fargs=<optimized out>,
     aargtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:811
#53 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff30d8,
f=<optimized out>) at julia.h:1325
#54 jl_apply_generic (F=0x7ffded89e730, args=0x7fffffff30d8,
nargs=<optimized out>) at gf.c:1684
#55 0x00007fffef2e26ec in julia_abstract_call_310 (f=<optimized out>,
fargs=<optimized out>,
     argtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:824
#56 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff32d8,
f=<optimized out>) at julia.h:1325
#57 jl_apply_generic (F=0x7ffdee8fc5d0, args=0x7fffffff32d8,
nargs=<optimized out>) at gf.c:1684
#58 0x00007fffef2e1030 in julia_abstract_eval_call_270 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:934
#59 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#60 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff3918,
f=<optimized out>) at julia.h:1325
#61 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff3918,
nargs=<optimized out>) at gf.c:1684
#62 0x00007fffef2da2e7 in julia_typeinf_uncached_35 (cop=0x7ffdf26d9cd0,
optimize=<optimized out>)
     at inference.jl:1622
#63 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#64 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffff3b70,
f=<optimized out>) at julia.h:1325
#65 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffff3b70,
nargs=<optimized out>) at gf.c:1684
#66 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xf1d7f17000000001,
needtree=<optimized out>)
     at inference.jl:1339
#67 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#68 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff3c90,
f=<optimized out>) at julia.h:1325
#69 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff3c90,
nargs=<optimized out>) at gf.c:1684
#70 0x00007fffef2e737d in julia_typeinf_392 (
     linfo=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     atypes=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     sparams=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     def=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>) at inference.jl:1289
#71 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff4108,
f=<optimized out>) at julia.h:1325
#72 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff4108,
nargs=<optimized out>) at gf.c:1684
#73 0x00007fffef2e5111 in julia_abstract_call_gf_329 (f=<optimized out>,
fargs=<optimized out>,
     argtype=<optimized out>, e=<optimized out>) at inference.jl:737
#74 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff42d8,
f=<optimized out>) at julia.h:1325
#75 jl_apply_generic (F=0x7ffdee8d9350, args=0x7fffffff42d8,
nargs=<optimized out>) at gf.c:1684
#76 0x00007fffef2e24e2 in julia_abstract_call_310 (f=<optimized out>,
fargs=<optimized out>,
     argtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:837
#77 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff44d8,
f=<optimized out>) at julia.h:1325
#78 jl_apply_generic (F=0x7ffdee8fc5d0, args=0x7fffffff44d8,
nargs=<optimized out>) at gf.c:1684
#79 0x00007fffef2e1030 in julia_abstract_eval_call_270 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:934
#80 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#81 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff4b18,
f=<optimized out>) at julia.h:1325
#82 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff4b18,
nargs=<optimized out>) at gf.c:1684
#83 0x00007fffef2da2e7 in julia_typeinf_uncached_35 (cop=0x7ffdf26d9460,
optimize=<optimized out>)
     at inference.jl:1622
#84 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#85 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffff4d70,
f=<optimized out>) at julia.h:1325
#86 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffff4d70,
nargs=<optimized out>) at gf.c:1684
#87 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xed83c01000000001,
needtree=<optimized out>)
     at inference.jl:1339
#88 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#89 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff4e90,
f=<optimized out>) at julia.h:1325
#90 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff4e90,
nargs=<optimized out>) at gf.c:1684
#91 0x00007fffef2e737d in julia_typeinf_392 (
     linfo=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     atypes=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     sparams=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     def=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>) at inference.jl:1289
#92 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff5308,
f=<optimized out>) at julia.h:1325
#93 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff5308,
nargs=<optimized out>) at gf.c:1684
#94 0x00007fffef2e5111 in julia_abstract_call_gf_329 (f=<optimized out>,
fargs=<optimized out>,
     argtype=<optimized out>, e=<optimized out>) at inference.jl:737
#95 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff54d8,
f=<optimized out>) at julia.h:1325
#96 jl_apply_generic (F=0x7ffdee8d9350, args=0x7fffffff54d8,
nargs=<optimized out>) at gf.c:1684
#97 0x00007fffef2e24e2 in julia_abstract_call_310 (f=<optimized out>,
fargs=<optimized out>,
     argtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:837
#98 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff56d8,
f=<optimized out>) at julia.h:1325
#99 jl_apply_generic (F=0x7ffdee8fc5d0, args=0x7fffffff56d8,
nargs=<optimized out>) at gf.c:1684
#100 0x00007fffef2e1030 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:934
#101 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#102 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff5988,
f=<optimized out>) at julia.h:1325
#103 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff5988,
nargs=<optimized out>) at gf.c:1684
#104 0x00007fffef2e0bb1 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:906
#105 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#106 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff5bc0,
f=<optimized out>) at julia.h:1325
#107 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff5bc0,
nargs=<optimized out>) at gf.c:1684
#108 0x00007fffef2df429 in julia_abstract_interpret_238 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:1110
#109 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff6108,
f=<optimized out>) at julia.h:1325
#110 jl_apply_generic (F=0x7ffdee701890, args=0x7fffffff6108,
nargs=<optimized out>) at gf.c:1684
#111 0x00007fffef2d9410 in julia_typeinf_uncached_35 (cop=0x7ffdeff8a400,
optimize=<optimized out>)
     at inference.jl:1549
#112 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#113 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffff6360,
f=<optimized out>) at julia.h:1325
#114 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffff6360,
nargs=<optimized out>) at gf.c:1684
#115 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xf188ee3000000001,
needtree=<optimized out>)
     at inference.jl:1339
#116 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#117 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff6480,
f=<optimized out>) at julia.h:1325
#118 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff6480,
nargs=<optimized out>) at gf.c:1684
#119 0x00007fffef2e737d in julia_typeinf_392 (
     linfo=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     atypes=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     sparams=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     def=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>) at inference.jl:1289
#120 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff68f8,
f=<optimized out>) at julia.h:1325
#121 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff68f8,
nargs=<optimized out>) at gf.c:1684
#122 0x00007fffef2e5111 in julia_abstract_call_gf_329 (f=<optimized out>,
fargs=<optimized out>,
     argtype=<optimized out>, e=<optimized out>) at inference.jl:737
#123 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff6ac8,
f=<optimized out>) at julia.h:1325
#124 jl_apply_generic (F=0x7ffdee8d9350, args=0x7fffffff6ac8,
nargs=<optimized out>) at gf.c:1684
#125 0x00007fffef2e24e2 in julia_abstract_call_310 (f=<optimized out>,
fargs=<optimized out>,
     argtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:837
#126 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff6cc8,
f=<optimized out>) at julia.h:1325
#127 jl_apply_generic (F=0x7ffdee8fc5d0, args=0x7fffffff6cc8,
nargs=<optimized out>) at gf.c:1684
#128 0x00007fffef2e1030 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:934
#129 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#130 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff6f78,
f=<optimized out>) at julia.h:1325
#131 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff6f78,
nargs=<optimized out>) at gf.c:1684
#132 0x00007fffef2e0bb1 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:906
#133 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#134 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff7228,
f=<optimized out>) at julia.h:1325
#135 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff7228,
nargs=<optimized out>) at gf.c:1684
#136 0x00007fffef2e0bb1 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:906
#137 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#138 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff7460,
f=<optimized out>) at julia.h:1325
#139 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff7460,
nargs=<optimized out>) at gf.c:1684
#140 0x00007fffef2df533 in julia_abstract_interpret_238 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:1120
#141 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff79a8,
f=<optimized out>) at julia.h:1325
#142 jl_apply_generic (F=0x7ffdee701890, args=0x7fffffff79a8,
nargs=<optimized out>) at gf.c:1684
#143 0x00007fffef2d9410 in julia_typeinf_uncached_35 (cop=0x7ffdf2128910,
optimize=<optimized out>)
     at inference.jl:1549
#144 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#145 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffff7c00,
f=<optimized out>) at julia.h:1325
#146 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffff7c00,
nargs=<optimized out>) at gf.c:1684
#147 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xf1460a9000000001,
needtree=<optimized out>)
     at inference.jl:1339
#148 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#149 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff7d20,
f=<optimized out>) at julia.h:1325
#150 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff7d20,
nargs=<optimized out>) at gf.c:1684
#151 0x00007fffef2e737d in julia_typeinf_392 (
     linfo=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     atypes=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     sparams=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     def=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>) at inference.jl:1289
#152 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff8198,
f=<optimized out>) at julia.h:1325
#153 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff8198,
nargs=<optimized out>) at gf.c:1684
#154 0x00007fffef2e5111 in julia_abstract_call_gf_329 (f=<optimized out>,
fargs=<optimized out>,
     argtype=<optimized out>, e=<optimized out>) at inference.jl:737
#155 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff8368,
f=<optimized out>) at julia.h:1325
#156 jl_apply_generic (F=0x7ffdee8d9350, args=0x7fffffff8368,
nargs=<optimized out>) at gf.c:1684
#157 0x00007fffef2e24e2 in julia_abstract_call_310 (f=<optimized out>,
fargs=<optimized out>,
     argtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:837
#158 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff8568,
f=<optimized out>) at julia.h:1325
#159 jl_apply_generic (F=0x7ffdee8fc5d0, args=0x7fffffff8568,
nargs=<optimized out>) at gf.c:1684
#160 0x00007fffef2e1030 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:934
#161 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#162 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff8818,
f=<optimized out>) at julia.h:1325
#163 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff8818,
nargs=<optimized out>) at gf.c:1684
#164 0x00007fffef2e0bb1 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:906
#165 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#166 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff8a50,
f=<optimized out>) at julia.h:1325
#167 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffff8a50,
nargs=<optimized out>) at gf.c:1684
#168 0x00007fffef2df533 in julia_abstract_interpret_238 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:1120
#169 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffff8f98,
f=<optimized out>) at julia.h:1325
#170 jl_apply_generic (F=0x7ffdee701890, args=0x7fffffff8f98,
nargs=<optimized out>) at gf.c:1684
#171 0x00007fffef2d9410 in julia_typeinf_uncached_35 (cop=0x7ffdf1596800,
optimize=<optimized out>)
     at inference.jl:1549
#172 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#173 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffff91f0,
f=<optimized out>) at julia.h:1325
#174 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffff91f0,
nargs=<optimized out>) at gf.c:1684
#175 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xf16e53c000000001,
needtree=<optimized out>)
     at inference.jl:1339
#176 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#177 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff9310,
f=<optimized out>) at julia.h:1325
#178 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff9310,
nargs=<optimized out>) at gf.c:1684
#179 0x00007fffef2e737d in julia_typeinf_392 (
     linfo=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     atypes=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     sparams=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     def=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>) at inference.jl:1289
#180 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff9788,
f=<optimized out>) at julia.h:1325
#181 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffff9788,
nargs=<optimized out>) at gf.c:1684
#182 0x00007fffef2e5111 in julia_abstract_call_gf_329 (f=<optimized out>,
fargs=<optimized out>,
     argtype=<optimized out>, e=<optimized out>) at inference.jl:737
#183 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffff9958,
f=<optimized out>) at julia.h:1325
#184 jl_apply_generic (F=0x7ffdee8d9350, args=0x7fffffff9958,
nargs=<optimized out>) at gf.c:1684
#185 0x00007fffef2e24e2 in julia_abstract_call_310 (f=<optimized out>,
fargs=<optimized out>,
     argtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:837
#186 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffff9b58,
f=<optimized out>) at julia.h:1325
#187 jl_apply_generic (F=0x7ffdee8fc5d0, args=0x7fffffff9b58,
nargs=<optimized out>) at gf.c:1684
#188 0x00007fffef2e1030 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:934
#189 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#190 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffffa198,
f=<optimized out>) at julia.h:1325
#191 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffffa198,
nargs=<optimized out>) at gf.c:1684
#192 0x00007fffef2da2e7 in julia_typeinf_uncached_35 (cop=0x7ffdf1127a40,
optimize=<optimized out>)
     at inference.jl:1622
#193 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#194 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffffa3f0,
f=<optimized out>) at julia.h:1325
#195 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffffa3f0,
nargs=<optimized out>) at gf.c:1684
#196 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xed83c01000000001,
needtree=<optimized out>)
     at inference.jl:1339
#197 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#198 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffffa510,
f=<optimized out>) at julia.h:1325
#199 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffffa510,
nargs=<optimized out>) at gf.c:1684
#200 0x00007fffef2e737d in julia_typeinf_392 (
     linfo=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     atypes=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     sparams=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     def=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>) at inference.jl:1289
#201 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffffa978,
f=<optimized out>) at julia.h:1325
#202 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffffa978,
nargs=<optimized out>) at gf.c:1684
#203 0x00007fffef30193d in julia_abstract_call_gf_742 () at
inference.jl:737
#204 0x00007fffef3020cd in jlcall_abstract_call_gf_742 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#205 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffffab18,
f=<optimized out>) at julia.h:1325
#206 jl_apply_generic (F=0x7ffdee8d9350, args=0x7fffffffab18,
nargs=<optimized out>) at gf.c:1684
#207 0x00007fffef2ff1a4 in julia_abstract_call_727 () at inference.jl:862
#208 0x00007fffef2fe315 in julia_abstract_apply_697 (af=<optimized out>,
fargs=<optimized out>,
     aargtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:811
#209 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffffadc8,
f=<optimized out>) at julia.h:1325
#210 jl_apply_generic (F=0x7ffded89e730, args=0x7fffffffadc8,
nargs=<optimized out>) at gf.c:1684
#211 0x00007fffef2e26ec in julia_abstract_call_310 (f=<optimized out>,
fargs=<optimized out>,
     argtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:824
#212 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffffafc8,
f=<optimized out>) at julia.h:1325
#213 jl_apply_generic (F=0x7ffdee8fc5d0, args=0x7fffffffafc8,
nargs=<optimized out>) at gf.c:1684
#214 0x00007fffef2e1030 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:934
#215 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#216 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffffb200,
f=<optimized out>) at julia.h:1325
#217 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffffb200,
nargs=<optimized out>) at gf.c:1684
#218 0x00007fffef2df429 in julia_abstract_interpret_238 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:1110
#219 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffffb748,
f=<optimized out>) at julia.h:1325
#220 jl_apply_generic (F=0x7ffdee701890, args=0x7fffffffb748,
nargs=<optimized out>) at gf.c:1684
#221 0x00007fffef2d9410 in julia_typeinf_uncached_35 (cop=0x7ffdf21e9c30,
optimize=<optimized out>)
     at inference.jl:1549
#222 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#223 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffffb9a0,
f=<optimized out>) at julia.h:1325
#224 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffffb9a0,
nargs=<optimized out>) at gf.c:1684
#225 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xf011831000000001,
needtree=<optimized out>)
     at inference.jl:1339
#226 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#227 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffffbac0,
f=<optimized out>) at julia.h:1325
#228 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffffbac0,
nargs=<optimized out>) at gf.c:1684
#229 0x00007fffef2e737d in julia_typeinf_392 (
     linfo=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     atypes=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     sparams=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>,
     def=<error reading variable: DWARF-2 expression error: DW_OP_reg
operations must be used either alone or in conjunction with DW_OP_piece or
DW_OP_bit_piece.>) at inference.jl:1289
#230 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffffbf28,
f=<optimized out>) at julia.h:1325
#231 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffffbf28,
nargs=<optimized out>) at gf.c:1684
#232 0x00007fffef30193d in julia_abstract_call_gf_742 () at
inference.jl:737
#233 0x00007fffef3020cd in jlcall_abstract_call_gf_742 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#234 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffffc0c8,
f=<optimized out>) at julia.h:1325
#235 jl_apply_generic (F=0x7ffdee8d9350, args=0x7fffffffc0c8,
nargs=<optimized out>) at gf.c:1684
#236 0x00007fffef2ff1a4 in julia_abstract_call_727 () at inference.jl:862
#237 0x00007fffef2fe315 in julia_abstract_apply_697 (af=<optimized out>,
fargs=<optimized out>,
     aargtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:811
#238 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffffc378,
f=<optimized out>) at julia.h:1325
#239 jl_apply_generic (F=0x7ffded89e730, args=0x7fffffffc378,
nargs=<optimized out>) at gf.c:1684
#240 0x00007fffef2e26ec in julia_abstract_call_310 (f=<optimized out>,
fargs=<optimized out>,
     argtypes=<optimized out>, vtypes=<optimized out>, sv=<optimized out>,
e=<optimized out>)
     at inference.jl:824
#241 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffffc578,
f=<optimized out>) at julia.h:1325
#242 jl_apply_generic (F=0x7ffdee8fc5d0, args=0x7fffffffc578,
nargs=<optimized out>) at gf.c:1684
#243 0x00007fffef2e1030 in julia_abstract_eval_call_270 (e=<optimized
out>, vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:934
#244 0x00007fffef2dfbd4 in julia_abstract_eval_248 (e=<optimized out>,
vtypes=<optimized out>,
     sv=<optimized out>) at inference.jl:961
#245 0x00007ffff7a2a83b in jl_apply (nargs=3, args=0x7fffffffcbb8,
f=<optimized out>) at julia.h:1325
#246 jl_apply_generic (F=0x7ffdeda0d350, args=0x7fffffffcbb8,
nargs=<optimized out>) at gf.c:1684
#247 0x00007fffef2da2e7 in julia_typeinf_uncached_35 (cop=0x7ffdf112d390,
optimize=<optimized out>)
     at inference.jl:1622
#248 0x00007fffef2dbba9 in jlcall_typeinf_uncached_35 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#249 0x00007ffff7a2a83b in jl_apply (nargs=7, args=0x7fffffffce10,
f=<optimized out>) at julia.h:1325
#250 jl_apply_generic (F=0x7ffded8d93d0, args=0x7fffffffce10,
nargs=<optimized out>) at gf.c:1684
#251 0x00007fffef2d5a94 in julia_typeinf_4 (cop=0xed83c01000000001,
needtree=<optimized out>)
     at inference.jl:1339
#252 0x00007fffef2d61f3 in jlcall_typeinf_4 () from
/usr/lib/x86_64-linux-gnu/julia/sys.so
#253 0x00007ffff7a2a83b in jl_apply (nargs=6, args=0x7fffffffcf68,
f=<optimized out>) at julia.h:1325
#254 jl_apply_generic (F=0x7ffdee7d4a10, args=0x7fffffffcf68,
nargs=<optimized out>) at gf.c:1684
#255 0x00007fffef2d5597 in julia_typeinf_ext_0 (linfo=<optimized out>,
atypes=<optimized out>,
     sparams=<optimized out>, def=<optimized out>) at inference.jl:1283
#256 0x00007ffff7a2a83b in jl_apply (nargs=4, args=0x7fffffffd050,
f=<optimized out>) at julia.h:1325
#257 jl_apply_generic (F=0x7ffdee804d50, args=0x7fffffffd050,
nargs=<optimized out>) at gf.c:1684
#258 0x00007ffff7a27a79 in jl_apply (nargs=4, args=0x7fffffffd050,
f=<optimized out>) at julia.h:1325
#259 jl_type_infer (li=0x7ffdf29b55f0,
argtypes=argtypes@entry=0x7ffdf0632540, def=<optimized out>)
     at gf.c:423
#260 0x00007ffff7a29d01 in cache_method (mt=mt@entry=0x7ffdf17f5050,
type=type@entry=0x7ffdf0632540,
     method=<optimized out>, decl=<optimized out>, sparams=<optimized out>,
isstaged=<optimized out>)
     at gf.c:852
#261 0x00007ffff7a2a365 in jl_mt_assoc_by_type
(mt=mt@entry=0x7ffdf17f5050, tt=0x7ffdf0632540,
     cache=cache@entry=1, inexact=inexact@entry=0) at gf.c:1057
#262 0x00007ffff7a2a8e2 in jl_apply_generic (F=0x7ffdf06a1d30,
args=0x7fffffffd338, nargs=<optimized out>)
     at gf.c:1692
#263 0x00007ffff7a7c163 in jl_apply (nargs=3, args=0x7fffffffd338,
f=0x7ffdf06a1d30) at julia.h:1325
#264 do_call (f=f@entry=0x7ffdf06a1d30, args=args@entry=0x7ffdf1113168,
nargs=nargs@entry=3,
     eval0=eval0@entry=0x0, locals=locals@entry=0x7fffffffdb90,
nl=nl@entry=0, ngensym=1) at interpreter.c:65
#265 0x00007ffff7a7b4ff in eval (e=0x7ffdf00c3f10,
locals=locals@entry=0x7fffffffdb90, nl=nl@entry=0,
     ngensym=ngensym@entry=1) at interpreter.c:213
#266 0x00007ffff7a7b234 in eval (e=e@entry=0x7ffdf00c3ef0,
locals=locals@entry=0x7fffffffdb90,
     nl=nl@entry=0, ngensym=ngensym@entry=1) at interpreter.c:219
#267 0x00007ffff7a7c26f in eval_body (stmts=stmts@entry=0x7ffdf11130d0,
locals=locals@entry=0x7fffffffdb90,
     nl=nl@entry=0, ngensym=ngensym@entry=1, start=start@entry=0,
toplevel=toplevel@entry=1)
     at interpreter.c:592
#268 0x00007ffff7a7c6b2 in jl_toplevel_eval_body (stmts=0x7ffdf11130d0) at
interpreter.c:525
#269 0x00007ffff7a8dba7 in jl_toplevel_eval_flex (e=<optimized out>,
fast=1) at toplevel.c:521
#270 0x00007ffff7a337d3 in jl_toplevel_eval_in (m=0x7ffded87c010,
ex=0x7ffdf00c39d0, delay_warn=0)
     at builtins.c:579
#271 0x00007ffded820480 in julia_eval_user_input_21222 () at REPL.jl:62
#272 0x00007ffded820564 in jlcall_eval_user_input_21222 ()
#273 0x00007ffff7a2a83b in jl_apply (nargs=2, args=0x7fffffffe120,
f=<optimized out>) at julia.h:1325
#274 jl_apply_generic (F=0x7ffdee30fad0, args=0x7fffffffe120,
nargs=<optimized out>) at gf.c:1684
#275 0x00007ffff7e8d0ff in julia_anonymous_21188 () at REPL.jl:92
#276 julia_anonymous_21188 () at task.jl:63
#277 0x00007ffff7a81d53 in jl_apply (nargs=0, args=0x0, f=<optimized out>)
at julia.h:1325
#278 start_task () at task.c:247
#279 0x0000000000000000 in ?? ()
(gdb)

Quite spammy. Sorry.


Thanks for the help.

--
You received this message because you are subscribed to the Google Groups "julia-stats" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [hidden email].
For more options, visit https://groups.google.com/d/optout.

Re: Trouble with GLlM in new Julia install

Pedro L Vera
In reply to this post by Milan Bouchet-Valat
I get the same error (and some new warnings) with the latest 0.5 build, as
below

   _       _ _(_)_     |  A fresh approach to technical computing
   (_)     | (_) (_)    |  Documentation: http://docs.julialang.org
    _ _   _| |_  __ _   |  Type "?help" for help.
   | | | | | | |/ _` |  |
   | | |_| | | | (_| |  |  Version 0.5.0-dev+3488 (2016-04-11 17:05 UTC)
  _/ |\__'_|_|_|\__'_|  |  Commit e1cf87f (4 days old master)
|__/                   |  x86_64-unknown-linux-gnu

julia> using GLM,RDatasets
WARNING: New definition
     write(GZip.GZipStream, Array{#T<:Any, N<:Any}) at
/home/pedro/.julia/v0.5/GZip/src/GZip.jl:456
is ambiguous with:
     write(Base.IO, Array{UInt8, N<:Any}) at io.jl:154.
To fix, define
     write(GZip.GZipStream, Array{UInt8, N<:Any})
before the new definition.

julia> form = dataset("datasets","Formaldehyde")
6x2 DataFrames.DataFrame
│ Row │ Carb │ OptDen │
┝━━━━━┿━━━━━━┿━━━━━━━━┥
│ 1   │ 0.1  │ 0.086  │
│ 2   │ 0.3  │ 0.269  │
│ 3   │ 0.5  │ 0.446  │
│ 4   │ 0.6  │ 0.538  │
│ 5   │ 0.7  │ 0.626  │
│ 6   │ 0.9  │ 0.782  │

julia> lm1 = fit(LinearModel, OptDen ~ Carb, form)
^CERROR: InterruptException:
  [inlined code] from ./pointer.jl:17
  in geqrt!(::Array{Float64,2}, ::Array{Float64,2}) at
./linalg/lapack.jl:434
  [inlined code] from ./abstractarray.jl:196
  in qrfact!(::Array{Float64,2}, ::Type{Val{false}}) at ./linalg/qr.jl:88
  [inlined code] from ./linalg/qr.jl:90
  in GLM.DensePredQR{Float64}(::Array{Float64,2}, ::Array{Float64,1}) at
/home/pedro/.julia/v0.5/GLM/src/linpred.jl:39
  [inlined code] from ./boot.jl:304
  in
fit(::Type{GLM.LinearModel{GLM.LmResp{Array{Float64,1}},GLM.DensePredQR{Float64}}},
::Array{Float64,2}, ::Array{Float64,1}) at
/home/pedro/.julia/v0.5/GLM/src/lm.jl:63
  in
fit(::Type{GLM.LinearModel{L<:GLM.LmResp{V<:DenseArray{T<:AbstractFloat,1}},T<:GLM.LinPred}},
::Array{Float64,2}, ::Array{Float64,1}) at
/home/pedro/.julia/v0.5/GLM/src/lm.jl:71
  [inlined code] from ./int.jl:32
  in #fit#55(::Array{Any,1}, ::Any,
::Type{GLM.LinearModel{L<:GLM.LmResp{V<:DenseArray{T<:AbstractFloat,1}},T<:GLM.LinPred}},
::DataFrames.Formula, ::DataFrames.DataFrame) at
/home/pedro/.julia/v0.5/DataFrames/src/statsmodels/statsmodel.jl:55
  in eval(::Module, ::Any) at ./boot.jl:237

Thanks.

Pedro
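The interrupt backtrace above stops inside qrfact!/geqrt!, i.e. the dense QR factorization that GLM's DensePredQR performs. As a sanity check on the computation itself, here is a minimal sketch of the same QR-based least-squares fit of the Formaldehyde data outside Julia, using numpy as a stand-in for the underlying LAPACK routines (this sketch is illustrative and not part of the original thread):

```python
import numpy as np

# Formaldehyde data, copied from the RDatasets table above.
carb = np.array([0.1, 0.3, 0.5, 0.6, 0.7, 0.9])
optden = np.array([0.086, 0.269, 0.446, 0.538, 0.626, 0.782])

# Design matrix with an intercept column, as the formula OptDen ~ Carb implies.
X = np.column_stack([np.ones_like(carb), carb])

# Thin QR factorization -- the step the backtrace shows hanging in geqrt!.
Q, R = np.linalg.qr(X)

# Solve R * beta = Q' * y for the coefficients [intercept, slope].
beta = np.linalg.solve(R, Q.T @ optden)
print(beta)
```

If this finishes instantly (as it should on any working BLAS/LAPACK), the data and the math are fine and the hang is in the Julia/LAPACK layer, not in the model.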


Re: Trouble with GLlM in new Julia install

Andreas Noack
Are you able to run all the Julia tests? I.e., something like: julia $TESTDIR/runtests.jl all

I'm not sure what TESTDIR is when you install from the repos but it's probably not that difficult to find.

I don't have access to an Excavator machine and I haven't been able to reproduce the error on the Piledriver system that I have access to.
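A hedged way to hunt for the test directory on a package-based install (the candidate paths below are guesses, not verified for any particular Debian version):

```shell
# Search a few plausible locations for Julia's runtests.jl.
# These paths are assumptions; adjust for your system.
found=""
for d in /usr/share/julia/test /usr/share/julia/base/test /usr/lib/julia/test; do
    if [ -f "$d/runtests.jl" ]; then
        found="$d/runtests.jl"
        echo "found: $found"
        # julia "$found" all   # uncomment to run the full suite
    fi
done
[ -n "$found" ] || echo "runtests.jl not found in the candidate paths"
```

Failing that, `dpkg -L julia` (on a Debian package install) lists every file the package shipped, including any test files.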


Re: Trouble with GLlM in new Julia install

Pedro L Vera
I've sort of given up on this. I tried compiling both the bleeding-edge
and stable sources and could not get either to build. I installed a couple
of dependencies that Julia needed and that I did not have (like m4), but it
still did not work after spending many hours.

Maybe it's just a GLM issue; other Julia features seem to work.

Maybe I'll come back to this after it settles down a bit.

Anyway, thanks for the suggestions.

Best,

Pedro

On Sat, Apr 16, 2016 at 10:13 PM, Andreas Noack
<[hidden email]> wrote:

> Are you able to run all the julia tests? I.e. something like julia
> $TESTDIR/runtests.jl all
>
> I'm not sure what TESTDIR is when you install from the repos but it's
> probably not that difficult to find.
>
> I don't have access to an Excavator machine and I haven't been able to
> reproduce the error on the Piledriver system that I have access to.
>
> On Sat, Apr 16, 2016 at 11:23 AM, Pedro L Vera <[hidden email]> wrote:
>>
>> I get the same error (and some new warnings) with the latest 0.5 build, as
>> below
>>
>>   _       _ _(_)_     |  A fresh approach to technical computing
>>   (_)     | (_) (_)    |  Documentation: http://docs.julialang.org
>>    _ _   _| |_  __ _   |  Type "?help" for help.
>>   | | | | | | |/ _` |  |
>>   | | |_| | | | (_| |  |  Version 0.5.0-dev+3488 (2016-04-11 17:05 UTC)
>>  _/ |\__'_|_|_|\__'_|  |  Commit e1cf87f (4 days old master)
>> |__/                   |  x86_64-unknown-linux-gnu
>>
>> julia> using GLM,RDatasets
>> WARNING: New definition
>>     write(GZip.GZipStream, Array{#T<:Any, N<:Any}) at
>> /home/pedro/.julia/v0.5/GZip/src/GZip.jl:456
>> is ambiguous with:
>>     write(Base.IO, Array{UInt8, N<:Any}) at io.jl:154.
>> To fix, define
>>     write(GZip.GZipStream, Array{UInt8, N<:Any})
>> before the new definition.
>>
>> julia> form = dataset("datasets","Formaldehyde")
>> 6x2 DataFrames.DataFrame
>> │ Row │ Carb │ OptDen │
>> ┝━━━━━┿━━━━━━┿━━━━━━━━┥
>> │ 1   │ 0.1  │ 0.086  │
>> │ 2   │ 0.3  │ 0.269  │
>> │ 3   │ 0.5  │ 0.446  │
>> │ 4   │ 0.6  │ 0.538  │
>> │ 5   │ 0.7  │ 0.626  │
>> │ 6   │ 0.9  │ 0.782  │
>>
>> julia> lm1 = fit(LinearModel, OptDen ~ Carb, form)
>> ^CERROR: InterruptException:
>>  [inlined code] from ./pointer.jl:17
>>  in geqrt!(::Array{Float64,2}, ::Array{Float64,2}) at
>> ./linalg/lapack.jl:434
>>  [inlined code] from ./abstractarray.jl:196
>>  in qrfact!(::Array{Float64,2}, ::Type{Val{false}}) at ./linalg/qr.jl:88
>>  [inlined code] from ./linalg/qr.jl:90
>>  in GLM.DensePredQR{Float64}(::Array{Float64,2}, ::Array{Float64,1}) at
>> /home/pedro/.julia/v0.5/GLM/src/linpred.jl:39
>>  [inlined code] from ./boot.jl:304
>>  in
>> fit(::Type{GLM.LinearModel{GLM.LmResp{Array{Float64,1}},GLM.DensePredQR{Float64}}},
>> ::Array{Float64,2}, ::Array{Float64,1}) at
>> /home/pedro/.julia/v0.5/GLM/src/lm.jl:63
>>  in
>> fit(::Type{GLM.LinearModel{L<:GLM.LmResp{V<:DenseArray{T<:AbstractFloat,1}},T<:GLM.LinPred}},
>> ::Array{Float64,2}, ::Array{Float64,1}) at
>> /home/pedro/.julia/v0.5/GLM/src/lm.jl:71
>>  [inlined code] from ./int.jl:32
>>  in #fit#55(::Array{Any,1}, ::Any,
>> ::Type{GLM.LinearModel{L<:GLM.LmResp{V<:DenseArray{T<:AbstractFloat,1}},T<:GLM.LinPred}},
>> ::DataFrames.Formula, ::DataFrames.DataFrame) at
>> /home/pedro/.julia/v0.5/DataFrames/src/statsmodels/statsmodel.jl:55
>>  in eval(::Module, ::Any) at ./boot.jl:237
>>
>> Thanks.
>>
>> Pedro
>>

Re: Trouble with GLlM in new Julia install

Andreas Noack
Nobody suggested to build from source.

It would be great if you could try the generic Linux binary from http://julialang.org/downloads/ and/or to run the tests as described in the last email. Thanks.

On Sun, Apr 17, 2016 at 7:21 PM, Pedro L Vera <[hidden email]> wrote:
I've sort of given up on this. I tried compiling both the bleeding-edge and the stable sources and could not. I installed a couple of dependencies that julia needed and that I did not have (like m4), and it still did not work after spending many hours.

Maybe it's just a GLM issue. Other Julia features seem to work.

Maybe I'll come back to this after it settles down a bit.

Anyway, thanks for the suggestions.

Best,

Pedro

On Sat, Apr 16, 2016 at 10:13 PM, Andreas Noack
<[hidden email]> wrote:
> Are you able to run all the julia tests? I.e. something like julia
> $TESTDIR/runtests.jl all
>
> I'm not sure what TESTDIR is when you install from the repos but it's
> probably not that difficult to find.
>
> I don't have access to an Excavator machine and I haven't been able to
> reproduce the error on the Piledriver system that I have access to.
>
> On Sat, Apr 16, 2016 at 11:23 AM, Pedro L Vera <[hidden email]> wrote:
>>
>> I get the same error (and some new warnings) with the latest 0.5 build, as
>> below
>>
>>   _       _ _(_)_     |  A fresh approach to technical computing
>>   (_)     | (_) (_)    |  Documentation: http://docs.julialang.org
>>    _ _   _| |_  __ _   |  Type "?help" for help.
>>   | | | | | | |/ _` |  |
>>   | | |_| | | | (_| |  |  Version 0.5.0-dev+3488 (2016-04-11 17:05 UTC)
>>  _/ |\__'_|_|_|\__'_|  |  Commit e1cf87f (4 days old master)
>> |__/                   |  x86_64-unknown-linux-gnu
>>
>> [...]
>>
>> Thanks.
>>
>> Pedro

Re: Trouble with GLlM in new Julia install

Pedro L Vera
I downloaded the binaries for 0.4.5 and still have the same problem with GLM.

I did run runtests.jl, but stopped it after about 20 minutes of no screen
output and 100% CPU usage. I didn't think it should take that long, but
maybe I'm wrong.

Thanks.

On Sun, Apr 17, 2016 at 9:23 PM, Andreas Noack
<[hidden email]> wrote:
> [...]

Re: Trouble with GLlM in new Julia install

Andreas Noack
Could you try to start julia with

OPENBLAS_CORETYPE=Piledriver julia

and see if the error is still there.

On Mon, Apr 18, 2016 at 9:58 AM, Pedro L Vera <[hidden email]> wrote:
[...]

Re: Trouble with GLlM in new Julia install

Pedro L Vera
That fixed it!

The regression now works:

julia> lm1 = fit(LinearModel, OptDen ~ Carb, form)
DataFrames.DataFrameRegressionModel{GLM.LinearModel{GLM.LmResp{Array{Float64,1}},GLM.DensePredQR{Float64}},Float64}

Formula: OptDen ~ 1 + Carb

Coefficients:
               Estimate  Std.Error  t value Pr(>|t|)
(Intercept)  0.00508571 0.00783368 0.649211   0.5516
Carb           0.876286  0.0135345  64.7444    <1e-6

Is there anything I should update or add so that a normal julia start works?

Thanks for the help.

Pedro

On Mon, Apr 18, 2016 at 10:11 AM, Andreas Noack
<[hidden email]> wrote:

> [...]

Re: Trouble with GLlM in new Julia install

Andreas Noack
The issue is probably in OpenBLAS' optimized kernels for triangular multiplication on Excavator. By running julia with OPENBLAS_CORETYPE=Piledriver you force OpenBLAS to use the kernels for the older Piledriver microarchitecture. These are probably a bit slower, but they don't have the bug that causes the freeze. I'll report this to the OpenBLAS developers and, hopefully, they will be able to fix it before our next release. In the meantime, you can add

export OPENBLAS_CORETYPE=Piledriver

to your .profile.
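The two steps (exporting the variable for the current session and persisting it in a startup file) can be sketched together as below; the ~/.profile path is the one suggested in the email, and whether your login shell actually reads that file is setup-dependent:

```shell
# Make the fix persistent: force OpenBLAS onto the Piledriver kernels for
# every future julia session started from a login shell.
# (~/.profile follows the suggestion above; some setups read ~/.bashrc instead.)
profile="$HOME/.profile"
grep -qs 'OPENBLAS_CORETYPE' "$profile" || \
    echo 'export OPENBLAS_CORETYPE=Piledriver' >> "$profile"

# For the current session, export it and confirm child processes see it:
export OPENBLAS_CORETYPE=Piledriver
sh -c 'echo "OPENBLAS_CORETYPE=$OPENBLAS_CORETYPE"'
```

Note that OpenBLAS reads OPENBLAS_CORETYPE when the library initializes, so the variable must be set in the environment before julia starts; exporting it from inside an already-running session has no effect on the loaded BLAS.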

On Mon, Apr 18, 2016 at 10:20 AM, Pedro L Vera <[hidden email]> wrote:
[...]

Re: Trouble with GLlM in new Julia install

Pedro L Vera
Great.
Thanks so much for all your help.

Pedro

--
You received this message because you are subscribed to the Google Groups "julia-stats" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [hidden email].
For more options, visit https://groups.google.com/d/optout.

Re: Trouble with GLlM in new Julia install

Milan Bouchet-Valat
In reply to this post by Andreas Noack
On Monday, 18 April 2016 at 10:43 -0400, Andreas Noack wrote:
> The issue is probably in OpenBLAS' optimized kernels for triangular
> multiplication on Excavator.
For reference, Andreas filed the issue upstream here:
https://github.com/xianyi/OpenBLAS/issues/841


Regards

> By running julia with OPENBLAS_CORETYPE=Piledriver you force OpenBLAS
> to use kernels for the older Piledriver microarchitecture. These are
> probably a bit slower but they don't have the bug that causes the
> freeze. I'll report this to the OpenBLAS developers and, hopefully,
> they will be able to fix it before our next release. In the meantime, you
> can add
>
> export OPENBLAS_CORETYPE=Piledriver
>
> to your .profile.
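
[Editor's note: the workaround above boils down to one environment variable. A minimal sketch, assuming a POSIX login shell; the `sh -c` echo is only there to demonstrate that the setting reaches child processes such as julia:]

```shell
# Sketch of the workaround described above: force OpenBLAS to use the
# older Piledriver kernels instead of the buggy Excavator ones.
export OPENBLAS_CORETYPE=Piledriver

# Any child process (e.g. julia) now inherits the setting:
sh -c 'echo "child sees OPENBLAS_CORETYPE=$OPENBLAS_CORETYPE"'

# One-off alternative, scoping the variable to a single invocation:
#   OPENBLAS_CORETYPE=Piledriver julia
# Persistent alternative, as suggested in the thread:
#   echo 'export OPENBLAS_CORETYPE=Piledriver' >> ~/.profile
```

Only the environment variable matters here; no Julia or GLM code changes are needed.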
>
> On Mon, Apr 18, 2016 at 10:20 AM, Pedro L Vera <[hidden email]>
> wrote:
> > That fixed it!
> >
> > The regression now works:
> >
> > julia> lm1 = fit(LinearModel, OptDen ~ Carb, form)
> > DataFrames.DataFrameRegressionModel{GLM.LinearModel{GLM.LmResp{Array{Float64,1}},GLM.DensePredQR{Float64}},Float64}
> >
> > Formula: OptDen ~ 1 + Carb
> >
> > Coefficients:
> >                Estimate  Std.Error  t value Pr(>|t|)
> > (Intercept)  0.00508571 0.00783368 0.649211   0.5516
> > Carb           0.876286  0.0135345  64.7444    <1e-6
> >
> > Is there anything I should update or add so this works with a normal julia start?
> >
> > Thanks for the help.
> >
> > Pedro
> >
> > On Mon, Apr 18, 2016 at 10:11 AM, Andreas Noack
> > <[hidden email]> wrote:
> > > Could you try to start julia with
> > >
> > > OPENBLAS_CORETYPE=Piledriver julia
> > >
> > > and see if the error is still there.
> > >
> > >> On Mon, Apr 18, 2016 at 9:58 AM, Pedro L Vera <[hidden email]> wrote:
> > >>
> > >> I downloaded the binaries for 0.4.5 and still have the same problem with GLM.
> > >>
> > >> I did run runtests.jl but stopped it after about 20 min of no screen
> > >> output and 100% CPU usage. Didn't think it should take that long.
> > >> Maybe I'm wrong.
> > >>
> > >> Thanks.
> > >>
> > >> On Sun, Apr 17, 2016 at 9:23 PM, Andreas Noack
> > >> <[hidden email]> wrote:
> > >> > Nobody suggested to build from source.
> > >> >
> > >> > It would be great if you could try the generic Linux binary from
> > >> > http://julialang.org/downloads/ and/or to run the tests as described
> > >> > in the last email. Thanks.
> > >> >
> > >> > On Sun, Apr 17, 2016 at 7:21 PM, Pedro L Vera <pedrolvera@gmail.com> wrote:
> > >> >>
> > >> >> I've sort of given up on this. I tried compiling the bleeding-edge
> > >> >> or stable binaries and could not. I installed a couple of dependencies
> > >> >> that julia needed and that I did not have (like m4), and it still did
> > >> >> not work after spending many hours.
> > >> >>
> > >> >> Maybe it's just a GLM issue. Other Julia features seem to work.
> > >> >>
> > >> >> Maybe I'll come back to this after it settles down a bit.
> > >> >>
> > >> >> Anyway, thanks for the suggestions.
> > >> >>
> > >> >> Best,
> > >> >>
> > >> >> Pedro
> > >> >>
> > >> >> On Sat, Apr 16, 2016 at 10:13 PM, Andreas Noack
> > >> >> <[hidden email]> wrote:
> > >> >> > Are you able to run all the julia tests? I.e. something like
> > >> >> > julia $TESTDIR/runtests.jl all
> > >> >> >
> > >> >> > I'm not sure what TESTDIR is when you install from the repos, but
> > >> >> > it's probably not that difficult to find.
> > >> >> >
> > >> >> > I don't have access to an Excavator machine, and I haven't been able
> > >> >> > to reproduce the error on the Piledriver system that I have access to.
> > >> >> >
> > >> >> > On Sat, Apr 16, 2016 at 11:23 AM, Pedro L Vera <pedrolvera@gmail.com> wrote:
> > >> >> >>
> > >> >> >> I get the same error (and some new warnings) with the latest 0.5
> > >> >> >> build, as below
> > >> >> >>
> > >> >> >>   _       _ _(_)_     |  A fresh approach to technical computing
> > >> >> >>   (_)     | (_) (_)    |  Documentation: http://docs.julialang.org
> > >> >> >>    _ _   _| |_  __ _   |  Type "?help" for help.
> > >> >> >>   | | | | | | |/ _` |  |
> > >> >> >>   | | |_| | | | (_| |  |  Version 0.5.0-dev+3488 (2016-04-11 17:05 UTC)
> > >> >> >>  _/ |\__'_|_|_|\__'_|  |  Commit e1cf87f (4 days old master)
> > >> >> >> |__/                   |  x86_64-unknown-linux-gnu
> > >> >> >>
> > >> >> >> julia> using GLM,RDatasets
> > >> >> >> WARNING: New definition
> > >> >> >>     write(GZip.GZipStream, Array{#T<:Any, N<:Any}) at
> > >> >> >> /home/pedro/.julia/v0.5/GZip/src/GZip.jl:456
> > >> >> >> is ambiguous with:
> > >> >> >>     write(Base.IO, Array{UInt8, N<:Any}) at io.jl:154.
> > >> >> >> To fix, define
> > >> >> >>     write(GZip.GZipStream, Array{UInt8, N<:Any})
> > >> >> >> before the new definition.
> > >> >> >>
> > >> >> >> julia> form = dataset("datasets","Formaldehyde")
> > >> >> >> 6x2 DataFrames.DataFrame
> > >> >> >> │ Row │ Carb │ OptDen │
> > >> >> >> ┝━━━━━┿━━━━━━┿━━━━━━━━┥
> > >> >> >> │ 1   │ 0.1  │ 0.086  │
> > >> >> >> │ 2   │ 0.3  │ 0.269  │
> > >> >> >> │ 3   │ 0.5  │ 0.446  │
> > >> >> >> │ 4   │ 0.6  │ 0.538  │
> > >> >> >> │ 5   │ 0.7  │ 0.626  │
> > >> >> >> │ 6   │ 0.9  │ 0.782  │
> > >> >> >>
> > >> >> >> julia> lm1 = fit(LinearModel, OptDen ~ Carb, form)
> > >> >> >> ^CERROR: InterruptException:
> > >> >> >>  [inlined code] from ./pointer.jl:17
> > >> >> >>  in geqrt!(::Array{Float64,2}, ::Array{Float64,2}) at
> > >> >> >> ./linalg/lapack.jl:434
> > >> >> >>  [inlined code] from ./abstractarray.jl:196
> > >> >> >>  in qrfact!(::Array{Float64,2}, ::Type{Val{false}}) at
> > >> >> >> ./linalg/qr.jl:88
> > >> >> >>  [inlined code] from ./linalg/qr.jl:90
> > >> >> >>  in GLM.DensePredQR{Float64}(::Array{Float64,2}, ::Array{Float64,1}) at /home/pedro/.julia/v0.5/GLM/src/linpred.jl:39
> > >> >> >>  [inlined code] from ./boot.jl:304
> > >> >> >>  in fit(::Type{GLM.LinearModel{GLM.LmResp{Array{Float64,1}},GLM.DensePredQR{Float64}}}, ::Array{Float64,2}, ::Array{Float64,1}) at /home/pedro/.julia/v0.5/GLM/src/lm.jl:63
> > >> >> >>  in fit(::Type{GLM.LinearModel{L<:GLM.LmResp{V<:DenseArray{T<:AbstractFloat,1}},T<:GLM.LinPred}}, ::Array{Float64,2}, ::Array{Float64,1}) at /home/pedro/.julia/v0.5/GLM/src/lm.jl:71
> > >> >> >>  [inlined code] from ./int.jl:32
> > >> >> >>  in #fit#55(::Array{Any,1}, ::Any, ::Type{GLM.LinearModel{L<:GLM.LmResp{V<:DenseArray{T<:AbstractFloat,1}},T<:GLM.LinPred}}, ::DataFrames.Formula, ::DataFrames.DataFrame) at /home/pedro/.julia/v0.5/DataFrames/src/statsmodels/statsmodel.jl:55
> > >> >> >>  in eval(::Module, ::Any) at ./boot.jl:237
> > >> >> >>
> > >> >> >> Thanks.
> > >> >> >>
> > >> >> >> Pedro
> > >> >> >>