0.5 performance

0.5 performance

David Anthoff

Should non-core devs start looking at 0.5 performance, or is it all still too much of a work in progress to report how performance of existing code bases changes between 0.4 and 0.5?

 

Just for fun I ran one of my code bases today on 0.4 and on 0.5. I see a pretty significant increase in runtime on 0.5, about 35%:

 

julia 0.4: 665.095440 seconds (2.03 G allocations: 36.515 GB, 0.53% gc time)

julia 0.5: 904.061203 seconds (4.55 G allocations: 75.579 GB, 0.72% gc time)
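Timings like these come from Julia's built-in `@time` macro, which reports elapsed time, allocation count and volume, and GC share. A minimal sketch of how such a per-version comparison can be gathered (`sumsq` here is an invented workload for illustration, not the code being benchmarked above):

```julia
# Invented workload for illustration; not the code from this thread.
function sumsq(n)
    s = 0.0
    for i in 1:n
        s += i^2
    end
    return s
end

sumsq(10)          # warm-up call so compilation time is not measured
@time sumsq(10^7)  # prints: seconds, allocations, memory, % gc time
```

Running the same script under each Julia binary gives comparable numbers for the two versions.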

 

Also, is there a preference from the core team on how to report this kind of thing? I have neither the time nor the expertise to track down the root cause of what I’m seeing, and the code I’m using is not public at this point. I’d be happy to give Julia core devs access to the repo if someone wanted to investigate what is going on.

 

Cheers,

David

 

--

David Anthoff

University of California, Berkeley

 

http://www.david-anthoff.com

 

Re: 0.5 performance

Stefan Karpinski
Definitely! Reports of performance regressions are incredibly valuable. You can just open an issue titled "perf regression: $(description)" with some code to reproduce it, if possible. We're very much in the phase of trying to take care of these.
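As a sketch of what such an issue might contain (the `versioninfo()` call is Julia's real built-in for printing build details; the workload `f` below is a made-up placeholder, not anyone's actual code):

```julia
using InteractiveUtils  # provides versioninfo() when run as a script in Julia 1.x

# Paste the exact build info into the issue so the regression can be bisected:
versioninfo()            # prints Julia version, commit, OS, CPU, BLAS, ...

# A self-contained snippet reproducing the slowdown, timed after a warm-up
# run so one-time compilation cost is excluded:
f(n) = sum(abs2, 1:n)    # placeholder workload
f(10)                    # warm up
@time f(10^6)
```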

On Fri, Apr 22, 2016 at 2:14 PM, David Anthoff <[hidden email]> wrote:


Re: 0.5 performance

Viral Shah
+1

This is definitely the right time to start testing code on 0.5 and reporting regressions. In my perfect world, we would have 0.5 at JuliaCon!

-viral

On Friday, April 22, 2016 at 11:53:34 PM UTC+5:30, Stefan Karpinski wrote:

RE: 0.5 performance

David Anthoff

Alright, I’ve gotten it into shape so that one can run the example. Here is the issue:

 

https://github.com/JuliaLang/julia/issues/16047

 

The code is not public, but I’m happy to give access to any core developer who wants to figure out what is going on.

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of Viral Shah
Sent: Saturday, April 23, 2016 7:12 AM
To: julia-dev <[hidden email]>
Subject: Re: [julia-dev] 0.5 performance

 
