Cost of @view and reshape


Cost of @view and reshape

Alexey Cherkaev
I'm writing RadauIIA (for now, fixed order 5 with 3 points) method for ODE BVP (basically, collocation method). In the process, I construct an overall non-linear equation that needs to be solved. It takes "mega-vector" x[j] as an argument. However, internally it is more convenient to reshape it to y[m,i,n] where m is the index of original ODE vector, i is the index of the collocation point on time element (or layer) and n is time element index. Also, some inputs to the method (ODE RHS function and BVP function) expect z[m]-kind vector. So far I chose to pass a @view of the "mega-vector" to them.

The alternatives to reshape and @view would be:
  • Use an inline function or a macro that maps indices between the mega-vector and the arrays (I've tried it and saw no difference in performance or memory allocation, but @code_warntype shows fewer "red" spots).
  • Copy the relevant pieces of the mega-vector into preallocated arrays of the desired shape. This can also be an alternative to @view.
Is there a rule of thumb for which one is preferable? And are there any high costs associated with @view and reshape?
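For concreteness, a minimal sketch of the two layouts being compared (the dimension names `M`, `P`, `N` and the helper `idx` are made up for this example; `P` stands in for the collocation-point index i):

```julia
M, P, N = 2, 3, 4                 # state size, collocation points, time elements
x = collect(1.0:M*P*N)            # the "mega-vector" x[j]
y = reshape(x, M, P, N)           # y[m, i, n]; shares memory with x, no copy
z = @view y[:, 1, 1]              # z[m]-style slice for the RHS function, no copy

# the inline index-mapping alternative from the first bullet:
idx(m, i, n) = m + M*(i - 1) + M*P*(n - 1)

y[1, 2, 3] = 0.0                  # writes through to x[idx(1, 2, 3)]
z[1] = -1.0                       # writes through to x[1]
```

Since `reshape` and `@view` both alias the same storage, writing through `y` or `z` mutates `x` directly; the index-mapping function computes the same linear position by hand.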


Re: Cost of @view and reshape

Michael Borregaard

You can reshape the view. This should be cheap.
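A quick illustration of this point, composing @view and reshape without copying (the array sizes here are arbitrary):

```julia
x = zeros(12)            # the mega-vector
v = @view x[1:6]         # view of one chunk, no copy
w = reshape(v, 2, 3)     # reshape the view; still backed by x
w[2, 3] = 5.0            # writes through to x
```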

Re: Cost of @view and reshape

Chris Rackauckas
In reply to this post by Alexey Cherkaev
reshape makes a view, and views are cheap. Don't worry about this.
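One way to check this on your own machine is to compare allocations for a view against a copying slice (the sizes are arbitrary, and exact byte counts vary by Julia version):

```julia
x = rand(1000)
f(z) = sum(z)

f(@view x[1:100]); f(x[1:100])               # warm up / compile first
bytes_view = @allocated f(@view x[1:100])    # view: a small wrapper at most
bytes_copy = @allocated f(x[1:100])          # slice: copies 100 Float64s
```

The copying slice must allocate at least 100 × 8 bytes for the fresh array; the view allocates far less, regardless of how large the slice is.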

BTW, I would love to add a collocation method to JuliaDiffEq. Would you consider making this a package?



Re: Cost of @view and reshape

Alexey Cherkaev
The package is available at https://github.com/mobius-eng/RadauBVP.jl

I haven't put it into METADATA yet; I would like to improve the documentation and add some tests before doing so.

However, it is already usable and mostly optimised (except for sparsity support, which is coming). I believe it is the only free ODE BVP solver available for Julia right now; the only alternative I am aware of is `bvpsol` from ODEInterface.jl, but it is not free, and in the limited tests I've done, RadauBVP is faster.




Re: Cost of @view and reshape

Mauro
Cool!


Concerning BVPs, what about ApproxFun:
https://github.com/ApproxFun/ApproxFun.jl#solving-ordinary-differential-equations

Concerning sparse Jacobians: I once wrote a matrix coloring package:
https://github.com/mauro3/MatrixColorings.jl It needs some love, but if
you think it would be useful for you, ping me and I'll try to update it.
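The coloring idea can be sketched by hand for the simplest case. This is an illustration of the technique, not MatrixColorings.jl's API (the function name `tridiag_jacobian` and the fixed 3-coloring are made up for the example): columns that never share a nonzero row get the same color and are perturbed together, so a tridiagonal Jacobian costs 3 extra function evaluations instead of n.

```julia
# Finite-difference Jacobian of F: R^n -> R^n with known tridiagonal
# sparsity, compressed via a 3-coloring: columns j, j+3, j+6, ... never
# share a nonzero row, so one perturbed evaluation covers them all.
function tridiag_jacobian(F, x; eps = 1e-8)
    n = length(x)
    J = zeros(n, n)
    F0 = F(x)
    for color in 1:3
        xp = copy(x)
        for j in color:3:n
            xp[j] += eps                           # perturb all same-color columns at once
        end
        dF = (F(xp) .- F0) ./ eps
        for j in color:3:n
            for i in max(1, j - 1):min(n, j + 1)   # nonzero rows of column j
                J[i, j] = dF[i]
            end
        end
    end
    return J
end
```

A general-purpose package would compute the coloring from an arbitrary sparsity pattern instead of hard-coding the banded structure, but the recovery step is the same.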

