[ANN] MXNet.jl v0.0.3 - Flexible and Efficient Deep Learning for Julia

Chiyuan Zhang
Check out the examples and the documentation.

Relation to Mocha.jl: I currently maintain both packages, but I will treat MXNet.jl as the successor to Mocha.jl once the project becomes more mature. dmlc/mxnet is a collaboration among the authors of several different deep learning libraries, and it provides support for Python, R, and Julia (and maybe more). MXNet.jl introduces an external dependency, but the default CPU-only dependency is very easy to build automatically, basically acting as the CPU backend of Mocha.jl. For GPUs, the built-in multi-GPU support of MXNet.jl is definitely attractive compared to Mocha.jl.

v0.0.3 (2015.10.27)

  • Model prediction API.
  • Model checkpoint loading and saving.
  • IJulia Notebook example of using pre-trained imagenet model as classifier.
  • Symbol saving and loading.
  • NDArray saving and loading.
  • Optimizer gradient clipping.
  • Model training callback APIs, default checkpoint and speedometer callbacks.
  • Julia Array / NDArray data iterator.
  • Sphinx documentation system and documents for dynamically imported libmxnet APIs.
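As a quick taste of the new features above, here is a minimal sketch combining NDArray saving/loading, a symbolic network, and the training callbacks. Names follow the MXNet.jl examples of this era (mx.save, mx.load, mx.FeedForward, mx.do_checkpoint, mx.speedometer); exact keyword arguments may differ between versions, and `train_provider` / `eval_provider` are assumed data iterators, e.g. built from Julia Arrays:

```julia
using MXNet

# NDArray saving and loading (new in this release):
# serialize a dict of named NDArrays to libmxnet's binary format.
a = mx.zeros(2, 3)
mx.save("arrays.nd", Dict(:a => a))
loaded = mx.load("arrays.nd", mx.NDArray)

# A small symbolic multi-layer perceptron.
data = mx.Variable(:data)
fc1  = mx.FullyConnected(data=data, num_hidden=64)
act1 = mx.Activation(data=fc1, act_type=:relu)
fc2  = mx.FullyConnected(data=act1, num_hidden=10)
mlp  = mx.Softmax(data=fc2)

model = mx.FeedForward(mlp, context=mx.cpu())

# Training with the default checkpoint and speedometer callbacks
# (train_provider is an assumed data iterator):
# mx.fit(model, mx.SGD(lr=0.1, momentum=0.9), train_provider,
#        n_epoch=10,
#        callbacks=[mx.do_checkpoint("mlp"), mx.speedometer()])

# Model prediction API (eval_provider is an assumed data iterator):
# probs = mx.predict(model, eval_provider)
```

The checkpoint callback writes the symbol and parameters under the given prefix each epoch, which is what the new checkpoint loading/saving support reads back.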


You received this message because you are subscribed to the Google Groups "julia-stats" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [hidden email].
For more options, visit https://groups.google.com/d/optout.