Not Just PyTorch and TensorFlow: 4 Other Deep Learning Libraries You Should Know | by Minh Tran


A quick introduction to JAX, MXNet, MATLAB, and Flux


Machine learning libraries accelerated the deep learning revolution. They lowered the barrier to entry for practitioners by abstracting away hard problems such as GPU acceleration, matrix algebra, and automatic differentiation. In both industry and academia, two deep learning libraries reign supreme: PyTorch and TensorFlow. In this article, I will introduce you to some other deep learning libraries that see considerable use, either because they offer speedups in certain settings or because they serve very specific communities. Let’s begin!

What is it? JAX is an open-source, actively developed numerical computing framework from Google (think NumPy, but for GPUs and TPUs).

Who uses it? Many teams within Google, such as DeepMind.

Why should you know about it? JAX was developed by Google to accelerate numerical computing on GPUs and on Google’s own TPU hardware. Using ideas such as accelerated linear algebra (XLA), just-in-time (JIT) compilation, and automatic vectorization, JAX achieves impressive speed and scale. Although its syntax closely mirrors NumPy’s to minimize the learning curve, JAX has a different design philosophy: it encourages functional programming through transformations such as vmap (automatic vectorization) and pmap (parallelization across devices).

Currently, many high-level APIs have been developed for JAX. Notable ones are Haiku and Flax.
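To make the functional style concrete, here is a minimal sketch (assuming `jax` is installed) that JIT-compiles a tiny prediction function and then uses vmap to run it over a whole batch without writing an explicit Python loop. The function and array names are illustrative, not from the article:

```python
import jax
import jax.numpy as jnp


@jax.jit  # just-in-time compile the function with XLA
def predict(w, x):
    # a toy "model": a linear map followed by tanh
    return jnp.tanh(jnp.dot(w, x))


# vmap vectorizes predict over the batch axis of x (axis 0),
# while broadcasting the same weights w to every example
batched_predict = jax.vmap(predict, in_axes=(None, 0))

w = jnp.ones((3, 3))
xs = jnp.ones((8, 3))        # a batch of 8 inputs
ys = batched_predict(w, xs)  # shape (8, 3)
```

The same pattern scales up: swapping vmap for pmap distributes the batch across multiple devices instead of vectorizing on one.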

What is it? Apache MXNet is a veteran open-source machine learning framework with front-end bindings for multiple languages, including Python, C++, R, Java, and Perl.

Who uses it? Amazon AWS.

Why should you know about it? MXNet’s most powerful features are its support for many programming languages and its scalability. Benchmark tests by NVIDIA show that MXNet is faster than PyTorch and TensorFlow on some deep learning tasks.

MXNet comes with Gluon, a high-level API for building neural networks. It also has ecosystem toolkits for computer vision (GluonCV) and NLP (GluonNLP).

What is it? MATLAB’s Deep Learning Toolbox is an add-on that lets MATLAB users create and train neural networks for a variety of tasks.

Who uses it? Academia and industries such as aerospace and mechanical engineering. For example, Airbus used it to detect defects inside airplanes.

Why should you know about it? Whatever you feel about MATLAB, it is still a popular programming ecosystem amongst academics and engineers. It has great user support and, in my opinion, the best documentation out of all the deep learning libraries in this list. The deep learning toolbox is geared toward people who want to build systems using minimal programming. Simulink, a graphical programming interface within MATLAB, offers ways to create easy-to-understand deep learning pipelines.

What is it? Flux is an open-source machine learning library built for the Julia programming language.

Who uses it? Computing-intensive fields such as pharmaceuticals and finance. For example, AstraZeneca used it to predict drug toxicity.

Why should you know about it? The Julia programming language has gained momentum over the years amongst data scientists, quants, and bioinformatics researchers. It is comparable to C/C++ in terms of speed, yet it was designed to be beginner-friendly like Python. An implementation of Julia deep learning on Google TPUs showed a >200x speedup compared to CPU. If you are already coding in Julia, Flux is a great library to look into.

I hope this short article has introduced you to some other deep learning libraries. They all support efficient speedups, GPU scaling, and deployment into production. There are excellent learning resources for all of them on the internet. Happy coding!





