Flux Machine Learning for Julia
There was a HUGE announcement on the Julia blog a few days ago. The convergence of a machine learning language with a compiler built to support it just got a bit closer: Julia announced Flux, a machine learning framework for Julia.
The Julia language started out with the goal of being elegant for computation (i.e. math and machine learning), easy to code, and able to take advantage of everything the hardware can offer through a specialized compiler. This isn't Python, which tries to be everything to everyone. Julia offers Just-In-Time (JIT) compilation of your code, so you get a helluva speed boost over other dynamically typed languages.
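To make that concrete, here's a minimal sketch (the function name is my own) of how the JIT plays out in practice: the first call to a function compiles a specialized native method for those argument types, and every later call runs at compiled speed.

```julia
# Julia compiles a specialized native method the first time a
# function is called with a given combination of argument types.
function mysum(xs)
    s = zero(eltype(xs))
    for x in xs
        s += x      # a plain loop; no vectorization tricks needed
    end
    return s
end

data = rand(10^6)
@time mysum(data)   # first call: includes one-time JIT compilation
@time mysum(data)   # second call: compiled speed, no compile overhead
```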
From Julia's published benchmarks, Julia beats the pants off the languages and software commonly used for Machine Learning and Data Science. Granted, it's not as fast as C in most cases, but it gets awfully close. My guess is that this gap will narrow as the compiler improves.
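You can sanity-check the speed claims yourself with a quick micro-benchmark. A sketch, assuming the BenchmarkTools package is installed (`] add BenchmarkTools`); the workload here is a made-up Monte Carlo loop, exactly the kind of code that crawls in pure Python:

```julia
using BenchmarkTools

# A loop-heavy Monte Carlo estimate of pi; scalar loops like this
# run at near-C speed in Julia with no special effort.
function estimate_pi(n)
    hits = 0
    for _ in 1:n
        x, y = rand(), rand()
        hits += (x^2 + y^2 <= 1.0)
    end
    return 4hits / n
end

@btime estimate_pi(10^6)
```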
The early days of machine learning were rough. Platforms like KNIME and RapidMiner tried to make things easier by building GUIs over different algorithms. At the same time, Python and R were being repurposed for machine learning. This worked, but at the expense of speed. Java was far faster than R and Python, but machine learning specialists loved coding in R and Python. So we made do with a hodgepodge of languages and deployment schemes.
Data Scientists and Machine Learning experts don't like to sit still, so they started to automate things. H2O.ai created AutoML in its open source H2O platform, and the rest of the market followed.
All this works exceedingly well, but the need for speed ever drives us forth, especially when we want to use computationally heavy algorithms and techniques like Deep Learning. To do that, Yann LeCun (a pioneer of CNNs) suggests we look at the problem differently. He suggests the term differentiable programming: we look at machine learning as developing "a new kind of software" that is differentiable and optimizable.
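To give a feel for what "differentiable software" means, here is a toy sketch (all names are my own) using forward-mode dual numbers: ordinary Julia code, written once, carries its own derivative along for the ride.

```julia
# A toy "differentiable program": dual numbers propagate a value
# and its derivative through ordinary arithmetic.
struct Dual
    val::Float64   # the value f(x)
    der::Float64   # the derivative f'(x)
end

Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)

f(x) = x * x + x       # an ordinary program

f(Dual(3.0, 1.0))      # Dual(12.0, 7.0): f(3) = 12, f'(3) = 7
```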
Enter Flux: A Machine Learning Framework for Julia!
Since we originally proposed the need for a first-class language, compiler and ecosystem for machine learning (ML), there have been plenty of interesting developments in the field. Not only have the tradeoffs in existing systems, such as TensorFlow and PyTorch, not been resolved, but they are clearer than ever now that both frameworks contain distinct "static graph" and "eager execution" interfaces. Meanwhile, the idea of ML models fundamentally being differentiable algorithms – often called differentiable programming – has caught on.
via Julia Blog
Flux
What is Flux exactly? It’s…
… a language to write differentiable algorithms, and Flux takes Julia to be this language. Being designed from the ground up for mathematical and numerical computing, Julia is unusually well-suited for expressing ML algorithms. Meanwhile, its mix of modern design and new ideas in the compiler makes it easier to address the high performance needs of cutting edge ML.
via Julia Blog
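Here's roughly what that looks like in practice. A minimal sketch, assuming Flux is installed (`] add Flux`); the layer sizes and data are made up, and exact API details may vary between Flux versions:

```julia
using Flux

# A two-layer network is just plain Julia code.
model = Chain(Dense(10, 5, relu), Dense(5, 1))

x = rand(Float32, 10, 100)   # 100 hypothetical samples, 10 features each
y = rand(Float32, 1, 100)    # made-up targets

loss(x, y) = Flux.mse(model(x), y)
opt = Descent(0.01)          # plain gradient descent

# One training pass over our single (x, y) batch
Flux.train!(loss, Flux.params(model), [(x, y)], opt)
```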
Granted, Flux is in its nascent form right now, but Julia is really pushing forward with new libraries. I was unaware that Julia can compile straight to GPUs and TPUs, so I must check this out.
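For what it's worth, moving a Flux model onto a GPU is advertised as a one-line change. A sketch, assuming a CUDA-capable GPU with the relevant CUDA packages installed; `gpu` falls back to a no-op on a machine without one (TPUs go through a separate XLA-based path):

```julia
using Flux

model = Chain(Dense(10, 5, relu), Dense(5, 1))

# Moving the model and the data to the GPU is a one-liner each.
gpu_model = model |> gpu                  # no-op if no GPU is present
gpu_x = rand(Float32, 10, 100) |> gpu
gpu_model(gpu_x)                          # forward pass on the GPU
```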
Going forward, I think I'll stick with Python to automate the boring stuff and start looking at Julia for my future machine learning needs.
Stay tuned.