Flux is a library for machine learning. It comes "batteries-included" with many useful tools built in, but also lets you use the full power of the Julia language where you need it. We follow a few key principles:
- Doing the obvious thing. Flux has relatively few explicit APIs. Instead, writing down the mathematical form will work – and be fast.
- Extensible by default. Flux is written to be highly flexible while being performant. Extending Flux is as simple as using your own code as part of the model you want - it is all high-level Julia code.
- Play nicely with others. Flux works well with unrelated Julia libraries from images to differential equation solvers, rather than duplicating them.
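The "doing the obvious thing" principle can be seen in a minimal sketch (variable names and sizes here are illustrative, not from the guide): a model is just an ordinary Julia function written in its mathematical form, and Flux can differentiate it directly.

```julia
using Flux  # exports `gradient`

# Write the mathematics directly as Julia code:
W = rand(2, 3)
b = rand(2)
predict(x) = W * x .+ b

x, y = rand(3), rand(2)
loss(W, b) = sum((W * x .+ b .- y) .^ 2)

# Flux differentiates ordinary Julia functions:
dW, db = gradient(loss, W, b)
```

There is no special model-definition language involved; `predict` and `loss` are plain functions, which is why extending Flux with your own code is straightforward.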
Download Julia 1.9 or later, preferably the current stable release. You can add Flux using Julia's package manager, by typing `] add Flux` at the Julia prompt. For Nvidia GPU support, you will also need to install the CUDA and cuDNN packages. For AMD GPU support, install the AMDGPU package. For acceleration on Apple Silicon, install the Metal package.
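Equivalently, the same packages can be added programmatically with `Pkg` instead of the `]` REPL mode; install only the GPU package matching your hardware.

```julia
using Pkg

Pkg.add("Flux")                      # the core library
Pkg.add(["CUDA", "cuDNN"])           # Nvidia GPU support
Pkg.add("AMDGPU")                    # AMD GPU support
Pkg.add("Metal")                     # Apple Silicon acceleration
```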
The quick start page trains a simple neural network.
The rest of the guide provides a from-scratch introduction to Flux's take on models and how they work, starting with fitting a line. Once you understand these docs, congratulations, you also understand Flux's source code, which is intended to be concise, legible and a good reference for more advanced concepts.
If you're interested in hacking on Flux, the source code is open and easy to understand – it's all just the same Julia code you work with normally. You might be interested in our intro issues to get started, or our contributing guide.