Flux

The Elegant Machine Learning Stack


Models that look like mathematics. Seamless derivatives, GPU training and deployment. A set of small, nimble tools that each do one thing and do it well.

Try It Out

Installation? What installation? Just use Julia's package manager and you're done.

Pkg.add("Flux")
W = randn(2, 10)
b = randn(2)

y = σ(W * x .+ b)

Where Python is executable pseudocode, Julia is executable math. Models look just like the description in the paper, and you have the full power and simplicity of the Julia language (including control flow, multiple dispatch and macros).
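For example, a custom layer is just a Julia struct that you make callable; the sketch below is purely illustrative (the Affine name and field layout are not part of Flux's API).

using Flux

struct Affine
  W
  b
end

Affine(in::Integer, out::Integer) = Affine(randn(out, in), randn(out))

# Multiple dispatch: making the struct callable defines the forward pass.
(m::Affine)(x) = σ.(m.W * x .+ m.b)

m = Affine(10, 5)
m(randn(10))                 # returns a 5-element output vector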

Flux is lightweight, and hackable to the core. The whole stack – including automatic differentiation and GPU kernels – is only a few thousand lines of clean Julia code. There's no monolithic C++ underbelly, so it's easy to use custom components and push the state of the art.

# A hand-written GPU kernel in plain Julia: each thread adds one element.
function gpu_add(a, b, c)
  i = (blockIdx().x-1) * blockDim().x + threadIdx().x
  c[i] = a[i] + b[i]
  return nothing
end
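Launching that kernel might look like the following sketch, assuming CUDA.jl and an NVIDIA GPU are available; the array sizes and launch configuration are arbitrary.

using CUDA

a, b = CUDA.rand(1024), CUDA.rand(1024)
c = similar(a)
@cuda threads=256 blocks=4 gpu_add(a, b, c)   # 4 blocks × 256 threads, one per element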
Higher-level model definitions are just as concise:

model = Chain(
  Dense(10, 5, σ),
  Dense(5, 2),
  softmax)
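Taking derivatives of such a model is a one-liner. The sketch below assumes Flux is loaded; the input and target are random, purely illustrative data.

x, y = randn(Float32, 10), Float32[1, 0]

loss(m) = sum(abs2, m(x) .- y)    # a simple squared-error loss
gs = gradient(loss, model)        # gradients w.r.t. every parameter in `model`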

Despite being simple and flexible, Flux has street smarts. The tools are graph-aware and can make optimisations for speed and memory usage that were previously possible only in clunky "define before run" frameworks.

If this whets your appetite, check out the docs to get going.

Researchers, users and developers of Flux