Flux provides a single, intuitive way to define models that reads just like mathematical notation. Julia transparently compiles your code, optimising and fusing GPU kernels for the best performance.
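As a minimal sketch of what this looks like, here is a small multilayer perceptron; the layer sizes and input are arbitrary choices for illustration, not part of any particular application:

```julia
using Flux

# A two-layer perceptron: each Dense layer is just W*x .+ b
# passed through an activation, composed with Chain.
model = Chain(
    Dense(10, 5, relu),   # 10 inputs -> 5 hidden units
    Dense(5, 2),          # 5 hidden units -> 2 outputs
    softmax)              # normalise outputs to probabilities

x = rand(Float32, 10)     # a dummy input vector
y = model(x)              # a length-2 probability vector
```

Because `model` is ordinary Julia code, the same definition runs unchanged on the CPU or, after moving parameters with `gpu`, on the GPU.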
GPU kernels can be written directly in Julia via CUDAnative. Flux is uniquely hackable: any part can be tweaked, from GPU code to custom gradients and layers.
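A sketch of such a kernel, assuming a CUDA-capable GPU with CUDAnative and CuArrays installed (in current Julia this functionality has moved into CUDA.jl, but the code is the same shape); the kernel name and sizes here are illustrative:

```julia
using CUDAnative, CuArrays

# An element-wise addition kernel written in plain Julia.
# Each GPU thread computes one element of the output array.
function add_kernel!(c, a, b)
    i = threadIdx().x + (blockIdx().x - 1) * blockDim().x
    if i <= length(c)
        @inbounds c[i] = a[i] + b[i]
    end
    return nothing
end

a = CuArray(rand(Float32, 1024))
b = CuArray(rand(Float32, 1024))
c = similar(a)

# Launch 4 blocks of 256 threads, covering all 1024 elements.
@cuda threads=256 blocks=4 add_kernel!(c, a, b)
```

Since the kernel is ordinary Julia, it can call other Julia functions, and Flux layers can dispatch to it like any other code.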
Import trained ONNX models as Flux scripts for high-quality inference or transfer learning.
We introduce Policy Guided Monte Carlo (PGMC), a computational framework using reinforcement learning to improve Markov chain Monte Carlo (MCMC) sampling. (arXiv)
We draw inspiration from both classical system identification and modern machine learning in order to solve estimation problems for real-world, physical systems. (link)
This paper presents reverse-mode algorithmic differentiation (AD) based on source code transformation, in particular of the Static Single Assignment (SSA) form used by modern compilers. (arXiv)
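This approach is what powers Julia's Zygote.jl, which differentiates ordinary Julia functions by transforming their SSA-form IR rather than tracing or operator overloading. A minimal sketch of the user-facing API, with an arbitrary example function:

```julia
using Zygote

# An ordinary Julia function; no special types or tapes required.
f(x) = 3x^2 + 2x + 1

# gradient returns a tuple with one entry per argument.
# f'(x) = 6x + 2, so at x = 2.0 the derivative is 14.0.
g = gradient(f, 2.0)   # → (14.0,)
```

Because the transformation happens at the compiler level, control flow, recursion, and user-defined types in `f` are differentiated with no extra annotation.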
We describe a method and implementation for offloading suitable sections of Julia programs to TPUs via the Google XLA compiler. (arXiv)