One of Julia's main strengths is an ecosystem of packages that together provide a rich and consistent user experience.
This is a non-exhaustive list of Julia packages that complement
Flux in typical machine learning and deep learning workflows. To add your project, please send a PR. See also academic work citing Flux or citing Zygote.
- Flux's model-zoo contains examples from many domains.
- ObjectDetector.jl provides ready-to-go image detection via YOLO.
- Metalhead.jl includes many state-of-the-art computer vision models which can easily be used for transfer learning.
- UNet.jl is a generic UNet implementation.
- Transformers.jl provides components for Transformer models for NLP, as well as providing several trained models out of the box.
- TextAnalysis.jl provides several NLP algorithms that use Flux models under the hood.
- AlphaZero.jl provides a generic, simple and fast implementation of Deepmind's AlphaZero algorithm.
- ReinforcementLearning.jl offers a collection of tools for doing reinforcement learning research in Julia.
- GraphNeuralNetworks.jl is a fresh, performant and flexible graph neural network library based on Flux.jl.
- GeometricFlux.jl is the first graph neural network library for Julia.
- NeuralOperators.jl learns mappings between infinite-dimensional function spaces, enabling neural surrogates for PDE solutions as an alternative to the finite element method.
- SeaPearl.jl is a constraint-programming solver that uses graph-based reinforcement learning.
- FluxArchitectures.jl is a collection of advanced network architectures for time series forecasting.
- RobustNeuralNetworks.jl includes classes of neural networks that are constructed to naturally satisfy robustness constraints.
Utility tools you're unlikely to have come across if you've never used Flux:
- FastAI.jl is a Julia port of Python's fast.ai library.
- FluxTraining.jl is a package for using and writing powerful, extensible training loops for deep learning models. It supports callbacks for many common use cases like hyperparameter scheduling, metrics tracking and logging, checkpointing, early stopping, and more. It powers training in FastAI.jl.
Commonly used machine learning datasets are provided by the following packages in the Julia ecosystem:
- MLDatasets.jl focuses on downloading, unpacking, and accessing benchmark datasets.
- GraphMLDatasets.jl is a library of machine learning datasets for graph-structured data.
Tools to get data into the right format for feeding a model:
- Augmentor.jl is a real-time image augmentation library for increasing the number of training images.
- DataAugmentation.jl aims to make it easy to build stochastic, label-preserving augmentation pipelines for vision use cases involving images, keypoints and segmentation masks.
- MLUtils.jl (replaces MLDataUtils.jl and MLLabelUtils.jl) is a library for processing Machine Learning datasets.
- ParameterSchedulers.jl provides standard scheduling policies for machine learning.
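As a sketch of how these data tools slot into a workflow, MLUtils.jl's `DataLoader` handles batching and shuffling (a minimal toy example, not tied to any particular model):

```julia
using MLUtils

# A toy dataset: 100 observations with 4 features each, plus labels.
X = rand(Float32, 4, 100)
y = rand(0:1, 100)

# DataLoader iterates over mini-batches; `shuffle` reorders them each epoch.
loader = DataLoader((X, y); batchsize=16, shuffle=true)

for (xb, yb) in loader
    # xb is a 4×16 feature batch (the last batch may be smaller),
    # yb the matching 16 labels — ready to pass to a Flux model.
end
```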
Packages based on differentiable programming but not necessarily related to machine learning:
- The SciML ecosystem uses Flux and Zygote to mix neural nets with differential equations, to get the best of black box and mechanistic modelling.
- DiffEqFlux.jl provides tools for creating Neural Differential Equations.
- Flux3D.jl shows off machine learning on 3D data.
- RayTracer.jl combines ML with computer vision via a differentiable renderer.
- Duckietown.jl provides a differentiable Duckietown simulator.
- The Yao.jl project uses Flux and Zygote for Quantum Differentiable Programming.
- AtomicGraphNets.jl enables learning graph based models on atomic systems used in chemistry.
- DiffImages.jl provides differentiable computer vision modeling in Julia with the Images.jl ecosystem.
- Turing.jl extends Flux's differentiable programming capabilities to probabilistic programming.
- Omega.jl is a research project aimed at causal, higher-order probabilistic programming.
- Stheno.jl provides flexible Gaussian processes.
- OnlineStats.jl provides single-pass algorithms for statistics.
Some other useful, miscellaneous packages:
- AdversarialPrediction.jl provides a way to easily optimise generic performance metrics in supervised learning settings using the Adversarial Prediction framework.
- Mill.jl helps to prototype flexible multi-instance learning models.
- MLMetrics.jl is a utility for scoring models in data science and machine learning.
- Torch.jl exposes the Torch (libtorch) backend in Julia.
- ValueHistories.jl is a utility for efficient tracking of optimization histories, training curves or other information of arbitrary types and at arbitrarily spaced sampling times.
- InvertibleNetworks.jl provides building blocks for invertible neural networks in the Julia programming language.
- ProgressMeter.jl provides progress meters for long-running computations.
- TensorBoardLogger.jl provides easy logging to TensorBoard from Julia.
- ArgParse.jl is a package for parsing command-line arguments to Julia programs.
- Parameters.jl provides types with default field values, keyword constructors, and (un-)pack macros.
- BSON.jl is a package for working with the Binary JSON serialisation format.
- DataFrames.jl provides in-memory tabular data in Julia.
- DrWatson.jl is scientific project assistant software.
This tight integration among Julia packages is shown in some of the examples in the model-zoo repository.
Julia has several other libraries for making neural networks.
Lux.jl (formerly ExplicitFluxLayers.jl) shares much of Flux's design and use cases, as well as its NNlib.jl / Optimisers.jl back-ends. But instead of encapsulating all parameters within the model structure, it separates them into three components: a model, a tree of parameters, and a tree of model states.
Flux's training docs describe the change from Zygote's implicit to explicit gradients, i.e. from dictionary-like to tree-like structures. (See also Zygote's description of these.) Lux also uses Zygote, but uses the word "explicit" to mean something unrelated: storing the tree of parameters (and of state) separately from the model.
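The difference can be sketched with a toy layer (the Flux lines are runnable; the Lux lines are commented out and follow its documented `setup`/call convention, so treat them as an illustration rather than a definitive comparison):

```julia
using Flux

# Flux: parameters live inside the layer struct itself.
m = Flux.Dense(2 => 3, relu)   # m.weight and m.bias are stored in m
x = rand(Float32, 2)
y = m(x)                       # stateful call: the model carries its own parameters

# Lux (sketch): parameters and state are kept outside the model.
# using Lux, Random
# model = Lux.Dense(2 => 3, relu)
# ps, st = Lux.setup(Random.default_rng(), model)  # parameter tree, state tree
# y, st = model(x, ps, st)                         # pure call: everything passed in
```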