A Julia package for using and writing powerful, extensible training loops for deep learning models.
- Implements a training loop to take the boilerplate out of training deep learning models
- Lets you add features to training loops through reusable callbacks
- Comes with callbacks for many common use cases like hyperparameter scheduling, metrics tracking and logging, checkpointing, early stopping, and more (see the extended example below)
- Is extensible by creating custom, reusable callbacks or even custom training loops
Use FluxTraining.jl if:

- You don't want to implement your own metrics tracking, hyperparameter scheduling, or (insert common training feature here) for the 10th time
- You want to use composable and reusable components that enhance your training loop
- You want a simple training loop with reasonable defaults that can grow to the needs of your project
Install it like any other Julia package using the package manager:

```
]add FluxTraining
```
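Equivalently, you can install it from a script or the standard REPL using the package manager API:

```julia
using Pkg
Pkg.add("FluxTraining")
```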
After installation, import it, create a `Learner` from a Flux.jl model and a loss function (an optimizer and callbacks can also be supplied), and finally train with `fit!`, passing the number of epochs and your training and validation data iterators:
```julia
using FluxTraining

learner = Learner(model, lossfn)
fit!(learner, 10, (trainiter, validiter))
```
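To make the snippet above concrete, here is a rough, self-contained sketch. The model, loss, and synthetic data are illustrative assumptions (a small MLP on random inputs, using Flux ≥ 0.13-style syntax), and the `callbacks` keyword with the `Metrics(accuracy)` callback follows the documented `Learner` constructor but should be checked against the version you install:

```julia
using Flux, FluxTraining

# Illustrative model: a small MLP for flattened 28x28 inputs and 10 classes.
model = Chain(Dense(784 => 64, relu), Dense(64 => 10))

# Loss function with the signature `lossfn(ŷ, y)`.
lossfn = Flux.Losses.logitcrossentropy

# Synthetic stand-in data; replace with your real dataset.
xs = rand(Float32, 784, 1024)
ys = Flux.onehotbatch(rand(0:9, 1024), 0:9)
trainiter = Flux.DataLoader((xs, ys), batchsize = 128)
validiter = Flux.DataLoader((xs, ys), batchsize = 128)

# Extra callbacks (here: accuracy tracking) can be passed when creating the Learner.
learner = Learner(model, lossfn; callbacks = [Metrics(accuracy)])
fit!(learner, 10, (trainiter, validiter))
```

Only the model and loss are strictly required, since `Learner` comes with reasonable defaults (such as the optimizer); the rest just shows where `trainiter` and `validiter` might come from.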
Next, you may want to check out:

- A full example training an image classifier on the MNIST dataset
- The documentation of FastAI.jl, which features many end-to-end examples
The design of FluxTraining.jl's two-way callbacks is adapted from fastai's training loop.
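As a rough illustration of that extensibility, a custom callback hooks into the loop by subtyping `FluxTraining.Callback` and adding methods to `FluxTraining.on` for the events and phases it cares about. In the sketch below, the callback name `PrintEpoch` is made up, and the exact module paths for events and phases should be verified against the documentation:

```julia
using FluxTraining

# Hypothetical custom callback; the name is illustrative.
struct PrintEpoch <: FluxTraining.Callback end

# Two-way callbacks extend the loop by dispatching on (event, phase, callback, learner).
function FluxTraining.on(
        ::FluxTraining.Events.EpochEnd,
        ::FluxTraining.Phases.AbstractTrainingPhase,
        ::PrintEpoch,
        learner)
    println("Finished a training epoch")
end

# It can then be passed to a Learner like any stock callback, e.g.
# `Learner(model, lossfn; callbacks = [PrintEpoch()])`.
```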