finetune!

Function defined in module FastAI.

```julia
finetune!(learner, nepochs[, base_lr = 0.002; kwargs...])
```

Behaves like the fastai implementation `fastai.Learner.fine_tune`.
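For example, calling it with only the number of epochs runs both training phases with the defaults. A minimal sketch, assuming `learner` is an existing `Learner` whose model is a `Chain(backbone, head)`:

```julia
using FastAI

# `learner` is assumed to already wrap a Chain(backbone, head) model
# together with data and a loss function.

# Phase 1: one epoch with the backbone frozen; phase 2: five one-cycle
# epochs with backbone updates discounted by `backbone_factor`.
finetune!(learner, 5)

# The same call with the default positional and keyword arguments
# written out explicitly:
finetune!(learner, 5, 0.002; freezeepochs = 1, backbone_factor = 0.1)
```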
Keyword arguments:

- `freezeepochs = 1`: Number of epochs to train with the backbone completely frozen.
- `grouper = FastAI.defaultgrouper(learner.model)`: `ParamGrouper` which assigns group `1` (backbone) or `2` (head) to every parameter in `learner.model`. The default expects `learner.model` to be a `Chain(backbone, head)`; for other model structures, see the sketch after this list.
- `backbone_factor = 0.1`: Factor by which updates to the backbone are discounted during the second phase of training.
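When `learner.model` is not a two-child `Chain`, a custom grouper can assign the backbone and head groups explicitly. A hedged sketch: `IndexGrouper`, a `ParamGrouper` that maps ranges of child indices to group numbers, is assumed here to be available from FastAI; verify the constructor against the package source before relying on it.

```julia
using FastAI, Flux

# A model with three children instead of the expected Chain(backbone, head):
model = Chain(
    Conv((3, 3), 3 => 16, relu),                          # child 1: stem
    Conv((3, 3), 16 => 32, relu),                         # child 2: body
    Chain(GlobalMeanPool(), Flux.flatten, Dense(32, 10)), # child 3: head
)

# Assumption: IndexGrouper assigns group 1 to children 1-2 (backbone)
# and group 2 to child 3 (head).
grouper = FastAI.IndexGrouper([1:2, 3:3])

# `learner` is assumed to wrap `model` together with data and a loss.
finetune!(learner, 5; grouper = grouper)
```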
Any additional keyword arguments are passed to `fitonecycle!`.
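This makes the one-cycle schedule tunable from `finetune!` itself. A sketch under the assumption that `fitonecycle!` accepts fastai-style schedule keywords such as `pct_start` (the warmup fraction); check the `fitonecycle!` docstring for the exact keyword set:

```julia
# Assumption: `pct_start` is a fitonecycle! keyword; finetune! forwards
# it unchanged to the second (one-cycle) training phase.
finetune!(learner, 5; pct_start = 0.5)
```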