OAdam
struct defined in module Flux.Optimise

    OAdam(η = 0.0001, β::Tuple = (0.5, 0.9), ϵ = 1.0e-8)
OAdam (Optimistic Adam) is a variant of Adam that adds an "optimistic" term, making it suitable for adversarial training.
- Learning rate (`η`): Amount by which gradients are discounted before updating the weights.
- Decay of momentums (`β::Tuple`): Exponential decay for the first (β1) and the second (β2) momentum estimates.
    opt = OAdam()
    opt = OAdam(0.001, (0.9, 0.995))
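To make the "optimistic" term concrete, here is a minimal from-scratch sketch of an optimistic Adam update step in plain Julia. It assumes the commonly described rule θ ← θ − (2·term_t − term_{t−1}), where term = η·m / (√v + ϵ); the names `OAdamState` and `oadam_step!` are hypothetical and this is not Flux's internal implementation.

```julia
# Hypothetical state holder; not part of Flux.
mutable struct OAdamState
    m::Vector{Float64}     # first-moment (momentum) estimate
    v::Vector{Float64}     # second-moment estimate
    term::Vector{Float64}  # previous scaled step, kept for the optimistic correction
end

# One optimistic-Adam step (sketch, defaults mirroring the docstring above).
function oadam_step!(θ, g, s::OAdamState; η = 0.0001, β = (0.5, 0.9), ϵ = 1e-8)
    @. s.m = β[1] * s.m + (1 - β[1]) * g        # update first moment
    @. s.v = β[2] * s.v + (1 - β[2]) * g^2      # update second moment
    prev = copy(s.term)                          # remember last step
    @. s.term = η * s.m / (sqrt(s.v) + ϵ)        # current Adam-style step
    @. θ -= 2 * s.term - prev                    # optimistic update
    return θ
end

# Toy usage: minimize f(θ) = θ², whose gradient is 2θ.
θ = [1.0]
s = OAdamState(zeros(1), zeros(1), zeros(1))
for _ in 1:1000
    oadam_step!(θ, 2 .* θ, s; η = 0.01)
end
```

The extra `2·term_t − term_{t−1}` step is what distinguishes the optimistic variant from plain Adam: it extrapolates using the previous step, which helps stabilize the oscillations that arise in adversarial (min-max) training.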