Dropout
struct, defined in module Flux

Dropout(p; dims=:, rng = default_rng_value())
Dropout layer.

While training, for each input, this layer either sets that input to 0 (with probability p) or scales it by 1 / (1 - p). To apply dropout along certain dimension(s), specify the dims keyword, e.g. Dropout(p; dims = 3) will randomly zero out entire channels on WHCN input (also called 2D dropout). This is used as a regularisation, i.e. it reduces overfitting during training.
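A minimal sketch of the dims keyword on WHCN input (illustrative only; with p = 0.5 the kept channels are scaled by 2, and which channels are kept depends on the random draws):

julia> x = ones(Float32, 4, 4, 3, 1);        # WHCN input: 4×4 image, 3 channels, batch of 1

julia> d = Dropout(0.5; dims = 3);           # one mask value per channel

julia> Flux.trainmode!(d);                   # activate dropout outside of a training loop

julia> y = d(x);

julia> [all(iszero, y[:, :, c, 1]) || all(==(2), y[:, :, c, 1]) for c in 1:3]   # each channel is dropped or scaled as a whole
3-element Vector{Bool}:
 1
 1
 1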
In the forward pass, this layer applies the Flux.dropout function. See that function for more details.
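For reference, a hedged sketch of calling the functional form directly (assuming the dropout(x, p; dims = :) signature; the exact keyword arguments may differ between Flux versions):

julia> x = ones(Float32, 5);

julia> y = Flux.dropout(x, 0.5);   # each entry of y is either 0 or the input scaled by 1 / (1 - 0.5) = 2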
Specify rng to use a custom RNG instead of the default. Custom RNGs are only supported on the CPU.
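For example, a reproducible dropout mask can be obtained by passing a seeded RNG from the Random standard library (a sketch using the constructor signature above):

julia> using Random

julia> d = Dropout(0.5; rng = MersenneTwister(1234));   # CPU-only custom RNG, reproducible masks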
Does nothing to the input once Flux.testmode! is true (see the sketch after the examples below).
julia> m = Chain(Dense(1 => 1), Dropout(1));
julia> Flux.trainmode!(m);
julia> y = m([1]);
julia> y == [0]
true
julia> m = Chain(Dense(1000 => 1000), Dropout(0.5));
julia> Flux.trainmode!(m);
julia> y = m(ones(1000));
julia> isapprox(count(==(0), y) / length(y), 0.5, atol=0.1)
true
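Conversely, a minimal sketch of the inactive case mentioned above: after Flux.testmode!, the layer passes its input through unchanged.

julia> m = Dropout(0.5);

julia> Flux.testmode!(m);

julia> m([1.0, 2.0, 3.0]) == [1.0, 2.0, 3.0]
true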