Flux.dropout — Function (defined in module Flux)

    dropout([rng = rng_from_array(x)], x, p; dims=:, active=true)
The dropout function. If `active` is `true`, each input element is either set to `0` (with probability `p`) or scaled by `1 / (1 - p)`, so that the expected value of the output matches the input. `dims` specifies the unbroadcasted dimensions, e.g. `dims=1` applies dropout along columns and `dims=2` along rows. If `active` is `false`, the input `x` is returned unchanged.
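These semantics can be checked directly. The sketch below assumes the signature shown at the top of this page; it uses `p = 0.5` on an all-ones array so that every surviving entry becomes exactly `2`:

```julia
using Flux

x = ones(Float32, 3, 4)

# active=true: each entry is either dropped (set to 0) or
# scaled by 1/(1 - p) = 2.
y = Flux.dropout(x, 0.5; dims=:, active=true)
@assert all(v -> v == 0f0 || v == 2f0, y)

# dims=1: one mask value per index of dimension 1, broadcast
# across the remaining dimensions, so each row is kept or
# dropped as a whole.
yr = Flux.dropout(x, 0.5; dims=1, active=true)
@assert all(r -> all(==(0f0), r) || all(==(2f0), r), eachrow(yr))

# active=false: the input is returned unchanged.
@assert Flux.dropout(x, 0.5; active=false) == x
```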
Specify `rng` to use a custom RNG instead of the default one. Note that custom RNGs are only supported on the CPU.
Warning: when using this function, you have to manage the activation state manually. Typically, dropout is active during training and deactivated at inference time. The `Dropout` layer manages this automatically and is what you should use in most scenarios, rather than calling the `dropout` function directly.
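For comparison, a minimal sketch of the recommended `Dropout`-layer approach: `Flux.trainmode!` and `Flux.testmode!` (or a training loop) toggle the activation state, so no manual `active` flag is needed.

```julia
using Flux

m = Chain(Dense(10, 5), Dropout(0.5))
x = randn(Float32, 10, 8)

# Training mode: the Dropout layer is active and stochastic.
Flux.trainmode!(m)
y_train = m(x)

# Test mode: the Dropout layer is the identity, so the model
# is deterministic at inference time.
Flux.testmode!(m)
@assert m(x) == m(x)
```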
There are 2 methods for `Flux.dropout`.