AdaBelief
	
struct defined in module Flux.Optimise

```julia
AdaBelief(η = 0.001, β::Tuple = (0.9, 0.999), ϵ = 1.0e-8)
```
The AdaBelief optimiser (Zhuang et al., 2020, arXiv:2010.07468) is a variant of the well-known Adam optimiser. Instead of dividing the step by the exponential moving average of squared gradients, it divides by the exponential moving average of the squared deviation of the gradient from its first-moment estimate, so steps grow when observed gradients match the optimiser's "belief" and shrink when they do not.
Parameters:
- Learning rate (`η`): Amount by which gradients are discounted before updating the weights.
- Decay of momentums (`β::Tuple`): Exponential decay for the first (β1) and the second (β2) momentum estimates.
- Machine epsilon (`ϵ`): Small constant added to the denominator to avoid division by zero.
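To make these hyperparameters concrete, here is a minimal sketch of one AdaBelief update step, following the standard formulation with bias correction omitted for brevity; the function name `adabelief_step!` is illustrative and this is not Flux's internal implementation:

```julia
# Minimal sketch of one AdaBelief step (illustrative; not Flux's internal code).
# θ: parameters, g: gradient, m: first-moment EMA, s: "belief" EMA.
# Bias correction of m and s is omitted for brevity.
function adabelief_step!(θ, g, m, s; η = 0.001, β = (0.9, 0.999), ϵ = 1.0e-8)
    @. m = β[1] * m + (1 - β[1]) * g            # EMA of the gradient
    @. s = β[2] * s + (1 - β[2]) * (g - m)^2    # EMA of squared deviation of g from m
    @. θ -= η * m / (sqrt(s) + ϵ)               # Adam would use the EMA of g^2 here instead
    return θ
end
```

The only difference from Adam is the second line: tracking (g - m)² rather than g² makes the denominator small when gradients agree with their running mean, yielding larger, more confident steps in that case.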
Examples:

```julia
opt = AdaBelief()

opt = AdaBelief(0.001, (0.9, 0.8))
```
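An optimiser constructed this way can be passed to `Flux.train!` via the implicit-parameters interface of `Flux.Optimise`; in the sketch below, the model, data, and loss are made up purely for illustration:

```julia
using Flux

# Toy model and data, made up purely for illustration.
model = Dense(2 => 1)
x, y = rand(Float32, 2, 16), rand(Float32, 1, 16)
loss(x, y) = Flux.Losses.mse(model(x), y)

opt = AdaBelief()  # defaults: η = 0.001, β = (0.9, 0.999), ϵ = 1.0e-8
Flux.train!(loss, Flux.params(model), [(x, y)], opt)
```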