RMSProp

struct defined in module Flux.Optimise


			RMSProp(η = 0.001, ρ = 0.9, ϵ = 1.0e-8)

Optimizer using the RMSProp algorithm. Often a good choice for recurrent networks. Parameters other than learning rate generally don't need tuning.

Parameters

  • Learning rate (η): Amount by which gradients are discounted before updating the weights.

  • Momentum (ρ): Controls the acceleration of gradient descent in the prominent direction, in effect damping oscillations.
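How these parameters interact can be sketched as a single RMSProp step for one scalar parameter (an illustrative sketch of the textbook update rule, not Flux's internal implementation; all variable names here are chosen for the example):

			# One RMSProp step for a scalar parameter (illustrative only).
			η, ρ, ϵ = 0.001, 0.9, 1.0e-8
			acc = 0.0                      # running average of squared gradients
			g = 0.5                        # example gradient value
			acc = ρ * acc + (1 - ρ) * g^2  # ρ damps how fast the average moves
			Δ = η * g / (sqrt(acc) + ϵ)    # η scales the resulting update

Dividing by the root of the running average shrinks steps along directions with consistently large gradients, which is the damping effect described above.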

Examples


			opt = RMSProp()

			opt = RMSProp(0.002, 0.95)
Methods