API Reference

gated_mlp_block — Function

gated_mlp_block(gate_layer, inplanes::Integer, hidden_planes::Integer,
                outplanes::Integer = inplanes; dropout = 0., activation = gelu)

Feedforward block based on the implementation in the paper “Pay Attention to MLPs” (https://arxiv.org/abs/2105.08050).

Arguments

  • gate_layer: Layer to use for the gating.
  • inplanes: Number of dimensions in the input.
  • hidden_planes: Number of dimensions in the intermediate layer.
  • outplanes: Number of dimensions in the output; defaults to inplanes.
  • dropout: Dropout rate.
  • activation: Activation function to use.
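
For orientation, here is a minimal Flux sketch of the kind of block this constructs; it is not the package's implementation. It assumes the gate halves the hidden feature dimension, as the spatial gating unit in the paper does, and simple_gate together with all layer sizes are illustrative stand-ins.

using Flux

# Simplified, parameter-free gate: split the hidden features in half and
# multiply the halves elementwise. The paper's spatial gating unit also
# applies a learned projection across tokens to one half; omitted here.
simple_gate(x) = x[1:end ÷ 2, :, :] .* x[(end ÷ 2 + 1):end, :, :]

# A gated feedforward block in the spirit of gated_mlp_block: expand to
# hidden_planes, activate, gate (halving the feature dimension), then
# project back to outplanes. The sizes below are made-up examples.
inplanes, hidden_planes, outplanes = 64, 256, 64
block = Chain(Dense(inplanes => hidden_planes, gelu),
              Dropout(0.1),
              simple_gate,
              Dense(hidden_planes ÷ 2 => outplanes),
              Dropout(0.1))

x = rand(Float32, inplanes, 196, 8)   # (features, tokens, batch)
y = block(x)                          # size(y) == (64, 196, 8)

A learned gate_layer, such as the paper's spatial gating unit, would take the place of simple_gate; the surrounding expand-activate-gate-project structure is what the arguments above parameterize.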