explicit
- A layer maps inputs directly to outputs via an explicit function
- The forward pass in a neural network classically means applying each layer's function in sequence
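A minimal sketch of the explicit case: the output is computed by applying each layer's function in sequence. The shapes and the tanh activation are illustrative assumptions, not from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # first layer weights (assumed shapes)
W2 = rng.normal(size=(2, 4))  # second layer weights

def layer(W, x):
    # One explicit layer: output = f(input), computed directly
    return np.tanh(W @ x)

def forward(x):
    # Explicit forward pass: compose the layer functions in order
    return layer(W2, layer(W1, x))

y = forward(np.ones(3))
print(y.shape)  # (2,)
```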
implicit
- Specify the conditions that we want the layer's output to satisfy. For example, a layer whose output z satisfies z = σ(Wz + Ux + b)
- The output of the layer is the solution z* of this equation for z
- Relatively few, or even just one, implicit layer is often enough
- Calculate gradients using the implicit function theorem rather than backpropagating through each step of the solver (which is memory-intensive)
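The implicit-function-theorem gradient can be sketched as follows, assuming the layer is written in fixed-point form z* = f(z*, x) (the specific form is an assumption for illustration):

```latex
% For an implicit layer defined by the fixed point z^* = f(z^*, x),
% define F(z, x) = z - f(z, x), so that F(z^*, x) = 0.
% Differentiating F(z^*(x), x) = 0 and solving for the Jacobian gives
\frac{\partial z^*}{\partial x}
  = \left( I - \frac{\partial f}{\partial z}\Big|_{z^*} \right)^{-1}
    \frac{\partial f}{\partial x}\Big|_{z^*}
% Gradients therefore depend only on the solution z^*,
% not on the solver's intermediate iterates.
```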
examples
- Fixed point layer
- Solving for the fixed point z* by naive iteration is equivalent to repeatedly applying a standard dense layer with activation to the same input
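A hedged sketch of a fixed-point layer z* = σ(Wz* + Ux + b): solving for z* by naive iteration is exactly repeated application of the same dense layer (with activation) to the same input x. The weights, scaling, and tanh activation are illustrative assumptions; the small factor on W keeps the iteration a contraction so it converges.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
W = 0.1 * rng.normal(size=(n, n))  # small weights so iteration converges
U = rng.normal(size=(n, n))
b = rng.normal(size=n)
x = rng.normal(size=n)

# Naive fixed-point solve: repeatedly apply the same dense layer to x
z = np.zeros(n)
for _ in range(100):
    z = np.tanh(W @ z + U @ x + b)

# z should now (approximately) satisfy z = tanh(Wz + Ux + b)
residual = np.linalg.norm(z - np.tanh(W @ z + U @ x + b))
print(residual < 1e-6)  # True
```

In practice one would use a faster root-finder (e.g. Newton or Anderson acceleration) instead of plain iteration, but the fixed point being solved for is the same.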
- (neural) ODE / SDE / equilibrium models