| Safe Haskell | Safe-Inferred |
|---|---|
| Language | Haskell2010 |
Synapse.NN.Layers.Activations
Description
Provides activation functions: unary functions that are differentiable almost everywhere and so can be used in backward propagation of loss.
Synopsis
- type ActivationFn a = SymbolMat a -> SymbolMat a
- activateScalar :: Symbolic a => ActivationFn a -> a -> a
- activateMat :: Symbolic a => ActivationFn a -> Mat a -> Mat a
- newtype Activation a = Activation { unActivation :: ActivationFn a }
- layerActivation :: Activation a -> LayerConfiguration (Activation a)
- relu :: (Symbolic a, Fractional a) => ActivationFn a
- sigmoid :: (Symbolic a, Floating a) => ActivationFn a
ActivationFn type alias and Activation newtype
type ActivationFn a = SymbolMat a -> SymbolMat a Source #
ActivationFn is a type alias that represents unary functions that are differentiable almost everywhere.
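Since an ActivationFn is just a function on symbolic matrices, a custom activation can be written with ordinary numeric operations. A minimal sketch, assuming SymbolMat has a Floating instance that lifts operations element-wise (the Floating constraint on sigmoid below suggests as much); softplus is not part of this module:

```haskell
-- Sketch of a custom activation: softplus, log(1 + e^x).
-- Assumes Symbolic and ActivationFn are in scope from Synapse, and
-- that SymbolMat a has a Floating instance so that log, exp and (+)
-- apply element-wise to the symbolic matrix.
softplus :: (Symbolic a, Floating a) => ActivationFn a
softplus x = log (1 + exp x)  -- smooth, differentiable everywhere
```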
activateScalar :: Symbolic a => ActivationFn a -> a -> a Source #
Applies an activation function to a scalar to produce a new scalar.
activateMat :: Symbolic a => ActivationFn a -> Mat a -> Mat a Source #
Applies an activation function to a matrix to produce a new matrix.
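A usage sketch (hedged: it assumes Double has a Symbolic instance, which this page does not show). activateScalar evaluates an activation on a plain value; activateMat does the same for a whole Mat:

```haskell
-- Sketch: evaluating relu on concrete scalars via activateScalar,
-- assuming a Symbolic instance for Double.
negSide, posSide :: Double
negSide = activateScalar relu (-2.0)  -- expected: 0.0
posSide = activateScalar relu 3.5     -- expected: 3.5
```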
newtype Activation a Source #
Activation newtype wraps ActivationFns - unary functions that can be thought of as activation functions for neural network layers.

Any activation function must be differentiable almost everywhere, and so it must be a function that operates on Symbols, which allows the function to be differentiated when needed.
Constructors

Activation
  Fields
    unActivation :: ActivationFn a
Instances
AbstractLayer Activation Source #
  Defined in Synapse.NN.Layers.Activations
  Methods
    inputSize :: Activation a -> Maybe Int Source #
    outputSize :: Activation a -> Maybe Int Source #
    nParameters :: Activation a -> Int Source #
    getParameters :: SymbolIdentifier -> Activation a -> [SymbolMat a] Source #
    updateParameters :: Activation a -> [Mat a] -> Activation a Source #
    symbolicForward :: (Symbolic a, Floating a, Ord a) => SymbolIdentifier -> SymbolMat a -> Activation a -> (SymbolMat a, SymbolMat a) Source #
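Wrapping a function is all it takes to obtain a layer value. A sketch (the parameter-free behaviour noted in the comments is an assumption based on the methods above, not something this page states):

```haskell
-- Sketch: an activation function wrapped as a layer value.
reluLayer :: (Symbolic a, Fractional a) => Activation a
reluLayer = Activation relu

-- An activation holds no trainable weights, so presumably:
--   nParameters reluLayer == 0
--   getParameters somePrefix reluLayer == []
```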
layerActivation :: Activation a -> LayerConfiguration (Activation a) Source #
Creates a configuration for an activation layer.
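For example (a sketch; how LayerConfigurations are consumed by model-building code is outside this module):

```haskell
-- Sketch: a layer configuration built from the relu activation.
reluConfig :: (Symbolic a, Fractional a) => LayerConfiguration (Activation a)
reluConfig = layerActivation (Activation relu)
```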
Activation functions
relu :: (Symbolic a, Fractional a) => ActivationFn a Source #
ReLU function.
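The Fractional constraint hints that relu can be expressed without an Ord comparison, since max(0, x) = (x + |x|) / 2. A sketch of a plausible definition (an assumption, not necessarily the library's):

```haskell
-- Sketch: ReLU using only Num/Fractional operations, which would
-- explain the Fractional constraint; |x| is differentiable except
-- at 0, i.e. differentiable almost everywhere.
relu' :: (Symbolic a, Fractional a) => ActivationFn a
relu' x = (x + abs x) / 2
```

sigmoid :: (Symbolic a, Floating a) => ActivationFn a Source #

Sigmoid function.

Likewise, the Floating constraint matches the standard logistic sigmoid; a sketch (again an assumption, not necessarily the library's definition):

```haskell
-- Sketch: logistic sigmoid, 1 / (1 + e^(-x)),
-- matching the Floating constraint.
sigmoid' :: (Symbolic a, Floating a) => ActivationFn a
sigmoid' x = recip (1 + exp (negate x))
```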