Safe Haskell | Safe-Inferred |
---|---|
Language | Haskell2010 |
Synapse.NN.Optimizers
Contents
Description
This module implements several optimizers that are used in training.
Synopsis
- class Optimizer optimizer where
- type OptimizerParameters optimizer a :: Type
- optimizerInitialParameters :: Num a => optimizer a -> Mat a -> OptimizerParameters optimizer a
- optimizerUpdateStep :: Num a => optimizer a -> (a, Mat a) -> (Mat a, OptimizerParameters optimizer a) -> (Mat a, OptimizerParameters optimizer a)
- optimizerUpdateParameters :: (Symbolic a, Optimizer optimizer) => optimizer a -> (a, Gradients (Mat a)) -> [(SymbolMat a, OptimizerParameters optimizer a)] -> [(Mat a, OptimizerParameters optimizer a)]
- data SGD a = SGD {
-     sgdMomentum :: a
-     sgdNesterov :: Bool
- }
Optimizer
typeclass
class Optimizer optimizer where Source #
Optimizer
typeclass represents an optimizer: an algorithm that defines the update rule for neural network parameters.
Associated Types
type OptimizerParameters optimizer a :: Type Source #
OptimizerParameters
represents the optimizer-specific parameters that the optimizer needs to implement its update rule.
Methods
optimizerInitialParameters :: Num a => optimizer a -> Mat a -> OptimizerParameters optimizer a Source #
Returns the initial state of the optimizer-specific parameters for a given variable.
optimizerUpdateStep Source #
Arguments
:: Num a | |
=> optimizer a | Optimizer itself. |
-> (a, Mat a) | Learning rate and gradient of given parameter. |
-> (Mat a, OptimizerParameters optimizer a) | Given parameter and current state of optimizer-specific parameters. |
-> (Mat a, OptimizerParameters optimizer a) | Updated parameter and a new state of optimizer-specific parameters. |
Performs the update step of the optimizer.
Instances
Optimizer SGD Source # | |
Defined in Synapse.NN.Optimizers Associated Types type OptimizerParameters SGD a Source # Methods optimizerInitialParameters :: Num a => SGD a -> Mat a -> OptimizerParameters SGD a Source # optimizerUpdateStep :: Num a => SGD a -> (a, Mat a) -> (Mat a, OptimizerParameters SGD a) -> (Mat a, OptimizerParameters SGD a) Source # |
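To illustrate the shape of this class, here is a standalone sketch with plain `Double` scalars standing in for the library's `Mat` type (an assumption made purely for the example; the names `State`, `initialState` and `updateStep` are simplified stand-ins for `OptimizerParameters`, `optimizerInitialParameters` and `optimizerUpdateStep`):

```haskell
{-# LANGUAGE TypeFamilies #-}

-- Simplified stand-in for the Optimizer class: parameters are plain
-- Doubles instead of matrices, and the associated type carries the
-- optimizer-specific state.
class Optimizer opt where
  type State opt
  -- Initial optimizer state for one parameter.
  initialState :: opt -> Double -> State opt
  -- One update step: takes (learning rate, gradient) and
  -- (parameter, state), returns the updated pair.
  updateStep :: opt -> (Double, Double) -> (Double, State opt) -> (Double, State opt)

-- Plain gradient descent needs no extra state.
data PlainGD = PlainGD

instance Optimizer PlainGD where
  type State PlainGD = ()
  initialState _ _ = ()
  -- theta' = theta - lr * grad
  updateStep _ (lr, grad) (param, s) = (param - lr * grad, s)
```

The real class follows the same pattern, only over `Mat a` with a `Num a` constraint on the methods.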
optimizerUpdateParameters Source #
Arguments
:: (Symbolic a, Optimizer optimizer) | |
=> optimizer a | Optimizer itself. |
-> (a, Gradients (Mat a)) | Learning rate and gradients of all parameters. |
-> [(SymbolMat a, OptimizerParameters optimizer a)] | Given parameters and current state of optimizer-specific parameters. |
-> [(Mat a, OptimizerParameters optimizer a)] | Updated parameters and a new state of optimizer-specific parameters. |
optimizerUpdateParameters
function updates the whole model with the given optimizer by performing optimizerUpdateStep
for every parameter.
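The idea can be sketched as zipping a per-parameter step over all parameters. The sketch below uses `Double`s in place of the library's matrices and a plain gradient-descent step with empty state (both illustrative assumptions):

```haskell
-- One update step for a single parameter: plain gradient descent,
-- with Doubles standing in for matrices and () as the trivial
-- optimizer-specific state (illustrative assumptions).
updateStep :: Double -> Double -> (Double, ()) -> (Double, ())
updateStep lr grad (param, s) = (param - lr * grad, s)

-- Analogue of optimizerUpdateParameters: pair each (parameter, state)
-- with its gradient and apply the step to every pair.
updateParameters :: Double -> [Double] -> [(Double, ())] -> [(Double, ())]
updateParameters lr grads params = zipWith (updateStep lr) grads params
```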
Optimizers
SGD
is an optimizer that implements the stochastic gradient descent algorithm.
Constructors
SGD | |
Fields
sgdMomentum :: a
sgdNesterov :: Bool
|
Instances
Optimizer SGD Source # | |
Defined in Synapse.NN.Optimizers Associated Types type OptimizerParameters SGD a Source # Methods optimizerInitialParameters :: Num a => SGD a -> Mat a -> OptimizerParameters SGD a Source # optimizerUpdateStep :: Num a => SGD a -> (a, Mat a) -> (Mat a, OptimizerParameters SGD a) -> (Mat a, OptimizerParameters SGD a) Source # | |
Show a => Show (SGD a) Source # | |
Eq a => Eq (SGD a) Source # | |
type OptimizerParameters SGD a Source # | |
Defined in Synapse.NN.Optimizers | |
type DType (SGD a) Source # | |
Defined in Synapse.NN.Optimizers |
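For reference, the classical-momentum and Nesterov variants of an SGD step can be sketched as follows. This is a standalone illustration with `Double`s in place of `Mat`; the `mu` and `nesterov` arguments mirror the `sgdMomentum` and `sgdNesterov` fields, and the library's exact formulation may differ:

```haskell
-- Standalone sketch of an SGD step with momentum (an illustrative
-- assumption, not the library's exact code). The velocity plays the
-- role of the optimizer-specific state.
sgdStep :: Double -> Bool -> Double -> Double -> (Double, Double) -> (Double, Double)
sgdStep mu nesterov lr grad (param, velocity) =
  let velocity' = mu * velocity - lr * grad
      param'
        | nesterov  = param + mu * velocity' - lr * grad  -- look-ahead update
        | otherwise = param + velocity'                   -- classical momentum
  in (param', velocity')
```

With `mu = 0`, both variants reduce to plain gradient descent.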