synapse-0.1.0.0: Synapse is a machine learning library written in pure Haskell.
Safe Haskell: Safe-Inferred
Language: Haskell2010

Synapse.NN.Training

Description

This module provides datatypes and functions that implement neural network training.

Synopsis

Callbacks datatype and associated type aliases

type CallbackFnOnTrainBegin model optimizer a Source #

Arguments

 = IORef (model a)

Initial model state.

-> IORef [OptimizerParameters optimizer a]

Initial optimizer parameters.

-> IO () 

Type of callback that is called at the beginning of training.
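A minimal sketch of a callback with this type (the callback name is illustrative): it only reads the initial optimizer parameters and reports how many parameter groups there are, leaving all training state untouched.

import Data.IORef (readIORef)
import Synapse.NN.Training (CallbackFnOnTrainBegin)

-- Illustrative on-train-begin callback: reads the initial optimizer
-- parameters and reports how many parameter groups there are.
reportOptimizerState :: CallbackFnOnTrainBegin model optimizer a
reportOptimizerState _modelRef optimizerParamsRef = do
    params <- readIORef optimizerParamsRef
    putStrLn ("Training starts with " ++ show (length params) ++ " optimizer parameter group(s).")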

type CallbackFnOnEpochBegin model optimizer a Source #

Arguments

 = IORef Int

Current epoch.

-> IORef (model a)

Model state at the beginning of the epoch processing.

-> IORef [OptimizerParameters optimizer a]

Optimizer parameters at the beginning of the epoch processing.

-> IORef (BatchedDataset a)

Batched shuffled dataset.

-> IO () 

Type of callback that is called at the beginning of a training epoch.

type CallbackFnOnBatchBegin model optimizer a Source #

Arguments

 = IORef Int

Current epoch.

-> IORef Int

Current batch number.

-> IORef (model a)

Model state at the beginning of the batch processing.

-> IORef [OptimizerParameters optimizer a]

Optimizer parameters at the beginning of the batch processing.

-> IORef (Sample (Mat a))

Batch that is being processed.

-> IORef a

Learning rate value.

-> IO () 

Type of callback that is called at the beginning of a training batch.
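Because the learning rate is passed as an IORef, an on-batch-begin callback can adjust it on the fly. A hedged sketch (the callback name and the idea of capping the rate are illustrative, not part of the library):

import Data.IORef (modifyIORef')
import Synapse.NN.Training (CallbackFnOnBatchBegin)

-- Illustrative on-batch-begin callback that clamps the learning rate so it
-- never exceeds a given maximum; all other training state is left untouched.
capLearningRate :: Ord a => a -> CallbackFnOnBatchBegin model optimizer a
capLearningRate maxLr _epochRef _batchRef _modelRef _optParamsRef _sampleRef lrRef =
    modifyIORef' lrRef (min maxLr)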

type CallbackFnOnBatchEnd model optimizer a Source #

Arguments

 = IORef Int

Current epoch.

-> IORef Int

Current batch number.

-> IORef (model a)

Model state at the end of the batch processing.

-> IORef [OptimizerParameters optimizer a]

Optimizer parameters at the end of the batch processing.

-> IORef (Vec a)

Metrics that were recorded on this batch.

-> IO () 

Type of callback that is called at the end of a training batch.
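As a sketch, an on-batch-end callback can be used for simple progress reporting by reading the epoch and batch counters (the callback name below is illustrative):

import Data.IORef (readIORef)
import Synapse.NN.Training (CallbackFnOnBatchEnd)

-- Illustrative on-batch-end callback that prints which batch of which epoch
-- has just been processed; it does not modify any training state.
reportProgress :: CallbackFnOnBatchEnd model optimizer a
reportProgress epochRef batchRef _modelRef _optParamsRef _metricsRef = do
    epoch <- readIORef epochRef
    batch <- readIORef batchRef
    putStrLn ("Processed batch " ++ show batch ++ " of epoch " ++ show epoch)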

type CallbackFnOnEpochEnd model optimizer a Source #

Arguments

 = IORef Int

Current epoch.

-> IORef (model a)

Model state at the end of the epoch processing.

-> IORef [OptimizerParameters optimizer a]

Optimizer parameters at the end of the epoch processing.

-> IO () 

Type of callback that is called at the end of a training epoch.

type CallbackFnOnTrainEnd model optimizer a Source #

Arguments

 = IORef (model a)

Model state at the end of the training.

-> IORef [OptimizerParameters optimizer a]

Optimizer parameters at the end of the training.

-> IORef (Vec (RecordedMetric a))

Recorded metrics.

-> IO () 

Type of callback that is called at the end of training.

data Callbacks model optimizer a Source #

The Callbacks record datatype holds all callbacks for the training.

All callbacks take IORefs to various training parameters, which allows them to affect the training in any way.

This interface should be used with caution, because some changes might break the training completely.

Constructors

Callbacks 

Fields

emptyCallbacks :: Callbacks model optimizer a Source #

Returns an empty Callbacks record. It can also serve as a base on which to build your own callbacks; see the sketch below.
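For instance, callbacks like the sketches above could be attached by record update on emptyCallbacks. The field name batchEndCallbacks and the assumption that each field holds a list of callbacks are hypothetical; consult the Fields listing of Callbacks for the actual names and types.

-- Hypothetical sketch only: 'batchEndCallbacks' is an assumed field name, and
-- the field is assumed to hold a list of CallbackFnOnBatchEnd callbacks.
myCallbacks :: Callbacks model optimizer a
myCallbacks = emptyCallbacks { batchEndCallbacks = [reportProgress] }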

Hyperparameters datatype

data Hyperparameters a Source #

The Hyperparameters datatype represents the configuration of a training run.

Constructors

Hyperparameters 

Fields

newtype RecordedMetric a Source #

The RecordedMetric newtype wraps a vector of metric results.

Constructors

RecordedMetric 

Fields

Training

train Source #

Arguments

:: (Symbolic a, Floating a, Ord a, Show a, RandomGen g, AbstractLayer model, Optimizer optimizer) 
=> model a

Model to be trained.

-> optimizer a

Optimizer that will be used during training.

-> Hyperparameters a

Hyperparameters of training.

-> Callbacks model optimizer a

Callbacks that will be used during training.

-> g

Generator of random values that will be used to shuffle the dataset.

-> IO (model a, [OptimizerParameters optimizer a], Vec (RecordedMetric a), g)

Updated model, optimizer parameters at the end of training, vector of recorded metrics (the loss is also recorded and comes first in the vector), and the updated generator of random values.

The train function trains a neural network on a dataset with the specified hyperparameters.
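A hedged usage sketch: the model, optimizer and hyperparameters are taken as arguments, since their construction depends on fields and instances not covered on this page, and the auxiliary types (Symbolic, AbstractLayer, Optimizer, Vec) must be imported from wherever your version of the library exposes them.

import Synapse.NN.Training
import System.Random (newStdGen)

-- Illustrative driver for train: runs one full training with no callbacks
-- and returns the trained model together with the recorded metrics.
runTraining
    :: (Symbolic a, Floating a, Ord a, Show a, AbstractLayer model, Optimizer optimizer)
    => model a               -- model to be trained
    -> optimizer a           -- optimizer to use
    -> Hyperparameters a     -- training configuration
    -> IO (model a, Vec (RecordedMetric a))
runTraining model opt hyperparams = do
    gen <- newStdGen         -- fresh StdGen used to shuffle the dataset
    (trainedModel, _optParams, metrics, _gen') <-
        train model opt hyperparams emptyCallbacks gen
    pure (trainedModel, metrics)   -- the first metric in the vector is the loss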