Machine learning in Haskell.
Provides a simple implementation of deep (i.e., multi-layer), fully connected (i.e., _not_ convolutional) neural networks. The type of the internal network structure is hidden from client code, while type safety is preserved, via existential type quantification and dependently typed programming techniques, à la Justin Le. (See Justin's blog post.)
The API offers a single network creation function, randNet, which allows the user to create a randomly initialized network of arbitrary internal structure by supplying a list of integers, each specifying the output width of one hidden layer in the network. (The input/output widths are determined automatically by the compiler, via type inference.) The type of the internal structure (i.e., the hidden layers) is existentially hidden outside the API, which offers the following benefits:
- Client-generated networks of different internal structure may be stored in a common list (or other Functorial data structure).
- The exact structure of the network may be specified at run time (via user input, file I/O, etc.), while GHC still enforces type safety at compile time.
- Complex networks with long training times may be saved after training, so that they can be recalled and reused later without having to be retrained.
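The first benefit above can be sketched as follows. This is a minimal illustration of the existential-wrapper technique, not this repository's actual code: the `Network` and `OpaqueNet` names and shapes here are hypothetical stand-ins, with the hidden-layer widths tracked as a type-level list and then quantified away.

```haskell
{-# LANGUAGE DataKinds       #-}
{-# LANGUAGE GADTs           #-}
{-# LANGUAGE KindSignatures  #-}

import GHC.TypeLits (Nat)

-- Hypothetical network type: input width 'i', hidden-layer widths 'hs'
-- (a type-level list), output width 'o'. Weights omitted for brevity.
data Network (i :: Nat) (hs :: [Nat]) (o :: Nat) = Network

-- Existential wrapper: the hidden structure 'hs' is quantified away,
-- so only the input/output widths remain visible in the type.
data OpaqueNet (i :: Nat) (o :: Nat) where
  ONet :: Network i hs o -> OpaqueNet i o

-- Networks with different internal structure share one type,
-- so they can be stored in a common list.
nets :: [OpaqueNet 4 2]
nets = [ ONet (Network :: Network 4 '[8]     2)
       , ONet (Network :: Network 4 '[16, 8] 2)
       ]

main :: IO ()
main = print (length nets)
```

Pattern matching on `ONet` brings the hidden structure back into scope for the body of the match, which is how a function like training or evaluation can still operate on the wrapped network with full type safety.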
Haskell_ML
Various examples of machine learning in Haskell.
To get started, or learn more, visit the wiki page.