NaiveNeurals

Naive implementation of a perceptron neural network. Under heavy development.

  • Implement a 3-layer MLP network with the SGD back-propagation algorithm
  • Reach test coverage of at least 80%
  • Allow model export/import to JSON
  • Prepare network learning examples and analysis for:
    • Classification problem
    • Regression problem
    • Time series problem
    • Data compression
  • Use the MLP network on the MNIST dataset
  • Implement various activation functions:
    • Tanh
    • Softmax
    • Softplus
    • Gaussian
  • Explore back-propagation algorithms:
    • SGD with momentum
    • ADAM
    • Levenberg-Marquardt
  • Add support for more than one hidden layer
  • Create full documentation

Major inspiration for this work comes from the book Machine Learning: An Algorithmic Perspective.

Getting started

git clone https://github.com/stovorov/NaiveNeurals
cd NaiveNeurals

Prepare environment (using virtualenv)

Requires Python 3.6

source set_env.sh     # sets PYTHONPATH
make venv
source venv/bin/activate
make test

If you are using an Ubuntu-based system, you must install tkinter:

$ sudo apt-get install python3.6-tk

Usage

Simple training

from NaiveNeurals.MLP.network import NeuralNetwork
from NaiveNeurals.data.dataset import DataSet

nn = NeuralNetwork()

nn.setup_network(input_data_size=2, output_data_size=1,
                 hidden_layer_number_of_nodes=5)

# each inner list in inputs holds the values of one network input across all training samples
inputs =  [[0, 0, 1, 1], [1, 0, 1, 0]]
targets = [[1, 0, 0, 1]]

data_set = DataSet(inputs, targets)
nn.train(data_set)

If convergence is not achieved, a ConvergenceError is raised.
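
ConvergenceError can be imported from NaiveNeurals.utils (as in the batch-learning example below). A minimal sketch of guarding the training call, reusing the nn and data_set objects defined above:

from NaiveNeurals.utils import ConvergenceError

try:
    nn.train(data_set)
except ConvergenceError:
    # the network did not reach the target error rate within the epoch limit
    print('Training did not converge')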

Network setup

There are two categories of network configuration parameters:

  1. Network architecture (number of nodes, weights, activation functions, etc.)

  2. Learning configuration (solver algorithm: GD, GD_MOM, learning rate, target error rate, etc.)

from NaiveNeurals.MLP.network import NeuralNetwork, LearningConfiguration
from NaiveNeurals.MLP.activation_functions import Linear, Tanh

nn = NeuralNetwork()

# with LearningConfiguration one can set multiple parameters for the solver algorithm
learning_configuration = LearningConfiguration(learning_rate=0.01,
                                               target_error=0.003,
                                               solver='GD_MOM',
                                               max_epochs=1000,
                                               solver_params={'alpha': 0.95})

nn.setup_network(input_data_size=1,
                 output_data_size=1,
                 hidden_layer_number_of_nodes=25,
                 hidden_layer_bias=1,
                 output_layer_bias=-0.7,
                 hidden_layer_act_func=Tanh(),
                 output_layer_act_func=Linear())

nn.set_learning_params(learning_configuration)
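
With the learning parameters applied, training works exactly as in the simple example above; a minimal sketch reusing the DataSet constructor, with purely illustrative single-input data (input_data_size=1, output_data_size=1):

from NaiveNeurals.data.dataset import DataSet

# illustrative data: one input node, four training samples (targets roughly y = 2x)
inputs =  [[0.0, 0.25, 0.5, 1.0]]
targets = [[0.0, 0.5, 1.0, 2.0]]

nn.train(DataSet(inputs, targets))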

Batch learning with validation

In most cases it is recommended to split the dataset into a few smaller training subsets plus a validation set. In batch mode the network switches training sets every 50 epochs and checks the error rate against the validation data.

from NaiveNeurals.MLP.network import NeuralNetwork
from NaiveNeurals.utils import ConvergenceError

train_data_set1 = ...
train_data_set2 = ...
train_data_set3 = ...
validation_set = ...

nn = NeuralNetwork()

try:
    nn.train_with_validation([train_data_set1, train_data_set2, train_data_set3], validation_set)
except ConvergenceError:
    pass
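
The train_data_set* and validation_set objects above are placeholders. A minimal sketch of building them with the DataSet constructor from the simple-training example; the data and the column split are purely illustrative:

from NaiveNeurals.data.dataset import DataSet

# illustrative raw data: each inner list holds one input's values across all samples
inputs =  [[0, 0, 1, 1, 0, 1], [1, 0, 1, 0, 0, 1]]
targets = [[1, 0, 0, 1, 0, 0]]

# slice the samples (columns) into training subsets and a validation set
train_data_set1 = DataSet([row[0:2] for row in inputs], [row[0:2] for row in targets])
train_data_set2 = DataSet([row[2:4] for row in inputs], [row[2:4] for row in targets])
validation_set  = DataSet([row[4:6] for row in inputs], [row[4:6] for row in targets])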

Model export/import

Once the model is trained it can be exported to a dict and stored as a JSON file using the export_model method:

from NaiveNeurals.MLP.network import NeuralNetwork
import json

nn = NeuralNetwork()

nn.setup_network(input_data_size=2, output_data_size=1,
                 hidden_layer_number_of_nodes=5)

# training procedure ...

with open('test.json', 'w+') as fil:
    fil.write(json.dumps(nn.export_model()))

The model can be imported using the load_model method:

from NaiveNeurals.MLP.network import NeuralNetwork

model_dict = ... # loaded model - Dict

nnn = NeuralNetwork()
nnn.load_model(model_dict)
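
A minimal round-trip sketch, assuming the model was written to test.json with export_model as shown above:

import json

from NaiveNeurals.MLP.network import NeuralNetwork

# read back the dict produced by export_model
with open('test.json') as fil:
    model_dict = json.load(fil)

nnn = NeuralNetwork()
nnn.load_model(model_dict)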

Further Reading

If this project caught your attention, you can read about the details below:

Implementation details

Classification problem example

Regression problem example

Time series problem example

References

Machine Learning: An Algorithmic Perspective (2nd edition)

Stephen Marsland's homepage

Matt Mazur's blog

Python neural network pt.1

Python neural network pt.2

Activation function in neural networks

Gradient Descent with Momentum

SoftMax explained