A Java Library to implement simple 3 layered Neural Networks
This is a well-documented, easy-to-use Java library for implementing simple 3-layered neural networks. It is still under development, but the backpropagation algorithm is fully functional.
You can also link this to a Processing sketch and draw the entire network to your screen, watching the individual layers, their values, and the outputs.
After training the network, you can save the weights to a file and load them again later. This lets you train the network once and then reuse the trained network multiple times for particular tasks. It also lets you resume the learning process from where it left off.
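The save/load workflow can be sketched roughly as follows. This is a standalone illustration, not this library's actual API; the class name, helper names, and binary file format are all assumptions.

```java
import java.io.*;

public class WeightIO {
    // Save a weight matrix to a binary file (hypothetical format:
    // row count, column count, then the values in row-major order).
    static void saveWeights(double[][] w, String path) throws IOException {
        try (DataOutputStream out = new DataOutputStream(new FileOutputStream(path))) {
            out.writeInt(w.length);
            out.writeInt(w[0].length);
            for (double[] row : w)
                for (double v : row)
                    out.writeDouble(v);
        }
    }

    // Load a weight matrix previously written by saveWeights.
    static double[][] loadWeights(String path) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            int rows = in.readInt(), cols = in.readInt();
            double[][] w = new double[rows][cols];
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++)
                    w[r][c] = in.readDouble();
            return w;
        }
    }

    public static void main(String[] args) throws IOException {
        double[][] w = {{0.1, -0.2}, {0.3, 0.4}};
        saveWeights(w, "weights.bin");
        double[][] loaded = loadWeights("weights.bin");
        // The round trip should reproduce the original weights exactly.
        System.out.println(loaded[1][1] == 0.4);
    }
}
```

Persisting the two weight matrices (w1 and w2, described under Structure below) is all that is needed to restore a trained 3-layer network, since the layer activations themselves are recomputed on every forward pass.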
How To Use
Once you have imported the library into your project, you can create a neural network object using the constructor. You need to initialize it with the number of inputs, the number of neurons in the hidden layer, the number of outputs, and the learning rate.
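As a sketch of what such a constructor might look like, consider the minimal class below. The class and field names here are assumptions for illustration, not the library's exact API.

```java
public class NeuralNetwork {
    int numInputs, numHidden, numOutputs;
    double learningRate;

    // Hypothetical constructor: the sizes of the three layers
    // plus the learning rate used during backpropagation.
    public NeuralNetwork(int numInputs, int numHidden, int numOutputs, double learningRate) {
        this.numInputs = numInputs;
        this.numHidden = numHidden;
        this.numOutputs = numOutputs;
        this.learningRate = learningRate;
    }

    public static void main(String[] args) {
        // e.g. 2 inputs, 4 hidden neurons, 1 output, learning rate 0.1
        NeuralNetwork nn = new NeuralNetwork(2, 4, 1, 0.1);
        System.out.println(nn.numHidden);
    }
}
```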
Simple Use Of Neural Networks
Using the Backpropagation Algorithm
Here, I have used 100 cycles to train the network, but in most cases you need thousands of training cycles to get a working network.
The number of cycles depends entirely on the task you want to perform, and there is no rule of thumb for calculating it. The same goes for choosing the learning rate. Usually, the more cycles you run, the better the network gets, but with a constant learning rate that is too large, the network can start oscillating between states over a large number of iterations instead of converging.
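The oscillation effect is easy to see on a toy problem. The sketch below (a standalone demo, unrelated to this library's API) runs plain gradient descent on f(w) = w², where a small constant learning rate converges while a too-large one flips sign every step with growing amplitude:

```java
public class LearningRateDemo {
    // Run gradient descent on f(w) = w^2, whose gradient is 2w.
    static double descend(double w, double lr, int cycles) {
        for (int c = 0; c < cycles; c++) w -= lr * 2 * w;
        return w;
    }

    public static void main(String[] args) {
        // lr = 0.1: each step multiplies w by (1 - 0.2), so w shrinks toward 0.
        double good = descend(1.0, 0.1, 100);
        // lr = 1.1: each step multiplies w by (1 - 2.2) = -1.2,
        // so w flips sign and grows - the oscillation described above.
        double bad = descend(1.0, 1.1, 100);
        System.out.println(Math.abs(good) < 1e-6);
        System.out.println(Math.abs(bad) > 1.0);
    }
}
```

The same trade-off applies to a full network: more cycles help only as long as the constant learning rate keeps each update small enough that the error keeps decreasing.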
You can find more information on the usage in the documentation.
Structure
The neural net uses 3 arrays to represent the individual 3 layers of the network -

- double[] i - the input layer
- double[] j - the hidden layer
- double[] k - the output layer

Two 2D arrays store the weights between the layers -

- double[][] w1 - stores the weights from layer i to layer j; w1[i][j] represents the weight from input i to hidden neuron j
- double[][] w2 - stores the weights from layer j to layer k; w2[j][k] represents the weight from hidden neuron j to output k
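Given that layout, a forward pass can be sketched as below. This is a standalone illustration assuming sigmoid activations and no bias terms, which may differ from the library's internals; the layer sizes are arbitrary examples.

```java
public class ForwardPass {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    public static void main(String[] args) {
        double[] i = {0.5, 0.9};          // input layer
        double[] j = new double[3];       // hidden layer
        double[] k = new double[1];       // output layer
        double[][] w1 = new double[2][3]; // weights from layer i to layer j
        double[][] w2 = new double[3][1]; // weights from layer j to layer k
        for (double[] row : w1) java.util.Arrays.fill(row, 0.1);
        for (double[] row : w2) java.util.Arrays.fill(row, 0.1);

        // Hidden layer: j[b] = sigmoid( sum over a of i[a] * w1[a][b] )
        for (int b = 0; b < j.length; b++) {
            double sum = 0;
            for (int a = 0; a < i.length; a++) sum += i[a] * w1[a][b];
            j[b] = sigmoid(sum);
        }
        // Output layer: k[c] = sigmoid( sum over b of j[b] * w2[b][c] )
        for (int c = 0; c < k.length; c++) {
            double sum = 0;
            for (int b = 0; b < j.length; b++) sum += j[b] * w2[b][c];
            k[c] = sigmoid(sum);
        }
        System.out.println(k[0] > 0 && k[0] < 1); // sigmoid output lies in (0, 1)
    }
}
```

Note that w1 is indexed [input][hidden] and w2 is indexed [hidden][output], matching the w1[i][j] and w2[j][k] conventions above.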