Backpropagation Example
Backpropagation is a common method for training a neural network. For this tutorial, we’re going to use a neural network with two inputs, two hidden neurons, and two output neurons. Additionally, the hidden and output neurons will include a bias.
Here’s the basic structure with some random initial weights and biases.
The goal of backpropagation is to optimize the weights so that the neural network can learn how to correctly map arbitrary inputs to outputs.
For this tutorial we’re going to work with a single training set: given inputs 0.05 and 0.10, we want the neural network to output 0.01 and 0.99.
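To make the walkthrough easy to follow along with, here’s a minimal Python sketch of the setup. The values of w1, w2, w5, w6, b1, and b2 appear in the calculations below; w3, w4, w7, and w8 aren’t stated explicitly in this text, so the values used here are assumptions chosen to be consistent with the hidden and output activations that follow.

```python
# Single training example: inputs and target outputs
i1, i2 = 0.05, 0.10
target_o1, target_o2 = 0.01, 0.99

# Initial weights and biases. w1, w2, w5, w6, b1, b2 match the
# calculations below; w3, w4, w7, w8 are assumed values consistent
# with the hidden/output activations shown in the text.
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30  # input -> hidden
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55  # hidden -> output
b1, b2 = 0.35, 0.60                      # hidden bias, output bias
```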
The Forward Pass
Here’s how we calculate the total net input for h1:
$net_{h1} = w_1 \times i_1 + w_2 \times i_2 + b_1 \times 1$
$net_{h1} = 0.15 \times 0.05 + 0.2 \times 0.1 + 0.35 \times 1 = 0.3775$
We then squash it using the logistic function to get the output of h1:
$out_{h1} = \frac{1}{1 + e^{-net_{h1}}} = \frac{1}{1 + e^{-0.3775}} = 0.593269992$
Carrying out the same process for h2 we get:
$out_{h2} = 0.596884378$
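As a sanity check, here’s the hidden-layer forward pass in Python (recall from the setup sketch that w3 and w4 are assumed values):

```python
import math

def sigmoid(x):
    """The logistic (squashing) function: 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

# Net input and output for h1
net_h1 = 0.15 * 0.05 + 0.20 * 0.10 + 0.35 * 1  # = 0.3775
out_h1 = sigmoid(net_h1)                        # ~0.593269992

# Same process for h2 (w3 = 0.25, w4 = 0.30 are assumed values)
net_h2 = 0.25 * 0.05 + 0.30 * 0.10 + 0.35 * 1  # = 0.3925
out_h2 = sigmoid(net_h2)                        # ~0.596884378
```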
We repeat this process for the output layer neurons, using the output from the hidden layer neurons as inputs.
Here’s the output for o1:
$net_{o1} = w_5 \times out_{h1} + w_6 \times out_{h2} + b_2 \times 1$
$net_{o1} = 0.4 \times 0.593269992 + 0.45 \times 0.596884378 + 0.6 \times 1 = 1.105905967$
$out_{o1} = \frac{1}{1 + e^{-net_{o1}}} = \frac{1}{1 + e^{-1.105905967}} = 0.75136507$
And carrying out the same process for o2 we get:
$out_{o2} = 0.772928465$
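The output layer is the same two steps in code, a weighted sum followed by the sigmoid (w7 and w8 below are the assumed values from the setup sketch):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

out_h1, out_h2 = 0.593269992, 0.596884378  # hidden outputs from above

net_o1 = 0.40 * out_h1 + 0.45 * out_h2 + 0.60 * 1  # = 1.105905967
out_o1 = sigmoid(net_o1)                            # ~0.75136507

# w7 = 0.50, w8 = 0.55 are assumptions (see the setup sketch)
net_o2 = 0.50 * out_h1 + 0.55 * out_h2 + 0.60 * 1  # = 1.224921404
out_o2 = sigmoid(net_o2)                            # ~0.772928465
```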
Calculating the Total Error
We can now calculate the error for each output neuron using the squared error function and sum them to get the total error:
$E_{total} = \sum \frac{1}{2}(target - output)^2$
For example, the target output for o1 is 0.01 but the neural network output 0.75136507, therefore its error is:
$E_{o1} = \frac{1}{2}(target_{o1} - out_{o1})^2 = \frac{1}{2}(0.01 - 0.75136507)^2 = 0.274811083$
Repeating this process for o2 (remembering that the target is 0.99) we get:
$E_{o2} = \frac{1}{2}(target_{o2} - out_{o2})^2 = \frac{1}{2}(0.99 - 0.772928465)^2 = 0.023560026$
The total error for the neural network is the sum of these errors:
$E_{total} = E_{o1} + E_{o2} = 0.274811083 + 0.023560026 = 0.298371109$
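In code, the error calculation is just as direct, a short sketch using the outputs computed above:

```python
out_o1, out_o2 = 0.75136507, 0.772928465
target_o1, target_o2 = 0.01, 0.99

# Squared error for each output neuron, then the total
E_o1 = 0.5 * (target_o1 - out_o1) ** 2  # ~0.274811083
E_o2 = 0.5 * (target_o2 - out_o2) ** 2  # ~0.023560026
E_total = E_o1 + E_o2                   # ~0.298371109
```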
The Backwards Pass
Hidden Layer
Next, we’ll continue the backwards pass by calculating new values for w1, w2, w3, and w4.
Big picture, here’s what we need to figure out:
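The expression itself appears to be cut off in this text; by the chain rule, the quantity to compute for w1 would be the following standard decomposition:

$\frac{\partial E_{total}}{\partial w_1} = \frac{\partial E_{total}}{\partial out_{h1}} \times \frac{\partial out_{h1}}{\partial net_{h1}} \times \frac{\partial net_{h1}}{\partial w_1}$

Note that $\frac{\partial E_{total}}{\partial out_{h1}} = \frac{\partial E_{o1}}{\partial out_{h1}} + \frac{\partial E_{o2}}{\partial out_{h1}}$, since out_{h1} feeds into both output neurons.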