Backpropagation Example

Backpropagation is a common method for training a neural network. For this tutorial, we’re going to use a neural network with two inputs, two hidden neurons, and two output neurons. Additionally, the hidden and output neurons will each include a bias.

Here’s the basic structure, with some random initial weights and biases: inputs i1 = 0.05 and i2 = 0.10; hidden-layer weights w1 = 0.15, w2 = 0.20, w3 = 0.25, w4 = 0.30 with bias b1 = 0.35; and output-layer weights w5 = 0.40, w6 = 0.45, w7 = 0.50, w8 = 0.55 with bias b2 = 0.60.


The goal of backpropagation is to optimize the weights so that the neural network can learn how to correctly map arbitrary inputs to outputs.

For this tutorial we’re going to work with a single training set: given inputs 0.05 and 0.10, we want the neural network to output 0.01 and 0.99.

The Forward Pass

To begin, let’s see what the neural network currently predicts given the weights and biases above and inputs of 0.05 and 0.10. To do this we’ll feed those inputs forward through the network.

We figure out the total net input to each hidden layer neuron, squash the total net input using an activation function (here we use the logistic function), and then repeat the process with the output layer neurons.

Here’s how we calculate the total net input for h1:

$net_{h1} = w_1 \cdot i_1 + w_2 \cdot i_2 + b_1 \cdot 1$

$net_{h1} = 0.15 \times 0.05 + 0.20 \times 0.10 + 0.35 \times 1 = 0.3775$

We then squash it using the logistic function to get the output of h1:

$out_{h1} = \frac{1}{1+e^{-net_{h1}}} = \frac{1}{1+e^{-0.3775}} = 0.593269992$

Carrying out the same process for h2 we get:

$out_{h2} = 0.596884378$

We repeat this process for the output layer neurons, using the output from the hidden layer neurons as inputs.

Here’s the output for o1:

$net_{o1} = w_5 \cdot out_{h1} + w_6 \cdot out_{h2} + b_2 \cdot 1$

$net_{o1} = 0.40 \times 0.593269992 + 0.45 \times 0.596884378 + 0.60 \times 1 = 1.105905967$

$out_{o1} = \frac{1}{1+e^{-net_{o1}}} = \frac{1}{1+e^{-1.105905967}} = 0.75136507$

And carrying out the same process for o2 we get:

$out_{o2} = 0.772928465$
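
The forward pass above can be checked with a short Python sketch; the variable names are chosen to mirror the notation used here, and numpy is used for the exponential just as in the code example later in this post.

import numpy as np

def sigmoid(x):
    # logistic activation: squashes the net input into (0, 1)
    return 1 / (1 + np.exp(-x))

# inputs and initial weights/biases from the structure above
i1, i2 = 0.05, 0.10
w1, w2, w3, w4, b1 = 0.15, 0.20, 0.25, 0.30, 0.35
w5, w6, w7, w8, b2 = 0.40, 0.45, 0.50, 0.55, 0.60

# hidden layer
net_h1 = w1*i1 + w2*i2 + b1*1
out_h1 = sigmoid(net_h1)   # 0.593269992
net_h2 = w3*i1 + w4*i2 + b1*1
out_h2 = sigmoid(net_h2)   # 0.596884378

# output layer, fed by the hidden-layer outputs
net_o1 = w5*out_h1 + w6*out_h2 + b2*1
out_o1 = sigmoid(net_o1)   # 0.75136507
net_o2 = w7*out_h1 + w8*out_h2 + b2*1
out_o2 = sigmoid(net_o2)   # 0.772928465

print(out_h1, out_h2, out_o1, out_o2)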

Calculating the Total Error

We can now calculate the error for each output neuron using the squared error function and sum them to get the total error:

$E_{total} = \sum \frac{1}{2}(target - output)^2$

For example, the target output for o1 is 0.01 but the neural network output 0.75136507; therefore its error is:

$E_{o1} = \frac{1}{2}(target_{o1} - out_{o1})^2 = \frac{1}{2}(0.01 - 0.75136507)^2 = 0.274811083$

Repeating this process for o2 (remembering that the target is 0.99) we get:

$E_{o2} = \frac{1}{2}(target_{o2} - out_{o2})^2 = \frac{1}{2}(0.99 - 0.772928465)^2 = 0.023560026$

The total error for the neural network is the sum of these errors:

$E_{total} = E_{o1} + E_{o2} = 0.274811083 + 0.023560026 = 0.298371109$
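
These error calculations are easy to verify in a few lines of Python; out_o1 and out_o2 below are the forward-pass outputs computed above.

out_o1, out_o2 = 0.75136507, 0.772928465   # forward-pass outputs from above
target_o1, target_o2 = 0.01, 0.99          # desired outputs

E_o1 = 0.5 * (target_o1 - out_o1)**2   # 0.274811083
E_o2 = 0.5 * (target_o2 - out_o2)**2   # 0.023560026
E_total = E_o1 + E_o2                  # 0.298371109
print(E_o1, E_o2, E_total)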

The Backwards Pass

Our goal with backpropagation is to update each of the weights in the network so that they bring the actual output closer to the target output, thereby minimizing the error for each output neuron and for the network as a whole.

Output Layer

Consider w5. We want to know how much a change in w5 affects the total error, i.e. $\frac{\partial E_{total}}{\partial w_5}$.

Applying the chain rule:

$\frac{\partial E_{total}}{\partial w_5} = \frac{\partial E_{total}}{\partial out_{o1}} \times \frac{\partial out_{o1}}{\partial net_{o1}} \times \frac{\partial net_{o1}}{\partial w_5}$

From the definitions above, $\frac{\partial E_{total}}{\partial out_{o1}} = -(target_{o1} - out_{o1})$, $\frac{\partial out_{o1}}{\partial net_{o1}} = out_{o1}(1 - out_{o1})$ (the derivative of the logistic function), and $\frac{\partial net_{o1}}{\partial w_5} = out_{h1}$.
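As a quick numerical check of this chain rule, here is a minimal sketch that computes the gradient for w5 and the updated weight; the learning rate of 0.5 is an assumption, matching the alp used in the code example below.

# values carried over from the forward pass above
out_h1, out_o1 = 0.593269992, 0.75136507
target_o1 = 0.01
w5 = 0.40
eta = 0.5   # assumed learning rate (same as alp in the example below)

dEtotal_douto1 = -(target_o1 - out_o1)   # 0.74136507
douto1_dneto1 = out_o1 * (1 - out_o1)    # 0.186815602
dneto1_dw5 = out_h1                      # 0.593269992

dEtotal_dw5 = dEtotal_douto1 * douto1_dneto1 * dneto1_dw5   # 0.082167041
w5_new = w5 - eta * dEtotal_dw5                             # 0.35891648
print(dEtotal_dw5, w5_new)
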
Hidden Layer

Next, we’ll continue the backwards pass by calculating new values for w1, w2, w3, and w4.

Big picture, here’s what we need to figure out (shown for w1; the other hidden-layer weights follow the same pattern):

$\frac{\partial E_{total}}{\partial w_1} = \frac{\partial E_{total}}{\partial out_{h1}} \times \frac{\partial out_{h1}}{\partial net_{h1}} \times \frac{\partial net_{h1}}{\partial w_1}$

Because $out_{h1}$ feeds into both o1 and o2, the first factor sums the contribution from each output neuron:

$\frac{\partial E_{total}}{\partial out_{h1}} = \frac{\partial E_{o1}}{\partial out_{h1}} + \frac{\partial E_{o2}}{\partial out_{h1}}$
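Here is the corresponding sketch for w1, using the same forward-pass values; note how the error signal reaching out_h1 sums the contributions through both output neurons (the learning rate of 0.5 is again an assumption).

# values carried over from the forward pass above
i1 = 0.05
out_h1, out_o1, out_o2 = 0.593269992, 0.75136507, 0.772928465
target_o1, target_o2 = 0.01, 0.99
w1, w5, w7 = 0.15, 0.40, 0.50
eta = 0.5   # assumed learning rate

# deltas at the output neurons
delta_o1 = -(target_o1 - out_o1) * out_o1 * (1 - out_o1)   #  0.138498562
delta_o2 = -(target_o2 - out_o2) * out_o2 * (1 - out_o2)   # -0.038098

# error signal reaching out_h1 through both output neurons
dEtotal_douth1 = delta_o1 * w5 + delta_o2 * w7   # 0.036350306

douth1_dneth1 = out_h1 * (1 - out_h1)   # 0.241300709
dneth1_dw1 = i1

dEtotal_dw1 = dEtotal_douth1 * douth1_dneth1 * dneth1_dw1   # 0.000438568
w1_new = w1 - eta * dEtotal_dw1                             # 0.149780716
print(dEtotal_dw1, w1_new)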

Example: a complete Python implementation, this time for a network with two inputs, two hidden neurons, and a single output neuron:


import numpy as np

# training example: inputs x1, x2 with target output T
x1 = 0.6
x2 = 0.8
T = 0.9

# initial weights: v's connect the inputs to the hidden neurons,
# w's connect the hidden neurons to the single output neuron
v11 = 0.2
v12 = 0.1
v21 = 0.1
v22 = 0.3
w11 = 0.5
w21 = 0.2

alp = 0.5     # learning rate
epochs = 1    # one round of updates is shown below; try 1000 to watch the error shrink

for i in range(epochs):
    # forward pass
    neth1 = x1*v11 + x2*v21
    outh1 = 1/(1 + np.exp(-neth1))
    neth2 = x1*v12 + x2*v22
    outh2 = 1/(1 + np.exp(-neth2))
    neto1 = outh1*w11 + outh2*w21
    outo1 = 1/(1 + np.exp(-neto1))

    O = outo1
    E = T - O
    print("Output=", O, "Error=", E)

    # delta at the output neuron and the hidden-to-output weight updates
    delta_out = E*outo1*(1 - outo1)
    delta_w11 = delta_out*outh1
    delta_w21 = delta_out*outh2

    # backpropagate the output delta to each hidden neuron,
    # then form the input-to-hidden weight updates
    delta_h1 = delta_out*w11*outh1*(1 - outh1)
    delta_v11 = delta_h1*x1
    delta_v21 = delta_h1*x2

    delta_h2 = delta_out*w21*outh2*(1 - outh2)
    delta_v12 = delta_h2*x1
    delta_v22 = delta_h2*x2

    # apply the updates
    w11 = w11 + alp*delta_w11
    w21 = w21 + alp*delta_w21

    v11 = v11 + alp*delta_v11
    v12 = v12 + alp*delta_v12
    v21 = v21 + alp*delta_v21
    v22 = v22 + alp*delta_v22

print("V11=", v11, "V12=", v12, "V21=", v21, "V22=", v22, "W11=", w11, "W21=", w21)

Output= 0.5962358769907005 Error= 0.30376412300929956
V11= 0.2027150501278511 V12= 0.10107260142323136 V21= 0.10362006683713479
V22= 0.3014301352309751 W11= 0.5201040661418385 W21= 0.221003849178323

Note: The parameter update after one round is shown above. You can run this for 1000 epochs to see the error reduce towards 0, and the final parameters can then be selected.
