python 2.7 - Could someone explain this neural network machine learning code?


import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return (x * (1 - x))
    return 1 / (1 + np.exp(-x))

x = np.array([[1, 1, 1],
              [3, 3, 3],
              [2, 2, 2],
              [2, 2, 2]])

y = np.array([[1],
              [1],
              [0],
              [1]])

np.random.seed(1)

syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1

for j in xrange(100000):
    l0 = x
    l1 = nonlin(np.dot(l0, syn0))
    l2 = nonlin(np.dot(l1, syn1))
    l2_error = y - l2
    if (j % 10000) == 0:
        print "error: " + str(np.mean(np.abs(l2_error)))
    l2_delta = l2_error * nonlin(l2, deriv=True)
    l1_error = l2_delta.dot(syn1.T)
    l1_delta = l1_error * nonlin(l1, deriv=True)
    syn1 += l1.T.dot(l2_delta)
    syn0 += l0.T.dot(l1_delta)

print "output after training"
print l2

Could someone explain what this is printing out, and why it is important? The code makes no sense to me. The for loop is supposed to optimize the neurons in the network given the dataset. How does it do this? What is the nonlin function for? And what does this do?:

syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1

This simple code trains a neural network:

This is the activation of a neuron. The function returns either the activation function itself or its derivative (it is the sigmoid function).

def nonlin(x, deriv=False):
    if deriv:
        return (x * (1 - x))
    return 1 / (1 + np.exp(-x))
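One subtlety worth noting: the deriv=True branch assumes x already holds a sigmoid *output*, because for s = sigmoid(z) the derivative with respect to z is s * (1 - s). A minimal check of that shortcut (a sketch in Python 3 syntax):

```python
import numpy as np

def nonlin(x, deriv=False):
    """Sigmoid; with deriv=True, x is assumed to already be a sigmoid output."""
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

z = 0.5
s = nonlin(z)                     # forward value: sigmoid(z)
analytic = nonlin(s, deriv=True)  # the shortcut: s * (1 - s)

# Compare against a central-difference numerical derivative of sigmoid at z
h = 1e-6
numeric = (nonlin(z + h) - nonlin(z - h)) / (2 * h)
print(abs(analytic - numeric) < 1e-6)  # the two agree
```

This is why the training loop below can call nonlin(l2, deriv=True) on the layer activations directly, without recomputing the sigmoid.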

In this picture you can see a neuron; in the picture the activation function appears as a step function:

https://blog.dbrgn.ch/images/2013/3/26/perceptron.png

This is the training data: 4 samples with 3 features each.

x = np.array([[1, 1, 1],
              [3, 3, 3],
              [2, 2, 2],
              [2, 2, 2]])

These are the labels of each sample (it is a binary classification task):

y = np.array([[1],
              [1],
              [0],
              [1]])

This code initializes the weights of the neural network (it is a neural network with 2 layers of weights):

np.random.seed(1)

syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1
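For reference, np.random.random draws uniformly from [0, 1), so the 2 * ... - 1 transform maps the initial weights into [-1, 1). A quick sanity check of the shapes and range (a sketch in Python 3 syntax):

```python
import numpy as np

np.random.seed(1)
# np.random.random gives uniform values in [0, 1);
# scaling by 2 and shifting by -1 maps them into [-1, 1).
syn0 = 2 * np.random.random((3, 4)) - 1  # input -> hidden weights
syn1 = 2 * np.random.random((4, 1)) - 1  # hidden -> output weights

print(syn0.shape, syn1.shape)            # (3, 4) and (4, 1)
print(syn0.min() >= -1 and syn0.max() < 1)
```

Centering the random weights around zero (rather than using [0, 1)) avoids starting with an all-positive bias in the initial activations.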

In the picture, the weights are the Vnm and Wkn matrices (the value of every link).

https://www.google.es/search?q=multilayer+perceptron&espv=2&biw=1920&bih=964&source=lnms&tbm=isch&sa=x&ved=0ahukewjtjp6jr_jpahuhorokhbiacdsq_auibigb#imgrc=5npsallen2cfqm%3a

In this case we have 3 input neurons, 4 hidden neurons and 1 output neuron. The value of every link is stored in syn0 and syn1.
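A quick way to see this 3-4-1 architecture is to run a single forward pass and look at the shapes (a sketch in Python 3 syntax):

```python
import numpy as np

def nonlin(x):
    # sigmoid activation
    return 1 / (1 + np.exp(-x))

x = np.array([[1, 1, 1],
              [3, 3, 3],
              [2, 2, 2],
              [2, 2, 2]])

np.random.seed(1)
syn0 = 2 * np.random.random((3, 4)) - 1  # 3 inputs -> 4 hidden
syn1 = 2 * np.random.random((4, 1)) - 1  # 4 hidden -> 1 output

l1 = nonlin(np.dot(x, syn0))   # hidden activations: (4 samples, 4 hidden)
l2 = nonlin(np.dot(l1, syn1))  # output activations: (4 samples, 1 output)
print(l1.shape, l2.shape)      # (4, 4) and (4, 1)
```

All 4 samples are pushed through the network at once as a matrix, which is why the loop below never iterates over individual rows of x.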

This code trains the neural network: it passes in all the data, evaluates the error, and updates the weights using backpropagation:

for j in xrange(100000):
    l0 = x
    l1 = nonlin(np.dot(l0, syn0))
    l2 = nonlin(np.dot(l1, syn1))
    l2_error = y - l2
    if (j % 10000) == 0:
        print "error: " + str(np.mean(np.abs(l2_error)))
    l2_delta = l2_error * nonlin(l2, deriv=True)
    l1_error = l2_delta.dot(syn1.T)
    l1_delta = l1_error * nonlin(l1, deriv=True)
    syn1 += l1.T.dot(l2_delta)
    syn0 += l0.T.dot(l1_delta)
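Here is a runnable Python 3 port of that loop (with fewer iterations) that records the error at every step, so you can see what the periodic print is reporting: the mean absolute error shrinking over time. Note that the last two samples in x are identical inputs with different labels, so the error can decrease but never reach zero:

```python
import numpy as np

def nonlin(x, deriv=False):
    # Sigmoid; for deriv=True, x must already be a sigmoid output
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

x = np.array([[1, 1, 1], [3, 3, 3], [2, 2, 2], [2, 2, 2]])
y = np.array([[1], [1], [0], [1]])

np.random.seed(1)
syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1

errors = []
for j in range(10000):
    l1 = nonlin(np.dot(x, syn0))    # hidden layer activations
    l2 = nonlin(np.dot(l1, syn1))   # output layer activations
    l2_error = y - l2               # how far off each prediction is
    errors.append(np.mean(np.abs(l2_error)))
    l2_delta = l2_error * nonlin(l2, deriv=True)  # error scaled by slope
    l1_error = l2_delta.dot(syn1.T)               # backpropagate to hidden
    l1_delta = l1_error * nonlin(l1, deriv=True)
    syn1 += l1.T.dot(l2_delta)      # gradient-style weight updates
    syn0 += x.T.dot(l1_delta)

print(round(errors[0], 3), round(errors[-1], 3))  # error shrinks
```

The key idea is that each delta is the layer's error multiplied by the sigmoid's slope at that activation, so confident (saturated) neurons are adjusted less than uncertain ones.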
