Hello everyone. I need help debugging my neural network. It's a simple system by design, currently set up to approximate the XOR function. I started conceptualizing it about a week ago because I was going to use it for my creature simulator, but I moved to my own design there after having trouble understanding how to turn the output into useful data. The model I ended up with for that project is more akin to an HTM neuron model than a standard neural network like this one.

However, the idea that came to me last night, using a neural network to approximate the functionality of an 8-bit computer, not only sounds well within reach, it's awesome and must be done. Plus I will have no issues turning my input and output data into something useful.

The plan is to get this working well, train a bunch of these little systems to approximate AND, OR, NOT, NAND, XOR, and so on, then string them all together to create what should amount to a working 8-bit computer.
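To illustrate the "string them together" idea, here is a small Python sketch (my illustration, not part of NORMAN). It assumes each trained gate network gets wrapped as a function from input bits to an output bit; exact Boolean stand-ins take the place of the trained approximations, and the `full_adder` / `add_8bit` names are mine:

```python
# Stand-ins for trained gate networks: each is a function bits -> bit.
def XOR(a, b): return a ^ b
def AND(a, b): return a & b
def OR(a, b):  return a | b

def full_adder(a, b, carry_in):
    """One bit of an adder, built purely by wiring gates together."""
    s1 = XOR(a, b)
    total = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return total, carry_out

def add_8bit(x, y):
    """Ripple-carry addition of two 8-bit numbers using only the gates."""
    carry, result = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result & 0xFF

print(add_8bit(100, 55))   # 155
```

Once each gate is a drop-in function like this, the wiring layer never needs to know whether a real gate or a trained network is underneath.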

If Google DeepMind can look at a picture and tell me it's a cat sitting on a shelf, like in the video I saw earlier, or write pretty nice song lyrics, then it should have very little trouble being shown how a bunch of computers work. Basically, show it how to be an emulator and let it turn itself into a universal emulator. Like I stated last night about this topic, the results of artificial neurons emulating a CPU are far-reaching, and until I at least finish this simple little project we will never know how well they work at emulating anything.

This is where you guys come in.

Meet NORMAN. He is a Network Of Recursively Modeled Neurons.

I "finished" NORMAN last night and all seemed to be going well: the trainer seemed to work and the error rate was going down. However, the network can only approximate the last function fed into it, which is for the most part worthless.

When it asks for your input, enter a 1 or a 0. It is running a 3-input XOR problem. The training data can be found in the TRAINER function and should be self-explanatory to tinker with; just give it a value to pick which data set to use.
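For reference, 3-input XOR here is the odd-parity function; if I'm reading the TRAINER data right, every target matches `(a + b + c) mod 2`. A quick Python one-off prints the full truth table:

```python
# 3-input XOR = odd parity: the output is 1 when an odd number of inputs are 1.
table = {(a, b, c): (a + b + c) % 2
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}
for inputs, target in sorted(table.items()):
    print(*inputs, "->", target)
```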

Place the VISUALIZE_NEURAL_NETWORK() command anywhere you want in the system if you wish to view the current state and see how the network is shaping up.

Any questions, feel free to ask. I can't wait to make this larger, but until I get the basics to work there is no point. It can run almost 1 MILLION passes forward and backward through the system in around 2 seconds, which on my utterly horrible old computer amazed the hell out of me.

Rem Project: Norman
Rem Created: Tuesday, October 17, 2017
Rem ***** Main Source File *****
#constant num_I =3
#constant num_H =4
#constant num_O =1
#constant Learning_Constant = .001
Global ERROR#
//--------------------------------------------------------------------+
// Weights and Neurons Allocated ]
// ------------3 X 4 --------------------------------------------+
Dim Weights1#(num_I,num_H) as float //INPUT - HIDDEN LAYER Weights ]
// ------------4 X 1 --------------------------------------------+
Dim Weights2#(num_H,num_O) as float //HIDDEN LAYER - OUTPUT Weights ]
//--------------------------------------------------------------------+
Dim I_NEURONS#(num_i) as float // Input Layer |
Dim H_NEURONS#(num_H) as float // Hidden Layer |
Dim O_NEURONS#(num_O) as float // Output Layer |
Dim T_NEURONS#(num_O) as float // Trainer Outputs- Test Data |
//==== Initialize=====================================================#
NEW_NEURAL_NETWORK(num_I,num_H,num_O)
print "Training...";
sync
for t = 1 to 9
for cycle = 1 to 100000
TRAINER(t)
FEED_FORWARD()
BACK_PROPAGATE()
next
print ".";
// Visualize_Neural_network()
sync
next
print ".";
sync
// for cycle = 1 to 26000
//     for t = 1 to 9
//         TRAINER(t)
//         FEED_FORWARD()
//         BACK_PROPAGATE()
//     next
// next
// If left in, this version of the trainer raises my error level to slightly below 50%, which is terrible.
// It's possibly due to my system being messed up, or because cycling through never allows a single
// neuron to shift its weights enough, but I feel this training method should in theory be better,
// at least for fine-tuning the network.
FEED_FORWARD()
VISUALIZE_NEURAL_NETWORK()
do
input "Enter Binary input 1:",in
input "Enter Binary input 2:",in1
input "Enter Binary input 3:",in2
I_NEURONS#(1) = in
I_NEURONS#(2) = in1
I_NEURONS#(3) = in2
FEED_FORWARD()
VISUALIZE_NEURAL_NETWORK()
loop
//======= NEW_NEURAL_NETWORK(NI, NH, NO) <-- layer sizes; seeds every weight with a random float in [0,1]
function NEW_NEURAL_NETWORK(NI, NH, NO)
for y = 1 to NI
for x = 1 to NH
Weights1#(y,x) = .01 * rnd(100)
next;next
for y = 1 to NH
for x = 1 to NO
Weights2#(y,x) = .01 * rnd(100)
next;next
endfunction
//====== FEED FORWARD()
function FEED_FORWARD()
for W = 1 to num_H
H_NEURONS#(W) = (I_NEURONS#(1) * WEIGHTS1#(1,W)) + (I_NEURONS#(2) * WEIGHTS1#(2,W)) + (I_NEURONS#(3) * WEIGHTS1#(3,W))
H_NEURONS#(W) = sigmoid#(H_NEURONS#(W))
next
O_NEURONS#(1) = (H_NEURONS#(1) * WEIGHTS2#(1,1)) + (H_NEURONS#(2) * WEIGHTS2#(2,1)) + (H_NEURONS#(3) * WEIGHTS2#(3,1)) + (H_NEURONS#(4) * WEIGHTS2#(4,1))
O_NEURONS#(1) = sigmoid#(O_NEURONS#(1)) // ACTIVATION FUNCTION
ENDFUNCTION
//====== BACK_PROPAGATE()
function BACK_PROPAGATE()
Expected# = T_NEURONS#(1)
Results# = O_NEURONS#(1)
//Don't know which error function, if either, is correct:
//sq# = (Expected# - Results#)
//ERROR# = ((sq# * sq#)/2) * Learning_Constant
ERROR# = (Expected# - Results#) * Learning_Constant
//===============================================================
//============================================ Not sure if I am using the error correctly. Matter of fact, I'm pretty sure I am not.
for y = 1 to num_H
for x = 1 to num_O
Weights2#(Y,X) = (Weights2#(Y,X)) + (ERROR#)
next
next
for y = 1 to num_I
for x = 1 to num_H
Weights1#(y,x) = (Weights1#(Y,X)) + (ERROR#)
next
next
endfunction
//===== VISUALIZE NEURAL NETWORK
function VISUALIZE_NEURAL_NETWORK()
cls
SET CURSOR 0,0
Print "Input 1: [";I_NEURONS#(1);"] :W1[";WEIGHTS1#(1,1);"] :W2[";WEIGHTS1#(1,2);"] :W3[";WEIGHTS1#(1,3);"] :W4[";WEIGHTS1#(1,4);"]"
Print "Input 2: [";I_NEURONS#(2);"] :W1[";WEIGHTS1#(2,1);"] :W2[";WEIGHTS1#(2,2);"] :W3[";WEIGHTS1#(2,3);"] :W4[";WEIGHTS1#(2,4);"]"
Print "Input 3: [";I_NEURONS#(3);"] :W1[";WEIGHTS1#(3,1);"] :W2[";WEIGHTS1#(3,2);"] :W3[";WEIGHTS1#(3,3);"] :W4[";WEIGHTS1#(3,4);"]"
print " "
Print "Hidden:   [";H_NEURONS#(1);"] ["; H_NEURONS#(2);"] [";H_NEURONS#(3);"] [";H_NEURONS#(4);"]"
print " "
Print "Weights [W1: ";WEIGHTS2#(1,1);"] [W2: ";WEIGHTS2#(2,1);"] [W3: ";WEIGHTS2#(3,1);"] [W4: ";WEIGHTS2#(4,1);"]"
print " "
Print "Outputs: [";O_NEURONS#(1);"] ";"Error: [";Error#/LEARNING_CONSTANT;"]"
print " "
print
endfunction
//===== TRAINER (SET) <-- SET CHOOSES WHICH INPUT DATA TO TRAIN YOUR NETWORK WITH
function TRAINER(SET)
select SET
case 1:
I_NEURONS#(1) = 0
I_NEURONS#(2) = 0
I_NEURONS#(3) = 0
T_NEURONS#(1) = 0
endcase
case 2:
I_NEURONS#(1) = 0
I_NEURONS#(2) = 0
I_NEURONS#(3) = 1
T_NEURONS#(1) = 1
endcase
case 3:
I_NEURONS#(1) = 0
I_NEURONS#(2) = 1
I_NEURONS#(3) = 0
T_NEURONS#(1) = 1
endcase
case 4:
I_NEURONS#(1) = 0
I_NEURONS#(2) = 1
I_NEURONS#(3) = 1
T_NEURONS#(1) = 0
endcase
case 5:
I_NEURONS#(1) = 1
I_NEURONS#(2) = 0
I_NEURONS#(3) = 0
T_NEURONS#(1) = 1
endcase
case 6:
I_NEURONS#(1) = 1
I_NEURONS#(2) = 0
I_NEURONS#(3) = 1
T_NEURONS#(1) = 0
endcase
case 7:
I_NEURONS#(1) = 1
I_NEURONS#(2) = 1
I_NEURONS#(3) = 0
T_NEURONS#(1) = 0
endcase
case 8: // NOTE: identical to case 7, so the (1,1,0) pattern is trained twice per pass
I_NEURONS#(1) = 1
I_NEURONS#(2) = 1
I_NEURONS#(3) = 0
T_NEURONS#(1) = 0
endcase
case 9:
I_NEURONS#(1) = 1
I_NEURONS#(2) = 1
I_NEURONS#(3) = 1
T_NEURONS#(1) = 1
endcase
endselect
endfunction
//====== SIGMOID (float X)
function SIGMOID#(X# as float)
results#=1.0/(1.0+exp(-x#))
endfunction results#
// Possible code to delete or use at a later date
//============= LINE 60 IN FEED FORWARD, VERY BOTTOM
// H_NEURONS#(1) = (I_NEURONS#(1) * WEIGHTS1#(1,1)) + (I_NEURONS#(2) * WEIGHTS1#(2,1)) + (I_NEURONS#(3) * WEIGHTS1#(3,1))
// H_NEURONS#(2) = (I_NEURONS#(1) * WEIGHTS1#(1,2)) + (I_NEURONS#(2) * WEIGHTS1#(2,2)) + (I_NEURONS#(3) * WEIGHTS1#(3,2))
// H_NEURONS#(3) = (I_NEURONS#(1) * WEIGHTS1#(1,3)) + (I_NEURONS#(2) * WEIGHTS1#(2,3)) + (I_NEURONS#(3) * WEIGHTS1#(3,3))
// H_NEURONS#(4) = (I_NEURONS#(1) * WEIGHTS1#(1,4)) + (I_NEURONS#(2) * WEIGHTS1#(2,4)) + (I_NEURONS#(3) * WEIGHTS1#(3,4))
//============ LINE 62 IN FEED FORWARD, TO ASSIGN VALUES TO SECOND SET OF WEIGHTS
// for W2 = 1 to num_O
//     O_NEURONS#(W2) = (H_NEURONS#(1) * WEIGHTS2#(1,W2)) + (H_NEURONS#(2) * WEIGHTS2#(2,W2)) + (H_NEURONS#(3) * WEIGHTS2#(3,W2)) + (H_NEURONS#(4) * WEIGHTS2#(4,W2))
// next

I am not sure why, but I am almost 100% certain it's because I am not correctly calculating ERROR#, or perhaps not using it properly. I have been studying and trying to get it working since around 10:30 this morning, and the way the mathematics is usually written confuses the hell out of me, since I am not good at calculus and never had any formal training in it. I believe I may understand it, but I would like input from anyone who can help, because I'm getting burnt out and have started randomly changing things for reasons even I'm not sure of at this point. I almost broke it a little while ago, so I think it's best to leave it here for the night and pray someone has information that can help.
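For comparison, here is a minimal Python sketch of the textbook gradient-descent (delta-rule) update for the same 3-4-1 sigmoid network. All names are mine, and the bias terms are my addition (NORMAN has none). The key differences from BACK_PROPAGATE above: the raw error is multiplied by the sigmoid derivative out*(1-out), and each weight gets its own update scaled by the activation feeding it, instead of every weight receiving the same ERROR#:

```python
import math
import random

random.seed(1)
NUM_I, NUM_H = 3, 4          # same 3-4-1 shape as NORMAN
LR = 0.5                     # learning rate, much larger than .001

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# weights1[i][h]: input -> hidden, weights2[h]: hidden -> output
weights1 = [[random.uniform(-1, 1) for _ in range(NUM_H)] for _ in range(NUM_I)]
bias_h   = [random.uniform(-1, 1) for _ in range(NUM_H)]
weights2 = [random.uniform(-1, 1) for _ in range(NUM_H)]
bias_o   = random.uniform(-1, 1)

def feed_forward(inp):
    # Every term is input * weight, then squashed through the sigmoid.
    hidden = [sigmoid(sum(inp[i] * weights1[i][h] for i in range(NUM_I)) + bias_h[h])
              for h in range(NUM_H)]
    out = sigmoid(sum(hidden[h] * weights2[h] for h in range(NUM_H)) + bias_o)
    return hidden, out

def back_propagate(inp, hidden, out, expected):
    global bias_o
    # Delta rule: raw error scaled by the sigmoid derivative out*(1-out).
    delta_o = (expected - out) * out * (1.0 - out)
    for h in range(NUM_H):
        # Each hidden unit gets its own delta, passed back through the weight
        # connecting it to the output, times its own sigmoid derivative.
        delta_h = delta_o * weights2[h] * hidden[h] * (1.0 - hidden[h])
        weights2[h] += LR * delta_o * hidden[h]   # scaled by the feeding activation
        bias_h[h]   += LR * delta_h
        for i in range(NUM_I):
            weights1[i][h] += LR * delta_h * inp[i]
    bias_o += LR * delta_o

# 3-input XOR (parity). Interleaving all 8 patterns on every pass avoids
# the "only remembers the last function" problem.
data = [([a, b, c], (a + b + c) % 2)
        for a in (0, 1) for b in (0, 1) for c in (0, 1)]

def total_error():
    return sum((t - feed_forward(i)[1]) ** 2 for i, t in data)

err_before = total_error()
for _ in range(20000):
    for inp, target in data:
        h, o = feed_forward(inp)
        back_propagate(inp, h, o, target)
err_after = total_error()
print("total squared error:", err_before, "->", err_after)
```

This is only a sketch of the standard method under those assumptions; translating the per-weight updates and the sigmoid-derivative scaling back into the DarkBASIC BACK_PROPAGATE is the part that should address the uniform-ERROR# update.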

Thanks in advance,

~Sedit