Merge branch 'legacy-nn-2' into legacy
generic-github-user committed Sep 1, 2022
2 parents 13badf1 + 6c72e0b commit d16c8b1
Showing 21 changed files with 7,128 additions and 0 deletions.
2 changes: 2 additions & 0 deletions legacy/hurricane/.gitattributes
@@ -0,0 +1,2 @@
# Auto detect text files and perform LF normalization
* text=auto
12 changes: 12 additions & 0 deletions legacy/hurricane/Documentation.txt
@@ -0,0 +1,12 @@
To use Hurricane:


Place the "Hurricane.js" file in the same folder/directory as your HTML file.
Then place this tag in the "<body></body>" section of your HTML: <script src="Hurricane.js"></script>. This loads the JavaScript file that contains Hurricane. See the "Example.html" file for an example usage of Hurricane.


There are two functions that can be used with Hurricane: train() and run().

train() accepts 5 arguments: inputCount, hiddenNodesCount, stepSize, accuracy, and trainingData. trainingData is a flat array of integers or floating-point numbers; inputCount and hiddenNodesCount are integers, and stepSize and accuracy may be floats. trainingData must contain a whole number of sets, each consisting of inputCount input values followed by one target value. With blank parameters the call looks like train(inputCount,hiddenNodesCount,stepSize,accuracy,trainingData). With values filled in, it looks like this: train(9,5,1000,0.1,[1,2,3,4,5,6,7,8,9,10,34,35,36,37,38,39,40,41,42,43,20,19,18,17,16,15,14,13,12,11]);. This function must be called before run(). It trains the neural network on the given training data; it takes much longer than run(), but it only needs to be called once.

run() accepts one argument: inputValues, an array of integers or floating-point numbers. With blank parameters the call looks like run(inputValues); with values filled in, it looks like this: run([20,21,22,23,24,25,26,27,28]);. It can be called whenever the neural network needs to be re-evaluated with new inputs, and it runs much faster than train().
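The flat trainingData layout is easy to get wrong, so here is a minimal sketch of how Hurricane slices it internally. The splitIntoSets helper is illustrative only, not part of Hurricane's API:

```javascript
// Each training set is inputCount input values followed by 1 target value,
// stored back-to-back in one flat array.
function splitIntoSets(trainingData, inputCount) {
    var setLength = inputCount + 1; // inputs + target
    var sets = [];
    for (var i = 0; i < trainingData.length; i += setLength) {
        sets.push({
            inputs: trainingData.slice(i, i + setLength - 1),
            target: trainingData[i + setLength - 1]
        });
    }
    return sets;
}

// Two sets from the 9-input example: 9 inputs plus 1 target each.
var data = [1,2,3,4,5,6,7,8,9,10, 34,35,36,37,38,39,40,41,42,43];
var sets = splitIntoSets(data, 9);
// sets[0].inputs is [1..9] and sets[0].target is 10
```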
21 changes: 21 additions & 0 deletions legacy/hurricane/Example.html
@@ -0,0 +1,21 @@
<html>
<head>
<title>
Example Usage of Hurricane
</title>
</head>

<body>

<script src="Hurricane.js"></script> <!--Import Hurricane.js-->
<script>

train(9,5,1,0.1,[1,2,3,4,5,6,7,8,9,10,34,35,36,37,38,39,40,41,42,43,14,15,16,17,18,19,20,21,22,23,101,102,103,104,105,106,107,108,109,110,10,9,8,7,6,5,4,3,2,1,46,47,48,49,50,51,52,53,54,55,-10,-9,-8,-7,-6,-5,-4,-3,-2,-1]); //Run training function for neural network (this must be done before the neural network can be run)

var result = run([20,21,22,23,24,25,26,27,28]); //Run the neural network, log the data, and capture the returned output
document.write(Math.round(result));

</script>

</body>
</html>
223 changes: 223 additions & 0 deletions legacy/hurricane/Hurricane.js
@@ -0,0 +1,223 @@

//Hurricane, a simple feedforward neural network in JavaScript.



{ //Defines variables

var inputCount_ = 20;
var hiddenNodesCount_ = 10;
var stepSize_ = 10000;
var trainingData_ = [];

var trainingDataSetLength = inputCount_ + 1; //Length of a set of training data
var input = []; //Creates array to store values for the input layer
for(var u=0;u<inputCount_;u++){
input.push(0);
}
var synapse1 = []; //Creates array to store values for the first layer of synapses
for(var k=0;k<input.length*hiddenNodesCount_;k++){ //Adds a synapse for each input-to-hidden connection
synapse1.push(0);
}
var hidden = []; //Creates array to store values for hidden layer
for(var j=0;j<hiddenNodesCount_;j++){
hidden.push(0);
}
var synapse2 = []; //Creates array to store values for the second layer of synapses
for(var d=0;d<hiddenNodesCount_;d++){
synapse2.push(0);
}
var output = [0]; //Creates array to store output
var randSynapse = Math.floor(Math.random()*synapse1.length); //Picks a random synapse from synapse layer 1 to mutate
var trainingDataSet = Math.floor(Math.random()*(trainingData_.length/trainingDataSetLength)); //Picks a random set of training data to use for synapse layer 1
var oldDifference = 0; //Error of previous mutation
var cost = 0; //Error of current mutation
var randMutation = (Math.random()/10000)*stepSize_ - (Math.random()/10000)*stepSize_;
var done = 0;
var outputChange = output[0];

}

function runNeuralNet(){

//Forward pass: weighted sum from input layer to hidden layer, then hidden layer to output
for(var d=0;d<hidden.length;d++){
hidden[d] = 0;
}
for(var j=0;j<input.length;j++){
for(var w=0;w<hidden.length;w++){
hidden[w] = hidden[w] + input[j] * synapse1[j*hiddenNodesCount_+w];
}
}
output[0] = 0;
for(var e=0;e<hidden.length;e++){
output[0] = output[0] + (hidden[e] * synapse2[e]);
}

}

function logData(){ //Logs neural network data for reference

console.log("Hurricane:"); //Prints a header line
console.log("");
console.log("");

console.log(input); //Logs input
console.log(synapse1); //Logs synapse layer 1
console.log(hidden); //Logs hidden layer
console.log(synapse2); //Logs synapse layer 2
console.log(output); //Logs output

console.log("");
console.log("");

console.log("Output = " + output) //Logs output

}



function train(inputCount,hiddenNodesCount,stepSize,accuracy,trainingData){ //Trains neural network

inputCount_ = inputCount;
hiddenNodesCount_ = hiddenNodesCount;
stepSize_ = stepSize;
trainingData_ = trainingData;

trainingDataSetLength = inputCount_ + 1; //Length of a set of training data
input = []; //Creates array to store values for the input layer
for(var u=0;u<inputCount_;u++){
input.push(0);
}
synapse1 = []; //Creates array to store values for the first layer of synapses
for(var k=0;k<input.length*hiddenNodesCount_;k++){ //Adds a synapse for each input-to-hidden connection
synapse1.push(0);
}
hidden = []; //Creates array to store values for hidden layer
for(var j=0;j<hiddenNodesCount_;j++){
hidden.push(0);
}
synapse2 = []; //Creates array to store values for the second layer of synapses
for(var d=0;d<hiddenNodesCount_;d++){
synapse2.push(0);
}
output = [0]; //Creates array to store output
randSynapse = Math.floor(Math.random()*synapse1.length); //Picks a random synapse from synapse layer 1 to mutate
trainingDataSet = Math.floor(Math.random()*(trainingData_.length/trainingDataSetLength)); //Picks a random set of training data to use for synapse layer 1
oldDifference = 0; //Error of previous mutation
cost = 0; //Error of current mutation
randMutation = (Math.random()/10000)*stepSize_ - (Math.random()/10000)*stepSize_;
done = 0;


do{ //Mutates and tests synapses




randSynapse = Math.floor(Math.random()*synapse1.length); //Picks random synapse from layer 1 of synapses to mutate

oldDifference = 0;
trainingDataSet = 0; //Measures total error across every set of training data
for(var qv=0;qv<(trainingData_.length/trainingDataSetLength);qv++){
for(var g=0;g<input.length;g++){
input[g] = trainingData_[(trainingDataSet*trainingDataSetLength)+g];
}
runNeuralNet();
oldDifference += Math.abs(output[0] - trainingData_[(trainingDataSet*trainingDataSetLength)+input.length]);
trainingDataSet++;

}


randMutation = (Math.random()/10000)*stepSize_ - (Math.random()/10000)*stepSize_;
synapse1[randSynapse] += randMutation;


cost = 0;
trainingDataSet = 0; //Measures total error across every set of training data
for(var qw=0;qw<(trainingData_.length/trainingDataSetLength);qw++){
for(var g=0;g<input.length;g++){
input[g] = trainingData_[(trainingDataSet*trainingDataSetLength)+g];
}
runNeuralNet();
cost += Math.abs(output[0] - trainingData_[(trainingDataSet*trainingDataSetLength)+input.length]);
trainingDataSet++;

}

if(cost > oldDifference){
synapse1[randSynapse] -= 2*randMutation;
}
if(cost / (trainingData_.length/trainingDataSetLength) < accuracy){
done = 1;
}





randSynapse = Math.floor(Math.random()*synapse2.length); //Picks random synapse from layer 2 of synapses to mutate

oldDifference = 0;
trainingDataSet = 0; //Measures total error across every set of training data
for(var qv=0;qv<(trainingData_.length/trainingDataSetLength);qv++){
for(var g=0;g<input.length;g++){
input[g] = trainingData_[(trainingDataSet*trainingDataSetLength)+g];
}
runNeuralNet();
oldDifference += Math.abs(output[0] - trainingData_[(trainingDataSet*trainingDataSetLength)+input.length]);
trainingDataSet++;

}



cost = 0;
trainingDataSet = 0; //Measures total error across every set of training data
for(var qw=0;qw<(trainingData_.length/trainingDataSetLength);qw++){
for(var g=0;g<input.length;g++){
input[g] = trainingData_[(trainingDataSet*trainingDataSetLength)+g];
}
runNeuralNet();
cost += Math.abs(output[0] - trainingData_[(trainingDataSet*trainingDataSetLength)+input.length]);
trainingDataSet++;

}


synapse2[randSynapse] += stepSize_; //Probes the synapse upward to see how the output responds
runNeuralNet();
outputChange = output[0];
synapse2[randSynapse] -= stepSize_; //Undoes the probe
runNeuralNet();
outputChange -= output[0];

if(outputChange > 0){ //Steps the synapse in the direction indicated by the probe
synapse2[randSynapse] -= stepSize_;
}
else{
synapse2[randSynapse] += stepSize_;
}


if(cost / (trainingData_.length/trainingDataSetLength) < accuracy){
done = 1;
}



}while(done == 0);

}

function run(inputValues){ //Runs the trained neural network on new inputs

input = [];
for(var i=0;i<inputValues.length;i++){ //Copies each input value into the input layer
input.push(inputValues[i]);
}
runNeuralNet();
logData();
return(output[0]);

}
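The train() function above is essentially random-mutation hill climbing: nudge one randomly chosen weight, re-measure the total error over the training data, and revert the nudge if the error grew. A stripped-down, self-contained sketch of the same idea, fitting a single weight w in y = w*x (all names here are illustrative, not Hurricane's):

```javascript
// Random-mutation hill climbing on a single weight: y = w * x.
// Same loop shape as Hurricane's train(): mutate, measure error, revert if worse.
function totalError(w, data) {
    var err = 0;
    for (var i = 0; i < data.length; i++) {
        err += Math.abs(w * data[i].x - data[i].y); // absolute error, as in train()
    }
    return err;
}

var data = [{x: 1, y: 3}, {x: 2, y: 6}, {x: 3, y: 9}]; // true weight is 3
var w = 0;
for (var step = 0; step < 5000; step++) {
    var before = totalError(w, data);
    var mutation = (Math.random() - 0.5) * 0.1; // small random nudge
    w += mutation;
    if (totalError(w, data) > before) {
        w -= mutation; // revert: the mutation made things worse
    }
}
// w converges toward 3
```

Because accepted mutations only ever lower the error, the loop needs no gradient information; the trade-off, as the documentation notes, is that training is much slower than a single forward pass.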
21 changes: 21 additions & 0 deletions legacy/hurricane/LICENSE
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2018 Hurricane Neural Network and Machine Learning Library

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
