Matias Vazquez-Levi for daily.dev


Train a Neural Network to Count, with Javascript & Dannjs

Counting is pretty intuitive for us humans, but not so much for neural networks. That is why you are here: to build your first neural network with Dannjs. Using some basic machine learning concepts, we are going to build a neural network in JavaScript that learns to count from one number to the next. Dannjs is a new open-source library that makes it simple to implement neural networks in your projects.

Building a neural network yourself is a good way to get familiar with the concept, and it is relatively simple to do. The library we will use, Dannjs, makes it even easier to keep the code clean and simple.

Prerequisites:

  • Node.js
  • Dannjs
  • Some basic knowledge of neural networks

Let’s understand the basics of binary numbers first.

To teach our neural network to count, we will translate our familiar integers into 4-bit binary data. Each bit will correspond to one input/output neuron in the neural network.

We want to train our network to output the number that comes after the one we feed it. For example, if we feed the network [0,0,0,1], we want it to output [0,0,1,0]. Since there are only 4 bits of information, our model will only count up to 15, which is 1111 in binary. You could try adding more inputs and outputs to your model; with 5 bits, for example, your dataset would have to grow to cover the extra numbers.
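To make the encoding concrete, here is a small sketch that converts an integer to a 4-bit array and back; toBits and toInt are illustrative helpers, not part of Dannjs.

// Convert an integer (0-15) to a 4-bit array, most significant bit first.
const toBits = (n) => [8, 4, 2, 1].map((b) => (n & b ? 1 : 0));

// Convert a 4-bit array back to an integer.
const toInt = (bits) => bits.reduce((acc, bit) => acc * 2 + bit, 0);

console.log(toBits(5));        // [0,1,0,1]
console.log(toInt([0,1,1,0])); // 6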

Project Setup

First off, we’re going to install Dannjs locally in our project directory.

npm i dannjs

We now need to require the module we just installed in our main js file.

const Dannjs = require('dannjs');
const Dann = Dannjs.dann;

Creating the Neural Network

This is how we create the neural network. We give it 4 input neurons for the 4 bits of binary data going in, and 4 output neurons for the 4 bits of binary data the model will have to output.

const countingNN = new Dann(4,4);

We now need to add a hidden layer. I’ve found that 16 neurons work well enough; you can experiment with this value later. A hidden layer is basically a layer of neurons that performs computations between the inputs and outputs. The name ‘hidden’ comes from the fact that you do not need to see the values of each neuron, unlike the input/output layers. You can learn more about hidden layers & the basics surrounding them here. We are also going to set the activation function of this layer to 'leakyReLU' (activation functions are also explained in the link above) and the learning rate to 0.01.

const countingNN = new Dann(4,4);
countingNN.addHiddenLayer(16,'leakyReLU');
countingNN.lr = 0.01;

Technically, we have now finished creating our model. We could test it right away with the Dann.log(); command or by feeding the model some 4-bit data.

countingNN.log();
countingNN.feedForward([0,0,1,0],{log:true});

The log function displays information about the model we just created. We also specify in the feedForward options that we want to log the predictions of the model.

Dann NeuralNetwork:
  Layers:
    Input Layer:   4
    hidden Layer: 16  (leakyReLU)
    output Layer: 4  (sigmoid)
  Other Values:
    Learning rate: 0.01
    Loss Function: mse
    Latest Loss: 0

Prediction:  [0.5398676880698,0.6730987170697,0.6748741672290,0.6377646387674]

We can see that this gives us seemingly random predictions. That was bound to happen, since we never trained the model. In order to train it, we need a dataset that tells the neural network what to output for a given input.

Setting up the Dataset

To train the model, we’re going to need a dataset. Here is a lightweight js dataset for 4-bit binary counting. It basically looks like this:

const dataset4bit = [

    //...
    {
        input:[1,0,1,0],
        target:[1,0,1,1]
    },
    //  {
    //      input:[1,0,1,1],
    //      target:[1,1,0,0]
    //  },
    {
        input:[1,1,0,0],
        target:[1,1,0,1]
    },

    //...
];

You can access the dataset here.

We can see that this dataset contains a number x in 4-bit binary as the input value, and the number x+1 in 4-bit binary as the target value. I commented out the element [1,0,1,1] so we have a test sample the neural network has never seen. To access the data, we can copy the code included in the GitHub gist above and save it as binaryCountData.js in the same directory as our project. We can then require the file:

const dataset = require('./binaryCountData.js').dataset4bit;

We can now access the data this way:

dataset[i].input
dataset[i].target
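If you’d rather not copy the gist, here is a minimal sketch that generates equivalent data. It assumes the same shape as above (an array of {input, target} objects) and skips the [1,0,1,1] pair so it stays an unseen test sample; makeDataset and toBits are illustrative helpers, not part of Dannjs.

// Convert an integer (0-15) to a 4-bit array, most significant bit first.
const toBits = (n) => [8, 4, 2, 1].map((b) => (n & b ? 1 : 0));

const makeDataset = () => {
    const data = [];
    for (let n = 0; n < 15; n++) {
        if (n === 11) continue; // hold out [1,0,1,1] as a test sample
        data.push({ input: toBits(n), target: toBits(n + 1) });
    }
    return data;
};

const dataset = makeDataset();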

Now that we have access to the dataset, let’s use it by calling Dann.backpropagate(); for each data point in the dataset array. This will tune the weights of the model according to the data we give it.

for (const data of dataset) {
    countingNN.backpropagate(data.input,data.target);
}

This counts as 1 epoch: we iterated through every element in our dataset once. Sadly, 1 epoch is not enough for a neural network to train properly; we need to run many epochs to achieve satisfying results. Let's also add a Dann.feedForward(); call to test what the model outputs after training.

const epoch = 100000;
for (let e=0; e < epoch;e++) {
    for (const data of dataset) {
        countingNN.backpropagate(data.input,data.target);
    }
}

countingNN.feedForward([1,0,1,1],{log:true});
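If you want to keep an eye on training progress, you can log the model's latest loss every few thousand epochs (the value shown as "Latest Loss" in the log output above). This sketch assumes Dannjs exposes that value as countingNN.loss after backpropagation; check the Dannjs documentation if it differs.

const epoch = 100000;
for (let e = 0; e < epoch; e++) {
    for (const data of dataset) {
        countingNN.backpropagate(data.input, data.target);
    }
    // Assumption: Dannjs stores the latest loss in the .loss property.
    if (e % 10000 === 0) console.log(`epoch ${e} loss: ${countingNN.loss}`);
}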

And after training for 100,000 epochs, it outputs:

Prediction:  [0.999884854,0.9699951248,0.020084607062,0.008207215405]

We made it! We can see that the prediction rounds to [1,1,0,0], which is the correct answer (1011 is 11, and 1100 is 12). If you want to turn a raw prediction back into a readable number, a small decoder like the one below helps.
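Here is a minimal sketch of such a decoder; decodePrediction is just an illustrative helper, not part of Dannjs:

// Round each output neuron to 0 or 1, then read the bits as an integer.
const decodePrediction = (prediction) => {
    const bits = prediction.map((v) => (v >= 0.5 ? 1 : 0));
    return bits.reduce((acc, bit) => acc * 2 + bit, 0);
};

console.log(decodePrediction([0.9998, 0.9699, 0.0200, 0.0082])); // 12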
This is what the final js code should look like:

const Dannjs = require('dannjs');
const Dann = Dannjs.dann;
const dataset = require('./binaryCountData.js').dataset4bit;

const countingNN = new Dann(4,4);
countingNN.addHiddenLayer(16,'leakyReLU');

countingNN.lr = 0.01;

countingNN.feedForward([1,0,1,1],{log:true});
const epoch = 100000;
for (let e=0; e < epoch;e++) {
    for (const data of dataset) {
        countingNN.backpropagate(data.input,data.target);
    }
}
countingNN.feedForward([1,0,1,1],{log:true});

In this tutorial, we trained a neural network to count in binary. We learned how binary numbers work so we could encode integers as inputs for our neural network, and we learned how to use Dannjs, a new neural network library, without any prior ML knowledge. Feel free to tweak the settings, experiment, and play around; you'll eventually get a grasp of what affects what. The only source of knowledge is experience.

