
Weights in Neural Network | Binary Classification

Calculating the Number of Weights in the Neural Network | Binary Classification

What are the Weights in a Neural Network?

The weights represent the strength of the connections between units. A weight may reduce the importance of an input value or it may increase it. A weight near zero means changing that input will hardly change the output.

A negative weight means increasing the input will decrease the output. So the weights decide how much influence each input has on the output.

Here we discuss how to calculate the number of weights in a neural network.

We discussed a complete neural network in our previous post, so kindly read that post first for a better understanding of the neural network structure.

As we can see in the following graphical representation, there are 4 inputs, each connected by weights to the hidden layers. A layer that does not produce the final output is called a hidden layer; in other words, all layers except the very first and the very last are hidden layers. Now the question we are going to discuss is how to calculate the number of weights in a neural network.

[Figure: Calculating the number of weights in a neural network | Binary Classification]

How many weights are there between the input layer and the first hidden layer?

Every single neuron in a hidden layer receives every input value. If there are 4 inputs X1, X2, X3, X4, the first neuron in the hidden layer receives all 4 of them, and the same holds for every other neuron: each neuron in this layer has 4 incoming inputs.

So, we can calculate the number of weights between two layers using the following formula:

Number of inputs to the layer x Number of neurons in the layer = Number of weights

4 x 4 = 16

Between the input layer and the first hidden layer, the number of weights is 16. Between the first and second hidden layers there are 4 x 3 = 12 weights, and from the second hidden layer to the output node there are 3 x 1 = 3 weights. To find the total number of weights of this neural network, we add them up: 16 + 12 + 3 = 31.
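As a quick check, here is a minimal sketch in plain Python (the layer sizes 4, 4, 3, 1 are taken from the example above) that counts the weights between consecutive layers:

```python
# Layer sizes of the example network: 4 inputs, hidden layers of 4 and 3
# neurons, and 1 output neuron.
layer_sizes = [4, 4, 3, 1]

# Weights between two consecutive layers = inputs to the layer x neurons in the layer
weights_per_layer = [layer_sizes[i] * layer_sizes[i + 1]
                     for i in range(len(layer_sizes) - 1)]

print(weights_per_layer)       # [16, 12, 3]
print(sum(weights_per_layer))  # 31
```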

We also count the biases and add them to the weights. The total number of biases equals the total number of neurons (excluding the inputs), so here it is 4 + 3 + 1 = 8:

16 + 12 + 3 + 8 = 39

Total parameters (weights + biases) = 39
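If TensorFlow/Keras happens to be installed, the same count can be verified by building an equivalent 4-4-3-1 model and letting Keras count the parameters; this is only an optional sanity check, not part of the walkthrough above:

```python
from tensorflow import keras
from tensorflow.keras import layers

# The same 4-4-3-1 network as above; each Dense layer has one bias per neuron.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(4),  # 4*4 weights + 4 biases = 20
    layers.Dense(3),  # 4*3 weights + 3 biases = 15
    layers.Dense(1),  # 3*1 weights + 1 bias   = 4
])

print(model.count_params())  # 39
```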

In the above image, there are 12 weights and 4 biases in the first hidden layer, 8 weights and 2 biases in the second hidden layer, and the last layer is a single output neuron that computes a linear function.

[Figure: Calculating the number of weights in a neural network | Binary Classification]

The output of the neural network satisfies 0 < σ < 1, meaning the output lies between 0 and 1, so we can interpret it as a probability, because the value of a probability always lies between 0 and 1.
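As an illustration, a minimal sigmoid function in plain Python shows how any input value is squashed into the open interval (0, 1):

```python
import math

def sigmoid(z):
    # Sigmoid activation: 1 / (1 + e^-z), always strictly between 0 and 1.
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(-4))  # ~0.018, close to 0
print(sigmoid(0))   # 0.5
print(sigmoid(4))   # ~0.982, close to 1
```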

Binary classification

In binary classification we derive a Yes or No answer from the output. We set a criterion for the best possible answer: if the output value is greater than 0.5, the answer is Yes; otherwise, it is No. For example, if the output probability for Yes is 0.27, the answer is No, because 0.27 is not greater than 0.5. If your question expects an answer in Yes or No format, then there is only one neuron in the output layer.
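Here is a minimal sketch of that decision rule; the 0.5 threshold and the 0.27 probability come from the example above, while the function name and the second probability (0.84) are made up purely to illustrate the Yes branch:

```python
def classify(probability, threshold=0.5):
    # Map the single output neuron's value to a Yes/No label.
    return "Yes" if probability > threshold else "No"

print(classify(0.27))  # No, because 0.27 is not greater than 0.5
print(classify(0.84))  # Yes, because 0.84 is greater than 0.5
```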
