
Training your Neural Networks

If you haven't created your first neural network yet, you can take a look here first. Now it's time to take the next step: training your Neural Network. If this interests you, let's go for it.


Creating the model

The first thing you should do when creating a Neural Network is deciding what its overall architecture will look like.


When approaching a new problem, you can research papers in the field you're interested in and see what types of configurations researchers are using. But it's really important for you to play around with the number of outputs and layers of a Neural Network yourself. Here's an example of what a model looks like in PyTorch:
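A minimal sketch: a fully connected network for a flattened 28x28 input (matching the example later in this post), with a single Sigmoid output. The layer sizes here are illustrative, not prescriptive.

    import torch
    from torch import nn

    class Network(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc1 = nn.Linear(784, 128)  # 28x28 = 784 inputs -> hidden layer
            self.fc2 = nn.Linear(128, 64)   # hidden layer -> hidden layer
            self.out = nn.Linear(64, 1)     # hidden layer -> single output

        def forward(self, x):
            x = torch.relu(self.fc1(x))
            x = torch.relu(self.fc2(x))
            return torch.sigmoid(self.out(x))  # squash the output into (0, 1)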

Forward, Loss & Backpropagation

If you are not familiar with the building blocks of Neural Networks, you can check out my blog post about perceptrons. We have three main parts in a Neural Network:

  • Forward pass
  • Loss
  • Backpropagation pass

Let's suppose we have an input x, maybe a 28x28 image. This input goes through the network, where it is operated on by weights and biases. At the end of the process, we apply an activation function such as a Sigmoid to squash the result into a range of values, usually between 0 and 1 or between -1 and 1. This is the forward pass. When we have a value such as 0.6 after the Sigmoid, we can calculate our Loss.
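Running the forward pass with the Network class sketched above might look like this; a random tensor stands in for a real image here:

    model = Network()
    x = torch.randn(1, 784)   # a flattened 28x28 "image"
    prediction = model(x)     # the forward pass
    print(prediction)         # e.g. tensor([[0.6...]])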

It's all about the Loss

Perceiving the errors and adjusting for them going backwards: this is what Neural Networks are all about.

Let's suppose our label was 1.0, so our loss would be 0.4. With this prediction error we can do the final step: backpropagation. This is where the magic happens. You show your network the mistakes it's been making and let it know how it can improve. It uses backpropagation to go back and adjust the weights and biases to make a better prediction. This is what that looks like in PyTorch:
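A minimal sketch of a single training step, reusing the model and input from above. The L1 loss (which gives exactly |0.6 - 1.0| = 0.4 for our toy example) and the learning rate are illustrative choices, not the only options:

    from torch import optim

    criterion = nn.L1Loss()                             # absolute error
    optimizer = optim.SGD(model.parameters(), lr=0.01)  # Stochastic Gradient Descent

    label = torch.tensor([[1.0]])        # the true answer for our toy example

    optimizer.zero_grad()                # clear gradients from any previous step
    prediction = model(x)                # forward pass
    loss = criterion(prediction, label)  # e.g. 0.4 when the prediction is 0.6
    loss.backward()                      # backpropagation: compute the gradients
    optimizer.step()                     # adjust the weights and biases

In a real training loop you would repeat these five steps over batches of data, and the loss should shrink as the weights improve.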

If you are keen to learn the mathematical background behind Forward, Loss (such as choosing an optimizer, in this case SGD, Stochastic Gradient Descent) and Backpropagation, let me know.

And if you want to take a look at the code, here it is. If you want more content like this, I have a YouTube Channel.
