When you implement a neural network, there are some techniques that are going to be really important.
For example, if you have a training set of m training examples, you might be used to processing the training set by having a for loop step through your m training examples.
But when you're implementing a neural network, you usually want to process your entire training set without using an explicit for loop to loop over your entire training set.
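As a sketch of what this looks like in practice, here is a small NumPy comparison (the variable names `X`, `w`, `b` and the column-per-example layout are assumptions for illustration, not code from the lecture): the same quantity is computed once with an explicit for loop over the m examples and once vectorized over the whole training set.

```python
import numpy as np

# Hypothetical setup: m training examples with n features each,
# stacked as the columns of X (shape n x m).
n, m = 3, 5
rng = np.random.default_rng(0)
X = rng.standard_normal((n, m))   # training set
w = rng.standard_normal((n, 1))   # weights
b = 0.5                           # bias

# Explicit for loop stepping through the m training examples.
z_loop = np.zeros((1, m))
for i in range(m):
    z_loop[0, i] = w.T @ X[:, i] + b

# Vectorized: process the entire training set with no explicit for loop.
z_vec = w.T @ X + b

assert np.allclose(z_loop, z_vec)
```

Both paths produce the same (1, m) row of results; the vectorized form simply hands the whole loop to the underlying linear-algebra routines.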
When you compute on a neural network, you usually have a forward pass or forward propagation step, followed by a backward pass or what's called a backward propagation step.
It helps to understand why the computations in a neural network can be organized into this forward propagation step and a separate backward propagation step.
These ideas are easier to convey using logistic regression, which makes them easier to understand.
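To make the two-step organization concrete, here is a hedged NumPy sketch of one forward pass and one backward pass for logistic regression over m examples (the names, the zero initialization, and the learning rate are illustrative assumptions, not the course's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical data: n features, m examples stacked as columns of X.
n, m = 3, 4
rng = np.random.default_rng(1)
X = rng.standard_normal((n, m))
Y = rng.integers(0, 2, size=(1, m)).astype(float)  # labels in {0, 1}
w = np.zeros((n, 1))
b = 0.0

# Forward propagation: predictions and cost for the whole training set.
Z = w.T @ X + b                 # shape (1, m)
A = sigmoid(Z)                  # predicted probabilities
cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
# With zero-initialized w and b, every prediction is 0.5,
# so cost = log(2) ≈ 0.6931.

# Backward propagation: gradients of the cost w.r.t. w and b.
dZ = A - Y                      # shape (1, m)
dw = (X @ dZ.T) / m             # shape (n, 1)
db = np.mean(dZ)

# One gradient-descent update using those gradients.
learning_rate = 0.1
w = w - learning_rate * dw
b = b - learning_rate * db
```

Note that both passes run over all m examples at once, with no explicit for loop, which ties together the two ideas in this section.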