What is the difference between back-propagation and a feed-forward neural network?
A feed-forward neural network is a type of neural network architecture in which the connections are "fed forward", i.e. do not form cycles (as they do in recurrent nets).
The term "feed forward" is also used to describe the computation itself: you present something at the input layer and it travels from the input layer to the hidden layer(s) and from there to the output layer.
The values are "fed forward".
Both of these uses of the phrase "feed forward" are in a context that has nothing to do with training per se.
- Backpropagation is a training algorithm consisting of two steps: 1) feed forward the values; 2) calculate the error and propagate it back to the earlier layers. So, to be precise, forward-propagation is part of the backpropagation algorithm but comes before back-propagating (see the sketch below).
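For concreteness, here is a minimal sketch of those two steps on a single training example, assuming a tiny 2-3-1 sigmoid network with a squared-error loss; the layer sizes, input data, and learning rate are just illustrative, not anything prescribed by the question:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))   # input -> hidden weights (assumed 2-3-1 net)
W2 = rng.normal(size=(1, 3))   # hidden -> output weights
x = np.array([0.5, -0.2])      # input vector (made up for the example)
t = np.array([1.0])            # target output
lr = 0.1                       # learning rate (illustrative)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Step 1: feed the values forward (input -> hidden -> output).
h = sigmoid(W1 @ x)
y = sigmoid(W2 @ h)

# Step 2: compute the error at the output and propagate it back.
delta_out = (y - t) * y * (1 - y)             # output-layer error term
delta_hid = (W2.T @ delta_out) * h * (1 - h)  # hidden-layer error term

# Gradient-descent weight updates built from the back-propagated errors.
W2 -= lr * np.outer(delta_out, h)
W1 -= lr * np.outer(delta_hid, x)
```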
There is no "pure backpropagation" or "pure feed-forward" neural network; the two terms name two algorithms that are used on the same network.
Backpropagation is an algorithm for training a neural network (adjusting its weights). The input to backpropagation is output_vector and target_output_vector; its output is adjusted_weight_vector.
Feed-forward is an algorithm for calculating the output vector from the input vector. The input to feed-forward is input_vector; its output is output_vector.
When you are training a neural network, you need both algorithms.
When you are using a neural network that has already been trained, you use only feed-forward.
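Here is a minimal sketch of that division of labour, assuming a small sigmoid network and XOR-style example data; the function names feed_forward and backpropagate, the layer sizes, and the learning rate are illustrative choices, not a standard API:

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)   # assumed 2-4-1 network
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feed_forward(x):
    """input_vector -> output_vector (used during training and afterwards)."""
    h = sigmoid(W1 @ x + b1)
    return h, sigmoid(W2 @ h + b2)

def backpropagate(x, h, y, t, lr=0.5):
    """Use the output, the target, and the cached activations to adjust the weights (training only)."""
    global W1, b1, W2, b2
    delta_out = (y - t) * y * (1 - y)
    delta_hid = (W2.T @ delta_out) * h * (1 - h)
    W2 -= lr * np.outer(delta_out, h); b2 -= lr * delta_out
    W1 -= lr * np.outer(delta_hid, x); b1 -= lr * delta_hid

# Training: both algorithms are used on every example.
data = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]
for _ in range(10000):
    for x, t in data:
        x, t = np.asarray(x, float), np.asarray(t, float)
        h, y = feed_forward(x)
        backpropagate(x, h, y, t)

# Using the trained network: only the feed-forward pass is needed.
print(feed_forward(np.array([1.0, 0.0]))[1])
```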
The basic type of neural network is the multi-layer perceptron, which is a feed-forward network trained with backpropagation.
There are also more advanced types of neural networks that use modified algorithms.
Also a good source to study: ftp://ftp.sas.com/pub/neural/FAQ.html. The best way to understand the principle is to program it yourself (there is a tutorial in this video): https://www.youtube.com/watch?v=KkwX7FkLfug