feature scaling is a critical step in gradient descent code example

Example: what is feature scaling

'''What is feature scaling?
Feature scaling is a method used to normalize the range of independent
variables or features of data. In data processing, it is also known as 
data normalization and is generally performed during the data preprocessing
step.

A few advantages of normalizing the data are as follows:

1. It makes your training faster.
2. It prevents you from getting stuck in local optima.
3. It gives you a better error surface shape.
4. Weight decay and Bayes optimization can be done more conveniently.

However, there are a few algorithms, such as Logistic Regression and
Decision Trees, that are not affected by the scaling of input data.

With regard to neural networks, however, scaling does make a difference.
'''
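
The quoted answer is prose only, so below is a minimal sketch of the idea in
NumPy: min-max normalization of the features followed by plain batch gradient
descent on a linear model. The helper names (scale_min_max, gradient_descent),
the learning rate, and the toy data are illustrative assumptions, not code
from the original answer.

# Minimal sketch: min-max feature scaling before gradient descent.
import numpy as np

def scale_min_max(X):
    """Rescale each feature (column) of X into the [0, 1] range."""
    X_min = X.min(axis=0)
    X_max = X.max(axis=0)
    return (X - X_min) / (X_max - X_min)

def gradient_descent(X, y, lr=0.1, steps=200):
    """Plain batch gradient descent for linear regression with MSE loss."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = 2.0 / len(y) * X.T @ (X @ w - y)  # gradient of mean squared error
        w -= lr * grad
    return w

# Toy data where the two features live on very different scales.
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(0, 1, 100),        # feature 1: roughly [0, 1]
                     rng.uniform(0, 10_000, 100)])  # feature 2: roughly [0, 10000]
y = 3 * X[:, 0] + 0.002 * X[:, 1] + rng.normal(0, 0.1, 100)

# After scaling, both features share the [0, 1] range, so one learning rate
# suits both weights and the iterations converge; on the raw data the same
# lr would be far too large for the 0-10000 feature.
w_scaled = gradient_descent(scale_min_max(X), y)
print("weights learned on scaled features:", w_scaled)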

Tags:

Misc Example