Linear Regression



When we try to find the line that best fits a set of data, we are performing Linear Regression.
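For example, a line with slope m and intercept b predicts m*x + b for any input x. Here is a minimal Python sketch; the data points and parameter values are made up purely for illustration:

    # Made-up data points and a candidate line y = m*x + b.
    x = [1, 2, 3, 4, 5]
    y = [3, 5, 7, 9, 11]

    m, b = 2.0, 1.0                         # slope and intercept of the candidate line
    predictions = [m * xi + b for xi in x]  # what the line predicts for each x
    print(predictions)                      # [3.0, 5.0, 7.0, 9.0, 11.0]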

LOSS

For each data point, we calculate loss, a number that measures how bad the model's (in this case, the line's) prediction was. Loss is also referred to as error. A common choice is the squared difference between the predicted and actual values, summed over all the points.
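Here is a quick sketch of that squared-error loss; the data points and the two candidate lines are arbitrary choices for illustration:

    # Made-up data points.
    x = [1, 2, 3]
    y = [5, 1, 3]

    def total_loss(m, b, x, y):
        # Sum of squared differences between each point and the line's prediction.
        return sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y))

    print(total_loss(1.0, 0.0, x, y))   # 17.0  (the line y = x)
    print(total_loss(0.5, 2.0, x, y))   # 10.5  (the line y = 0.5x + 2 fits better)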

GRADIENT DESCENT

As we try to minimize loss, we take each parameter we are changing and move it in the direction that decreases the loss, continuing as long as the loss keeps falling. The process by which we do this is called gradient descent.
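Here is a sketch of a single gradient descent step for the line y = m*x + b with the squared-error loss from above; the derivative formulas follow from that loss, and the function and variable names are my own:

    def gradient_step(m, b, x, y, learning_rate):
        # Partial derivatives of the summed squared loss with respect to b and m.
        b_gradient = -2 * sum(yi - (m * xi + b) for xi, yi in zip(x, y))
        m_gradient = -2 * sum(xi * (yi - (m * xi + b)) for xi, yi in zip(x, y))
        # Step each parameter in the direction that lowers the loss.
        return m - learning_rate * m_gradient, b - learning_rate * b_gradient

Calling gradient_step over and over nudges m and b toward the line with the lowest loss.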


CONVERGENCE

Convergence is when the loss stops changing (or changes very slowly) as the parameters are updated.
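One way to sketch this, reusing x, y, total_loss, and gradient_step from the sketches above: keep stepping until the loss barely changes. The tolerance and the small fixed learning rate (covered next) are arbitrary illustrative values.

    m, b = 0.0, 0.0
    loss = total_loss(m, b, x, y)
    while True:
        m, b = gradient_step(m, b, x, y, learning_rate=0.01)
        new_loss = total_loss(m, b, x, y)
        if abs(loss - new_loss) < 1e-9:   # loss has (nearly) stopped changing
            break
        loss = new_loss
    print(m, b)   # roughly m = -1, b = 5, the best-fit line for this made-up data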

LEARNING RATE 

We have to choose a learning rate, which determines how big a step we take down the loss curve on each update.
A small learning rate will take a long time to converge; you might run out of time or compute before getting an answer. A large learning rate might skip over the best value and never converge.
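To see both effects, here is a small sketch comparing two rates, reusing total_loss, gradient_step, and the made-up data from the sketches above; the specific rates and step count are arbitrary:

    for learning_rate in (0.01, 0.5):
        m, b = 0.0, 0.0
        for _ in range(50):
            m, b = gradient_step(m, b, x, y, learning_rate)
        # With 0.01 the loss shrinks steadily toward the best fit; with 0.5 each
        # step overshoots the minimum and the loss grows instead of converging.
        print(learning_rate, total_loss(m, b, x, y))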


