Week 2
Multiple Feature Linear Regression
Let's assume we have multiple features.
$x_1, x_2, \ldots, x_n$ represent each feature
$n$ represents the total number of features.
$x_j$ represents the $j^{th}$ feature
$\vec{x}^{(i)}$ still represents the features of the $i^{th}$ training example, but it is now a row vector with multiple values
$x_j^{(i)}$ represents the value of the $j^{th}$ feature of the $i^{th}$ training example
The training model will be revised:
Previously: $f_{w,b}(x) = wx + b$
Now: $f_{\vec{w},b}(\vec{x}) = w_1x_1 + w_2x_2 + \cdots + w_nx_n + b = \vec{w} \cdot \vec{x} + b$
Note: This is called multiple linear regression; it is not multivariate regression (a different concept).
Vectorization
Using vectorization makes your code shorter and faster to compute.
Without vectorization: $f_{\vec{w},b}(\vec{x}) = w_1x_1 + w_2x_2 + \cdots + w_nx_n + b$ OR $f_{\vec{w},b}(\vec{x}) = \left(\sum_{j=1}^{n} w_jx_j\right) + b$
With vectorization: $f_{\vec{w},b}(\vec{x}) = \vec{w} \cdot \vec{x} + b$, computed with `np.dot(w, x) + b`. This is much more efficient for larger $n$ because of code optimizations in the NumPy library (parallel computation as opposed to a for loop).
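A minimal sketch of the two approaches in NumPy (the array values here are just illustrative):

```python
import numpy as np

w = np.array([1.0, 2.5, -3.3])    # parameters (illustrative values)
b = 4.0
x = np.array([10.0, 20.0, 30.0])  # features of one training example

# Without vectorization: loop over each feature
f = 0.0
for j in range(len(w)):
    f = f + w[j] * x[j]
f = f + b

# With vectorization: NumPy's dot product does the same work in one call
f_vec = np.dot(w, x) + b

print(f, f_vec)  # both print the same prediction
```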
Implementing Gradient Descent for Multiple Linear Regression with Vectorization:
$\vec{w} = [w_1, w_2, \ldots, w_n]$ represents the parameters of the model
$b$ is a number (a scalar)
Therefore, $f_{\vec{w},b}(\vec{x}) = \vec{w} \cdot \vec{x} + b$
Cost function: $J(\vec{w}, b) = \frac{1}{2m}\sum_{i=1}^{m}\left(f_{\vec{w},b}(\vec{x}^{(i)}) - y^{(i)}\right)^2$
Gradient Descent algorithm: repeat until convergence { $w_j := w_j - \alpha \frac{\partial}{\partial w_j} J(\vec{w}, b)$ and $b := b - \alpha \frac{\partial}{\partial b} J(\vec{w}, b)$ }
Update rule for multiple features: $w_j := w_j - \alpha \frac{1}{m}\sum_{i=1}^{m}\left(f_{\vec{w},b}(\vec{x}^{(i)}) - y^{(i)}\right)x_j^{(i)}$ for $j = 1, \ldots, n$, and $b := b - \alpha \frac{1}{m}\sum_{i=1}^{m}\left(f_{\vec{w},b}(\vec{x}^{(i)}) - y^{(i)}\right)$, updating all parameters simultaneously.
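A minimal sketch of these update rules in NumPy (the function and variable names are my own, not the course's starter code):

```python
import numpy as np

def gradient_descent(X, y, w, b, alpha, num_iters):
    """Batch gradient descent for multiple linear regression.
    X: (m, n) feature matrix, y: (m,) targets, w: (n,) weights, b: scalar bias."""
    m = X.shape[0]
    for _ in range(num_iters):
        err = X @ w + b - y        # prediction error for all m examples, shape (m,)
        dj_dw = (X.T @ err) / m    # gradient of J with respect to each w_j, shape (n,)
        dj_db = np.sum(err) / m    # gradient of J with respect to b, scalar
        w = w - alpha * dj_dw      # simultaneous update of all parameters
        b = b - alpha * dj_db
    return w, b
```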
An alternative to Gradient Descent
The normal equation can be used for linear regression to solve for $w$ and $b$ without iterations.
Disadvantages are that it doesn't generalize to other learning algorithms, and it is slow when the number of features is large.
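For reference, one way to sketch the normal-equation idea with NumPy's least-squares solver (the helper name and the bias-column trick are my own choices, not something you need to implement in practice):

```python
import numpy as np

def normal_equation(X, y):
    """Solve for w and b directly, without gradient descent iterations.
    A column of ones is appended so the bias b is learned as the last coefficient."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    theta, *_ = np.linalg.lstsq(Xb, y, rcond=None)  # least-squares solution
    return theta[:-1], theta[-1]                    # w, b
```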
Feature Scaling
If one feature has a large range of values and another has a small range, a good model compensates by learning a small weight for the large-range feature and a larger weight for the small-range one. If the features are not on the same scale, gradient descent will still work, but it will be very slow.
Scaling means applying some transformation to the data so that the features end up in a comparable range (usually from about 0 to 1). Rescaling results in a comparable range of values, which helps speed up gradient descent.
Implementation of feature scaling:
Divide by Maximum: $x_{j,\text{scaled}} = \dfrac{x_j}{\max(x_j)}$
Mean Normalization: $x_j := \dfrac{x_j - \mu_j}{\max(x_j) - \min(x_j)}$
Z-score Normalization: $x_j := \dfrac{x_j - \mu_j}{\sigma_j}$, where $\mu_j$ is the mean and $\sigma_j$ the standard deviation of feature $j$
Aim for about $-1 \leq x_j \leq 1$ or similar for each feature
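A sketch of z-score normalization in NumPy (the function name is illustrative); mean normalization and divide-by-maximum follow the same pattern:

```python
import numpy as np

def zscore_normalize(X):
    """Scale each feature (column of X) so it falls roughly in the -1 to 1 range."""
    mu = np.mean(X, axis=0)    # per-feature mean
    sigma = np.std(X, axis=0)  # per-feature standard deviation
    X_norm = (X - mu) / sigma
    return X_norm, mu, sigma   # keep mu and sigma to scale new examples the same way
```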
Convergence and Learning Rate
If $J(\vec{w}, b)$ decreases by less than some small threshold $\epsilon$ (e.g. $10^{-3}$) in one iteration, declare convergence.
Learning rate $\alpha$:
If $\alpha$ is too small, gradient descent takes a lot more iterations to converge
If $\alpha$ is too large, the cost function may bounce around, sometimes going up as well
Try 0.001, 0.01, 0.1, 1, etc.
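A sketch of an automatic convergence test combined with the gradient step above ($\epsilon = 10^{-3}$ is an assumed threshold, and the function names are my own):

```python
import numpy as np

def compute_cost(X, y, w, b):
    """Cost J(w, b) for multiple linear regression."""
    err = X @ w + b - y
    return np.sum(err ** 2) / (2 * X.shape[0])

def train_until_converged(X, y, w, b, alpha, epsilon=1e-3, max_iters=100_000):
    """Stop when the cost decreases by less than epsilon in one iteration."""
    m = X.shape[0]
    prev_cost = compute_cost(X, y, w, b)
    for _ in range(max_iters):
        err = X @ w + b - y
        w = w - alpha * (X.T @ err) / m
        b = b - alpha * np.sum(err) / m
        cost = compute_cost(X, y, w, b)
        if prev_cost - cost < epsilon:  # declare convergence
            break
        prev_cost = cost
    return w, b
```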
Feature engineering is where you use intuition to design new features by transforming or combining original features.
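A small illustration (the feature names are hypothetical): if a house lot has a frontage and a depth, intuition suggests the lot's area may predict price better than either feature alone.

```python
import numpy as np

frontage = np.array([60.0, 45.0, 80.0])  # hypothetical lot widths
depth = np.array([120.0, 90.0, 150.0])   # hypothetical lot depths

area = frontage * depth                  # engineered feature: combine the originals

X = np.column_stack([frontage, depth, area])  # use all three as model features
```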