Interview Book
Author: Taha Heidari
Email: taha.heidari@aalto.fi
What is the Normal Equation in linear regression?
It is an alternative to the gradient descent algorithm that finds the best parameters without any iterations, as a closed-form solution, using only linear algebra and matrix calculations as follows:
```python
import numpy as np

# Closed-form solution: theta = (X^T X)^{-1} X^T y
theta = np.linalg.inv(X.T @ X) @ X.T @ y
```
where the hypothesis is $h_\theta(x) = \theta_0 + \theta_1 x_1 + \dots + \theta_n x_n = \theta^T x$ and, in matrix form, the solution is $\theta = (X^T X)^{-1} X^T y$.
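As a quick end-to-end check, here is a minimal self-contained sketch (the synthetic data, variable names, and coefficient values below are illustrative assumptions, not from the original notes):

```python
import numpy as np

# Illustrative synthetic data: y = 4 + 3*x plus a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 2, size=(100, 1))
y = 4 + 3 * x + rng.normal(scale=0.1, size=(100, 1))

# Prepend a column of ones so theta_0 acts as the intercept.
X = np.hstack([np.ones((100, 1)), x])

# Normal equation: theta = (X^T X)^{-1} X^T y
theta = np.linalg.inv(X.T @ X) @ X.T @ y
print(theta.ravel())  # roughly [4, 3]
```

In practice, `np.linalg.pinv(X) @ y` or `np.linalg.lstsq(X, y, rcond=None)` is numerically safer than explicitly inverting $X^T X$.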
Review:
- It is slow when the number of training examples or features is large, since inverting $X^T X$ scales roughly as $O(n^3)$ in the number of features $n$.
- It only works for linear regression; models such as logistic regression have no comparable closed-form solution.
References:
- https://eli.thegreenplace.net/2014/derivation-of-the-normal-equation-for-linear-regression
- https://www.youtube.com/watch?v=g8qF61P741w
- https://prutor.ai/normal-equation-in-linear-regression/
What is the batch gradient descent algorithm?
It refers to the gradient descent variant that uses all of the training data to compute each parameter update, rather than a subset of the data (as in mini-batch or stochastic gradient descent). For linear regression the update rule is

$$\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \big(h_\theta(x^{(i)}) - y^{(i)}\big)\, x_j^{(i)}$$

where $\alpha$ is the learning rate and $m$ is the number of training examples.
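A minimal NumPy sketch of batch gradient descent for linear regression (the function name, learning rate, and iteration count are illustrative assumptions):

```python
import numpy as np

def batch_gradient_descent(X, y, alpha=0.1, n_iters=1000):
    """Minimise the MSE cost of linear regression with batch updates.

    X: (m, n) design matrix (include a column of ones for the intercept).
    y: (m, 1) target vector.
    """
    m, n = X.shape
    theta = np.zeros((n, 1))
    for _ in range(n_iters):
        # The gradient is computed over ALL m training examples each step.
        gradient = (X.T @ (X @ theta - y)) / m
        theta -= alpha * gradient
    return theta
```

With the `X` and `y` from the normal-equation sketch above, `batch_gradient_descent(X, y)` should converge to roughly the same $\theta$.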
What is vectorisation?
In order to have a much faster implementation of ML algorithms, we vectorise the formulation of the problem, replacing explicit loops with matrix and vector operations, like:
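For instance, computing the predictions for all training examples with a single matrix product instead of an explicit Python loop (a minimal sketch; the shapes and random data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10000, 50))   # m = 10000 examples, n = 50 features
theta = rng.normal(size=(50, 1))

# Non-vectorised: nested Python loops over examples and features.
preds_loop = np.array([[sum(X[i, j] * theta[j, 0] for j in range(50))]
                       for i in range(10000)])

# Vectorised: a single matrix-vector product, typically orders of
# magnitude faster because it runs in optimised compiled code.
preds_vec = X @ theta

assert np.allclose(preds_loop, preds_vec)
```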