Mar 18, 2025
HW 03 due March 20 at 11:59pm
Project exploratory data analysis due March 20 at 11:59pm
Statistics experience due April 22
Likelihood
Maximum likelihood estimation (MLE)
MLE for linear regression
Today we will introduce another way to find these estimators: maximum likelihood estimation.
We will see that the least-squares estimator is equal to the maximum likelihood estimator when certain assumptions hold.
Suppose a basketball player shoots the ball, such that the probability of making the basket (successfully making the shot) is $p$.
What is the probability distribution for this random phenomenon?
Suppose the probability is some specific value of $p$. What is the probability the player makes the shot? What is the probability the player misses the shot?
Suppose the player shoots the ball three times. The shots are all independent, and the player has the same probability $p$ of making each shot.
Let $X$ = the number of shots the player makes out of the three attempts.
The player shoots the ball three times, and we record the observed outcome of each shot.
New question: What value of the parameter $p$ is most consistent with the data we observed?
We will use a likelihood function to answer this question.
A likelihood function is a measure of how likely we are to observe our data under each possible value of the parameter(s)
Note that this is not the same as the probability function.
Probability function: the parameter value(s) are fixed, and the inputs are the possible outcomes.
Likelihood function: the data are fixed, and the inputs are the possible parameter values.
The likelihood function for the probability of a basket, given an observed sequence of shots with $x$ makes out of 3 attempts, is $L(p) = p^{x}(1-p)^{3-x}$.
Thus, if the likelihood is larger for one value of $p$ than for another, the observed data are more consistent with the first value.
What is the general formula for the likelihood function for $p$ when the player takes $n$ shots?
How does assuming independence simplify things?
How does having identically distributed data simplify things?
The likelihood function for $p$, given $n$ independent, identically distributed shots with $x$ makes, is $L(p) = p^{x}(1-p)^{n-x}$.
We want the value of $p$ that maximizes the likelihood function.
The process of finding this value is maximum likelihood estimation.
There are three primary ways to find the maximum likelihood estimator:
Approximate using a graph
Using calculus
Numerical approximation
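As a sketch of the numerical approach, the likelihood can be evaluated over a grid of candidate values of $p$ and the maximizer read off directly. The shot counts below ($n = 3$ attempts, $x = 2$ makes) are illustrative assumptions, not the slides' actual numbers:

```python
import numpy as np

# Illustrative values (assumed, not from the slides):
# the player takes n = 3 shots and makes x = 2 of them.
n, x = 3, 2

# Likelihood of the data for each candidate value of p:
# L(p) = p^x * (1 - p)^(n - x)
p_grid = np.linspace(0.001, 0.999, 999)
likelihood = p_grid**x * (1 - p_grid)**(n - x)

# Numerical approximation of the MLE: the grid value maximizing L(p)
p_hat = p_grid[np.argmax(likelihood)]
print(round(p_hat, 3))  # close to x/n = 2/3
```

The precision of this approach is limited by the grid spacing; a finer grid (or a proper optimizer) gives a more accurate answer.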
What do you think is the approximate value of the MLE of $p$, based on the graph?
Use calculus to find the MLE of $p$.
Suppose the player shoots the ball $n$ times.
Suppose the player makes $x$ of the $n$ baskets.
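The calculus steps can be sketched for general counts, assuming the player makes $x$ of $n$ independent shots (symbols chosen here; the slides' specific numbers are not preserved):

```latex
\begin{aligned}
L(p) &= p^{x}(1-p)^{\,n-x} \\
\ell(p) = \log L(p) &= x \log p + (n - x)\log(1 - p) \\
\ell'(p) &= \frac{x}{p} - \frac{n - x}{1 - p} \overset{\text{set}}{=} 0
  \quad\Longrightarrow\quad \hat{p} = \frac{x}{n} \\
\ell''(p) &= -\frac{x}{p^{2}} - \frac{n - x}{(1 - p)^{2}} < 0
  \quad \text{(so } \hat{p} \text{ is a maximum)}
\end{aligned}
```

Working with the log-likelihood is the standard trick here: the log turns the product into a sum, and because the log is strictly increasing, the maximizer is unchanged.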
“Maximum likelihood estimation is, by far, the most popular technique for deriving estimators.” (Casella and Berger 2024, 315)
MLEs have nice statistical properties (more on this next class)
Consistent
Efficient
Asymptotically normal
Note
If the normality assumption holds, the least squares estimator is the maximum likelihood estimator for the coefficients $\beta_0$ and $\beta_1$.
Recall the linear model $\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\epsilon}$.
Suppose we have the simple linear regression (SLR) model $Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$,
such that $\epsilon_i \overset{iid}{\sim} N(0, \sigma^2)$.
We can write this model in the form $Y_i \mid X_i \sim N(\beta_0 + \beta_1 X_i,\ \sigma^2)$ and use this to find the MLEs.
Let $y_1, \ldots, y_n$ be the observed responses with predictor values $x_1, \ldots, x_n$.
The likelihood function for $\beta_0, \beta_1, \sigma^2$ is
$$L(\beta_0, \beta_1, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{(y_i - \beta_0 - \beta_1 x_i)^2}{2\sigma^2}\right\}$$
The log-likelihood function for $\beta_0, \beta_1, \sigma^2$ is
$$\log L(\beta_0, \beta_1, \sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(y_i - \beta_0 - \beta_1 x_i)^2$$
1️⃣ Take the derivative of the log-likelihood function with respect to $\beta_1$.
2️⃣ Find the value of $\beta_1$ where the derivative is equal to 0 (the critical point).
After a few steps…
3️⃣ We can use the second derivative to show we’ve found the maximum
Therefore, we have found the maximum. Thus, the MLE for $\beta_1$ is
$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2}$$
Note that this is the same as the least-squares estimator of $\beta_1$.
We can use a similar process to find the MLEs for $\beta_0$ and $\sigma^2$.
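One way to see numerically that the least-squares fit maximizes the normal likelihood is to evaluate the negative log-likelihood at the least-squares coefficients and at nearby perturbed values. A minimal sketch; the simulated sample size, true coefficients, and error SD below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated SLR data with normal errors (all settings are illustrative)
n = 100
x = rng.uniform(0, 10, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.5, n)

# Closed-form least-squares estimates
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

def neg_log_lik(beta0, beta1, sigma2):
    """Negative log-likelihood of the normal SLR model at given parameters."""
    resid = y - beta0 - beta1 * x
    return 0.5 * n * np.log(2 * np.pi * sigma2) + np.sum(resid**2) / (2 * sigma2)

# MLE for sigma^2 at the fitted coefficients (divides by n)
sigma2_mle = np.mean((y - b0 - b1 * x) ** 2)

# Any small perturbation of the coefficients increases the negative
# log-likelihood, so the least-squares fit sits at the maximum likelihood.
nll_ols = neg_log_lik(b0, b1, sigma2_mle)
for db in (-0.05, 0.05):
    assert neg_log_lik(b0 + db, b1, sigma2_mle) > nll_ols
    assert neg_log_lik(b0, b1 + db, sigma2_mle) > nll_ols
print("least-squares fit minimizes the negative log-likelihood")
```

The perturbation check is used instead of a general-purpose optimizer to keep the sketch dependency-light; it works because the log-likelihood depends on the coefficients only through the sum of squared residuals, which least squares minimizes.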
Note:
The MLEs $\hat{\beta}_0$ and $\hat{\beta}_1$ are the same as the least-squares estimators.
MLEs have nice properties, so this means the least-squares estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ have these properties as well, provided the normality assumption holds.
The MLE for $\sigma^2$ is $\tilde{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2$, which is biased; in practice we use the unbiased estimator $\hat{\sigma}^2 = \frac{1}{n-2}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2$.
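A quick simulation can illustrate the bias: repeatedly generate SLR data with a known error variance and compare the average of the variance estimator that divides by $n$ against the one that divides by $n - 2$. All simulation settings below (true variance, sample size, number of replications) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative settings: true sigma^2 = 4, small sample (n = 10)
n, sigma2_true, reps = 10, 4.0, 5000
x = np.linspace(0, 1, n)

mle_vals, unbiased_vals = [], []
for _ in range(reps):
    y = 1.0 + 2.0 * x + rng.normal(0.0, np.sqrt(sigma2_true), n)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    ssr = np.sum((y - b0 - b1 * x) ** 2)
    mle_vals.append(ssr / n)             # MLE: divides by n
    unbiased_vals.append(ssr / (n - 2))  # least squares: divides by n - 2

print(np.mean(mle_vals))       # systematically below the true value 4
print(np.mean(unbiased_vals))  # close to the true value 4
```

With these settings the average of the $1/n$ estimator lands near $\sigma^2 (n-2)/n = 3.2$, while the $1/(n-2)$ version averages near the true 4, which is exactly the bias the note above describes.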