Maximum Likelihood Estimation - Linear Regression

Illustrated using Julia

Published December 9, 2022

An example of estimating regression coefficients in a linear model via maximum likelihood, using Julia.
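For reference (notation added here; the post itself jumps straight to code), the model being fit is

$$
y = X\beta + \epsilon, \qquad \epsilon \sim \mathrm{Normal}(0, \sigma^2 I),
$$

and maximizing the likelihood over $\beta$ and $\sigma$ is equivalent to minimizing the negative log-likelihood

$$
-\ell(\beta, \sigma) = \frac{n}{2}\log(2\pi\sigma^2) + \frac{1}{2\sigma^2}\sum_{i=1}^{n}\left(y_i - x_i^\top \beta\right)^2,
$$

which is exactly what the objective function defined below computes.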

using Distributions
using Random
using Optim
using GLM

Generate some fake data

Random.seed!(0408)

#x data
𝐗 = hcat(ones(1000), randn(1000, 3))

#ground truth betas
𝚩 = [.5, 1, 2, 3]

#multiply data by betas
f₁(X) = X*𝚩

#make some error
ϵ = rand(Normal(0, .5), size(𝐗)[1])

#generate y
y = f₁(𝐗) + ϵ;
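As a quick sanity check (a sketch, assuming the variables above are in scope), the standard deviation of the simulated errors should land near the 0.5 used to generate them:

using Statistics  #std is also re-exported by Distributions

#should be close to the 0.5 used for ϵ above
std(y - f₁(𝐗))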

Define a function to optimize. Optim minimizes, so the function returns the *negative* log-likelihood of the residuals under a Normal(0, σ) distribution; minimizing it is equivalent to maximizing the likelihood.

function mle_lm(x, y, params)
    #split the parameter vector into coefficients and residual sd
    b = params[begin:end-1]
    σ = params[end]

    #predicted values
    ŷ = x*b

    residuals = y .- ŷ

    #negative log-likelihood of the residuals under Normal(0, σ)
    ll = -loglikelihood(Normal(0, σ), residuals)

    return ll
end
mle_lm (generic function with 1 method)
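As a quick check (a sketch, reusing the data generated above), the negative log-likelihood should be lower at the ground-truth parameters than at an arbitrary guess:

#lower is better: the ground-truth parameters should beat the start values
@show mle_lm(𝐗, y, [.5, 1, 2, 3, .5])    #true betas and error sd
@show mle_lm(𝐗, y, [0.2, .5, 1, 1, 1]);  #starting values used below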

Run the optimization

#starting values: four betas plus the residual sd σ
start_params = [0.2, .5, 1, 1, 1]

res = optimize(params -> mle_lm(𝐗, y, params), start_params)

Optim.minimizer(res)
5-element Vector{Float64}:
 0.5220526008168841
 0.9925044101015756
 1.9951661029827337
 2.9979617225853903
 0.5170359122024899
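The first four entries recover the ground-truth betas and the last is the estimated residual standard deviation. With no algorithm specified, optimize defaults to Nelder-Mead; a sketch of a gradient-based alternative using forward-mode automatic differentiation (same objective and data as above) might look like:

#BFGS with ForwardDiff gradients; assumes σ stays positive during the
#search — reparameterizing with log(σ) would be the safer fix
res_bfgs = optimize(params -> mle_lm(𝐗, y, params), start_params, BFGS();
                    autodiff = :forward)
Optim.minimizer(res_bfgs)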

Check against the ‘base’ Julia solution. The backslash operator solves the least-squares problem directly, and under normal errors the maximum-likelihood estimates of the coefficients coincide with the least-squares solution, so the two should agree up to optimizer tolerance.

𝐗 \ y
4-element Vector{Float64}:
 0.5220524130050493
 0.9925035386795751
 1.9951668631756507
 2.9979619869409357
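Since GLM is already loaded, the coefficients can also be cross-checked against a fitted linear model (a sketch; lm accepts a design matrix and a response vector directly):

#fit by ordinary least squares via GLM and extract the coefficients
coef(lm(𝐗, y))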