Illustrated using Julia
An example of estimating regression coefficients in a linear model via maximum likelihood, using Julia.
```julia
using Distributions
using Random
using Optim
using GLM
```
Generate some fake data
```julia
Random.seed!(0408)

#x data
𝐗 = hcat(ones(1000), randn(1000, 3))

#ground truth betas
𝚩 = [.5, 1, 2, 3]

#multiply data by betas
f₁(X) = X*𝚩

#make some error
ϵ = rand(Normal(0, .5), size(𝐗)[1])

#generate y
y = f₁(𝐗) + ϵ;
```
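As a quick sanity check (not part of the original post), the simulated noise can be inspected to confirm it behaves like the `Normal(0, .5)` it was drawn from. This sketch regenerates the same data under the same seed:

```julia
using Distributions, Random, Statistics

Random.seed!(0408)
𝐗 = hcat(ones(1000), randn(1000, 3))   # same design matrix as above
ϵ = rand(Normal(0, .5), size(𝐗, 1))    # same noise draw

# sample moments should sit near the population values (0 and 0.5)
mean(ϵ), std(ϵ)
```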
Define a function to optimize
```julia
function mle_lm(x, y, params)
    b = params[begin:end-1]
    σ = params[end]
    ŷ = x*b
    residuals = y .- ŷ
    ll = -loglikelihood(Normal(0, σ), residuals)
    return ll
end
```

mle_lm (generic function with 1 method)
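`loglikelihood` from Distributions is just the sum of the per-observation log densities, so the objective above is the familiar closed-form normal log-likelihood of the residuals with mean zero. A small sketch confirming the equivalence (`normal_ll` is an illustrative name, not from the original):

```julia
using Distributions

# closed-form log-likelihood of residuals r under Normal(0, σ):
# -n/2·log(2πσ²) - Σrᵢ²/(2σ²)
normal_ll(r, σ) = -length(r)/2 * log(2π * σ^2) - sum(abs2, r) / (2σ^2)

r = [0.1, -0.3, 0.25]
σ = 0.5
isapprox(normal_ll(r, σ), loglikelihood(Normal(0, σ), r))
```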
Run the optimization
```julia
start_params = [0.2, .5, 1, 1, 1]
res = optimize(params -> mle_lm(𝐗, y, params), start_params)
Optim.minimizer(res)
```

5-element Vector{Float64}:
 0.5220526008168841
 0.9925044101015756
 1.9951661029827337
 2.9979617225853903
 0.5170359122024899
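The first four elements recover the true betas and the last estimates σ (true value .5). With no gradient supplied, `optimize` defaults to Nelder-Mead, which can in principle step into negative σ and make `Normal(0, σ)` throw. A common guard (a variant I'm adding, not from the original) is to search over log σ instead, so the scale stays strictly positive:

```julia
using Distributions, Optim, Random

Random.seed!(0408)
X = hcat(ones(1000), randn(1000, 3))
y = X * [.5, 1, 2, 3] + rand(Normal(0, .5), 1000)

# same objective, but σ = exp(logσ) keeps the scale strictly positive
function nll(params)
    b, logσ = params[begin:end-1], params[end]
    return -loglikelihood(Normal(0, exp(logσ)), y .- X * b)
end

res = optimize(nll, [0.2, .5, 1, 1, 0.0])   # Nelder-Mead by default
est = Optim.minimizer(res)
σ̂ = exp(est[end])                            # back-transform to σ
```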
Check against ‘base’ Julia solution
```julia
𝐗 \ y
```

4-element Vector{Float64}:
 0.5220524130050493
 0.9925035386795751
 1.9951668631756507
 2.9979619869409357
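GLM is loaded above but never used in this snippet; its `lm` fit on the raw design matrix gives one more cross-check on the coefficients (a sketch, assuming GLM's matrix method for `lm`):

```julia
using GLM, Distributions, Random

Random.seed!(0408)
X = hcat(ones(1000), randn(1000, 3))
y = X * [.5, 1, 2, 3] + rand(Normal(0, .5), 1000)

ols = lm(X, y)   # ordinary least squares via GLM
coef(ols)        # should agree with X \ y
```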