Friday, October 9, 2015

A gem of a comment regarding likelihood optimization

An interesting comment on carrying out likelihood optimization (the frequentist approach), from http://www.petrkeil.com/?p=393:
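The code below fits a normal linear regression by writing down its log-likelihood and maximising it numerically. For reference, the model and the quantity being maximised are

$$ y_i = a + b\,x_i + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2), $$

$$ \ell(a, b, \sigma) = \sum_{i=1}^{n} \log\!\left[ \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left( -\frac{(y_i - a - b\,x_i)^2}{2\sigma^2} \right) \right] = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n} (y_i - a - b\,x_i)^2. $$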

R code:
### The following R code does just this (maximises the log-likelihood numerically) for a normal linear regression (the model above):
# Generate some data from a known model
a <- 5        # true intercept
b <- 2        # true slope
sigma <- 0.5  # true residual standard deviation
x <- runif(1000, 0, 20)
y <- a + (b * x) + rnorm(1000) * sigma
dat <- data.frame(x = x, y = y)
plot(dat$x, dat$y)
 
# Write the log-likelihood function for the statistical model
# (in this case, normal linear regression: the residuals y - a - b*x are N(0, sigma^2))
loglike <- function(par, dat)
{
  a     <- par['a']
  b     <- par['b']
  sigma <- par['sigma']

  # Sum of the log densities of the residuals = log-likelihood of the data
  return(sum(dnorm(dat$y - a - (b * dat$x), mean = 0, sd = sigma, log = TRUE)))
}
 
# Maximise the log-likelihood; control = list(fnscale = -1) makes optim maximise
# rather than minimise, and L-BFGS-B allows box constraints on the parameters
res <- optim(par = c(a = 1.5, b = 1.5, sigma = 0.6), fn = loglike, dat = dat,
             method = 'L-BFGS-B', lower = rep(0.2, 3),
             upper = c(a = 5, b = 5, sigma = 2), control = list(fnscale = -1))
res$value  # maximised log-likelihood
res$par    # maximum-likelihood estimates of a, b and sigma
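
A natural extension, not part of the original comment, is to get approximate standard errors to set alongside the lm() output further down: minimising the negative log-likelihood with hessian = TRUE returns the observed information matrix, and the square roots of the diagonal of its inverse approximate the standard errors of the estimates. A minimal sketch, reusing loglike() and dat from above (the names negloglike and fit are just placeholders, and only sigma is bounded away from zero):

# Re-fit by minimising the negative log-likelihood so that the returned Hessian
# is the observed information matrix evaluated at the maximum-likelihood estimates
negloglike <- function(par, dat) -loglike(par, dat)
fit <- optim(par = c(a = 1.5, b = 1.5, sigma = 0.6), fn = negloglike, dat = dat,
             method = 'L-BFGS-B', lower = c(a = -Inf, b = -Inf, sigma = 1e-3),
             hessian = TRUE)

# Approximate standard errors: square roots of the diagonal of the inverse
# observed information (sensible only when the solution is away from the bounds)
sqrt(diag(solve(fit$hessian)))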
 
# Plot the fitted result over the data to see if it looks like a good fit
# (which we know it will be, since the data were simulated from the model)
x <- seq(from = min(dat$x), to = max(dat$x), by = 0.1)
y <- res$par['a'] + (res$par['b'] * x)
plot(dat$x, dat$y, type = 'p')
points(x, y, type = 'l', col = 'red')
 
 
### And compare to lm()
lmfit <- lm(y ~ x, dat)
summary(lmfit)$coeff  # least-squares estimates and standard errors
res$par               # maximum-likelihood estimates from optim
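
The intercept and slope from optim should track lm() closely (up to the box constraints and numerical tolerance), while the two sigma estimates differ slightly by construction: the maximum-likelihood estimate divides the residual sum of squares by n, whereas lm()'s residual standard error divides by n - 2. A quick check of that relationship:

summary(lmfit)$sigma                                      # residual standard error (denominator n - 2)
res$par['sigma']                                          # maximum-likelihood estimate (denominator n)
summary(lmfit)$sigma * sqrt((nrow(dat) - 2) / nrow(dat))  # lm() estimate rescaled to the ML convention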