Lecture 18 Code

Binary Simulation

Let $Y_i \overset{iid}{\sim} \text{Bernoulli}(p_i)$ with $p_i = g^{-1}(X_i^T\beta)$

logit <- \(x){log(x/(1-x))}                           # <1>
inv_logit <- \(x){exp(x)/(1+exp(x))}                  # <2>

x <- rnorm(1000, 1)                                   # <3>

eta <- 3 - 8 * x                                      # <4>

y <- sapply(eta, \(x) rbinom(1, 1, inv_logit(x)))     # <5>
1. Creating the logit function
2. Creating the inverse logit function
3. Simulating 1000 predictors from N(1, 1)
4. Constructing the linear predictor
5. Simulating the binary outcomes
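
A side note on the outcome simulation in <5>: rbinom() accepts a vector of probabilities, so the sapply() call could equivalently be replaced by a single vectorized draw. A minimal sketch under that assumption (y_alt is just an illustrative name):

y_alt <- rbinom(length(eta), 1, inv_logit(eta))    # one Bernoulli draw per element of eta

Both versions simulate the same model; the vectorized form is simply faster.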

Binary Model

Using $g(\mu)=\log\left(\frac{\mu}{1-\mu}\right)$ and $f(y)=p^{y}(1-p)^{1-y}$, find the maximum likelihood estimates for the simulated data from the previous slide using the optim function.

$$\ell(\beta)=\sum_{i=1}^{n} y_i\log(p_i)+(1-y_i)\log(1-p_i)$$

  • $p_i=g^{-1}(X_i^T\beta)$
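
For reference, this is just the log of the product of the Bernoulli densities given above; a short derivation in the same notation:

$$\ell(\beta)=\log\prod_{i=1}^{n} p_i^{y_i}(1-p_i)^{1-y_i}=\sum_{i=1}^{n}\left[y_i\log(p_i)+(1-y_i)\log(1-p_i)\right]$$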

Code

llb <- \(x, y, beta){                                 # <1>
  eta <- beta[1] + beta[2] * x                        # <2>
  p <- inv_logit(eta)                                 # <3>
  ll <- y * log(p) + (1 - y) * log(1 - p)             # <4>
  return(-sum(ll))                                    # <5>
}

optim(c(0,0), llb, x = x, y = y)                      # <6>
1. Creating a function called llb for the (negative) log-likelihood
2. Constructing the linear predictor
3. Applying the inverse logit to obtain the probabilities
4. Obtaining the individual log-likelihood contributions
5. Returning the negative of the summed log-likelihood over all observations, since optim minimizes by default
6. Using optim to find $\hat{\beta}$
$par
[1]  2.697667 -7.624155

$value
[1] 137.0459

$counts
function gradient 
      67       NA 

$convergence
[1] 0

$message
NULL
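
As a cross-check, the same estimates can be recovered with R's built-in glm(), and optim() can also return the Hessian of the negative log-likelihood, whose inverse gives approximate standard errors. A sketch assuming the x, y, and llb objects defined above:

fit <- optim(c(0, 0), llb, x = x, y = y, hessian = TRUE)    # same fit, also returning the Hessian
sqrt(diag(solve(fit$hessian)))                              # approximate standard errors for the estimates

glm(y ~ x, family = binomial)$coefficients                  # should closely match fit$par

Both approaches maximize the same likelihood, so the coefficients should agree up to numerical tolerance.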