Saturday, November 9, 2019

BLUP for mixed models

http://rpubs.com/enwuliu/548305

This is R code for the example from the paper "That BLUP Is a Good Thing: The Estimation of Random Effects".

Robinson, George K. "That BLUP Is a Good Thing: The Estimation of Random Effects." Statistical Science 6, no. 1 (1991): 15-32.

In the book Generalized Additive Models: An Introduction with R, second edition, page 81, the function llm that solves the linear mixed model also uses Henderson's mixed-model equations, although the book does not write the equations out.
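For reference, Henderson's mixed-model equations for the model y = X*beta + Z*u + e, with u ~ N(0, G) and e ~ N(0, R), are (in the notation of the code below):

[ X'R^-1 X    X'R^-1 Z        ] [ beta_hat ]   [ X'R^-1 y ]
[ Z'R^-1 X    Z'R^-1 Z + G^-1 ] [ u_hat    ] = [ Z'R^-1 y ]

The first code block builds this coefficient matrix (M) and right-hand side (N) block by block and solves the system for beta_hat and u_hat.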





mixed_model_blup.R
enliu
2019-11-10
Y<-matrix(c(110,100,110,100,100,110,110,100,100),nrow=9,ncol=1)   # response vector (9 observations)
X<-matrix(c(1,1,0,0,0,0,0,0,0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,1,1,1,1),nrow=9,ncol=3)   # fixed-effect design matrix (3 levels)

# Random-effect design matrix (4 levels)
Z<-matrix(c(1,0,0,0,0,0,0,0,0,
            0,0,1,0,0,0,0,0,0,
            0,0,0,0,0,1,1,0,0,
            0,1,0,1,1,0,0,1,1),nrow=9,ncol=4)
R<-diag(1,nrow=9,ncol=9)     # residual covariance matrix
G<-diag(0.1,nrow=4,ncol=4)   # random-effect covariance matrix (0.1 times the identity)
# Blocks of the coefficient matrix of Henderson's mixed-model equations
A<-t(X)%*%solve(R)%*%X
B<-t(X)%*%solve(R)%*%Z
C<-t(Z)%*%solve(R)%*%X
D<-t(Z)%*%solve(R)%*%Z+solve(G)

E<-cbind(A,B)
F<-cbind(C,D)

M<-rbind(E,F)   # full coefficient matrix

# Right-hand side blocks X'R^-1 Y and Z'R^-1 Y
t(X)%*%solve(R)%*%Y
##      [,1]
## [1,]  210
## [2,]  310
## [3,]  420
t(Z)%*%solve(R)%*%Y
##      [,1]
## [1,]  110
## [2,]  110
## [3,]  220
## [4,]  500
# Stack the right-hand side blocks and solve Henderson's equations
N<-rbind(t(X)%*%solve(R)%*%Y,t(Z)%*%solve(R)%*%Y)

solve(M,N)   # first 3 rows: fixed effects beta_hat; last 4 rows: random-effect BLUPs u_hat
##             [,1]
## [1,] 105.6386574
## [2,] 104.2757378
## [3,] 105.4584366
## [4,]   0.3964857
## [5,]   0.5203875
## [6,]   0.7569272
## [7,]  -1.6738004
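As a check, the same estimates can be obtained without forming Henderson's equations, using the marginal covariance V = Z G Z' + R and the usual GLS/BLUP formulas. A minimal sketch using the objects defined above (the names V, beta_hat and u_hat are mine):

V <- Z %*% G %*% t(Z) + R                                               # marginal covariance of Y
beta_hat <- solve(t(X) %*% solve(V) %*% X, t(X) %*% solve(V) %*% Y)     # GLS estimate of the fixed effects
u_hat <- G %*% t(Z) %*% solve(V) %*% (Y - X %*% beta_hat)               # BLUP of the random effects
rbind(beta_hat, u_hat)                                                  # should reproduce solve(M, N) above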



######### Another method, from Generalized Additive Models: An Introduction with R, second edition, p. 81 #########


llm <- function(theta,X,Z,y) {
  ## untransform parameters...
  sigma.b <- exp(theta[1])
  sigma <- exp(theta[2])
  ## extract dimensions...
  n <- length(y); pr <- ncol(Z); pf <- ncol(X)
  ## obtain \hat \beta, \hat b...
  X1 <- cbind(X,Z)
  ipsi <- c(rep(0,pf),rep(1/sigma.b^2,pr))
 
  b1 <- solve(crossprod(X1)/sigma^2+diag(ipsi),t(X1)%*%y/sigma^2)
 
  ## compute log|Z'Z/sigma^2 + I/sigma.b^2|...
 
  ldet <- sum(log(diag(chol(crossprod(Z)/sigma^2 +
                              diag(ipsi[-(1:pf)])))))
  ## compute log profile likelihood...
  l <- (-sum((y-X1%*%b1)^2)/sigma^2 - sum(b1^2*ipsi) -n*log(sigma^2) - pr*log(sigma.b^2) - 2*ldet - n*log(2*pi))/2
  attr(l,"b") <- as.numeric(b1) ## return \hat beta and \hat b
  -l
}

y<-matrix(c(110,100,110,100,100,110,110,100,100),nrow=9,ncol=1)
x<-matrix(c(1,1,0,0,0,0,0,0,0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,1,1,1,1),nrow=9,ncol=3)

z<-matrix(c(1,0,0,0,0,0,0,0,0,
            0,0,1,0,0,0,0,0,0,
            0,0,0,0,0,1,1,0,0,
            0,1,0,1,1,0,0,1,1),nrow=9,ncol=4)


X1 <- cbind(x,z)

crossprod(X1)
##      [,1] [,2] [,3] [,4] [,5] [,6] [,7]
## [1,]    2    0    0    1    0    0    1
## [2,]    0    3    0    0    1    0    2
## [3,]    0    0    4    0    0    2    2
## [4,]    1    0    0    1    0    0    0
## [5,]    0    1    0    0    1    0    0
## [6,]    0    0    2    0    0    2    0
## [7,]    1    2    2    0    0    0    5
# These quantities are also computed inside llm(); they are reproduced here
# only to show the pieces of the calculation.
n <- length(y); pr <- ncol(z); pf <- ncol(x)

theta<-c(log(sqrt(0.1)),log(1))   # log(sigma.b), log(sigma): sigma.b^2/sigma^2 = 0.1 matches G above

sigma.b <- exp(theta[1])
sigma <- exp(theta[2])

ipsi <- c(rep(0,pf),rep(1/sigma.b^2,pr))   # zero penalty for fixed effects, 1/sigma.b^2 for random effects


llm(theta,x,z,y)
## [1] 92.34967
## attr(,"b")
## [1] 105.6386574 104.2757378 105.4584366   0.3964857   0.5203875   0.7569272
## [7]  -1.6738004





crossprod function



http://www.endmemo.com/program/R/crossprod.php

R crossprod Function


crossprod(x, y) returns the matrix cross-product t(x) %*% y, and tcrossprod(x, y) returns x %*% t(y).
crossprod(x, y = NULL)
tcrossprod(x, y = NULL)

x: numeric matrix
y: numeric matrix; if y = NULL, y is taken to be x
...

> x <- matrix(1:9,3,3)
> x
     [,1] [,2] [,3]
[1,]    1    4    7
[2,]    2    5    8
[3,]    3    6    9

> crossprod(x)
     [,1] [,2] [,3]
[1,]   14   32   50
[2,]   32   77  122
[3,]   50  122  194

> tcrossprod(x)
     [,1] [,2] [,3]
[1,]   66   78   90
[2,]   78   93  108
[3,]   90  108  126
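
A quick check of the identities behind crossprod() and tcrossprod() (the all.equal comparisons are added here for illustration):

x <- matrix(1:9, 3, 3)
all.equal(crossprod(x), t(x) %*% x)     # TRUE: crossprod(x) equals t(x) %*% x
all.equal(tcrossprod(x), x %*% t(x))    # TRUE: tcrossprod(x) equals x %*% t(x)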

Monday, November 4, 2019

Output the variance-covariance matrix from PROC GENMOD

proc genmod data=outmi;
   /* COVB requests the estimated covariance matrix of the parameter estimates */
   model Oxygen= RunTime RunPulse/covb;
   /* Save the parameter estimates, parameter information, and covariance matrix as data sets */
   ods output ParameterEstimates=gmparms
              ParmInfo=gmpinfo
              CovB=gmcovb;
run;
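For comparison only, a hypothetical R sketch of the same task: fit a normal-response model and extract the covariance matrix of the parameter estimates with vcov(). The data frame outmi and its variables are assumed to exist as an R data frame here, mirroring the SAS names above:

fit <- glm(Oxygen ~ RunTime + RunPulse, data = outmi)    # gaussian family by default
vcov(fit)                  # covariance matrix of the parameter estimates (analogue of the CovB table)
summary(fit)$coefficients  # parameter estimates (analogue of the ParameterEstimates table)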