Every time I think I know what's going on, suddenly there's another layer of complications.
Sunday, May 7, 2017
Create an empty list and flatten the list
x <- c(2.3, 0.3, 5.2, 3.1, 1.1, 0.9, 2.0, 0.7, 1.4, 0.3)
y <- c(0.8, 2.8, 4.0, 2.4, 1.2, 0.0, 6.2, 1.5, 28.8, 0.7)
L <- vector("list", 10)   # create an empty list of length 10
for (i in 1:10) {
  L[[i]] <- y[i] - x      # the [[ ]] operator is the key: it assigns the whole vector y[i] - x to one list element
}
d <- unlist(L)            # flatten the list into one vector of all pairwise differences y[i] - x[j]
median(d)                 # the median of all pairwise differences is the two-sample Hodges-Lehmann estimator
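The same estimate can be obtained without the intermediate list: outer() builds the full 10 x 10 matrix of pairwise differences in one call. This is a minimal alternative sketch in base R, not part of the original post.

# Vectorized alternative (a sketch): outer(y, x, "-") returns the matrix with
# entries y[i] - x[j], so flattening it and taking the median gives the same
# Hodges-Lehmann estimate as the list-based loop above.
d2 <- as.vector(outer(y, x, "-"))
median(d2)   # identical to median(d)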
Friday, May 5, 2017
Hodges-Lehmann Estimation of Location Shift
data e1041;
input group $ x;
datalines;
1 2.3
1 0.3
1 5.2
1 3.1
1 1.1
1 0.9
1 2.0
1 0.7
1 1.4
1 0.3
2 0.8
2 2.8
2 4.0
2 2.4
2 1.2
2 0.0
2 6.2
2 1.5
2 28.8
2 0.7
;
run;
/* Hodges-Lehmann estimate of the location shift between the two groups:
   HL with ALPHA= gives asymptotic confidence limits, EXACT HL adds exact
   confidence limits; ODS SELECT keeps only the Wilcoxon scores and
   Hodges-Lehmann tables in the output. */
proc npar1way hl alpha=.05 data=e1041;
class group;
var x;
exact hl;
ods select WilcoxonScores HodgesLehmann;
run;
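As a cross-check outside SAS (a sketch, not part of the original post), base R's wilcox.test() with conf.int = TRUE reports the Hodges-Lehmann estimate of the location shift ("difference in location") together with a confidence interval for the same two samples.

# R cross-check (sketch): group 1 as x, group 2 as y, same data as above.
x <- c(2.3, 0.3, 5.2, 3.1, 1.1, 0.9, 2.0, 0.7, 1.4, 0.3)
y <- c(0.8, 2.8, 4.0, 2.4, 1.2, 0.0, 6.2, 1.5, 28.8, 0.7)
# conf.int = TRUE adds the Hodges-Lehmann location-shift estimate and its
# confidence interval to the Wilcoxon rank-sum test output.
wilcox.test(y, x, conf.int = TRUE, conf.level = 0.95)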
Tuesday, May 2, 2017
linear vs nonlinear
A regression model is called nonlinear if the derivatives of the model with respect to the model parameters depend on one or more of the parameters. This definition is essential for distinguishing nonlinear from curvilinear regression: a regression model is not necessarily nonlinear just because the graphed regression trend is curved.

A polynomial model such as y = b0 + b1*x + b2*x^2 + e appears curved when y is plotted against x. It is, however, not a nonlinear model. To see this, take derivatives of y with respect to the parameters b0, b1, and b2:

dy/db0 = 1, dy/db1 = x, dy/db2 = x^2

None of these derivatives depends on a model parameter, so the model is linear.

In contrast, consider the log-logistic model

y = d + (a - d)/(1 + exp{b*log(x/g)}) + e

Taking the derivative with respect to d, for example, gives

dy/dd = 1 - 1/(1 + exp{b*log(x/g)})

This derivative involves other parameters (b and g), hence the model is nonlinear.

Source: http://www.ats.ucla.edu/stat/sas/library/SASNLin_os.htm
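In R terms, the distinction is which fitting routine applies: the curved polynomial is still linear in b0, b1, b2 and fits with lm(), while the log-logistic requires a nonlinear fitter such as nls(). The sketch below uses simulated data and hypothetical parameter values; it only illustrates the definition above and is not from the linked page.

# Illustrative sketch (simulated data; parameter values are made up).
set.seed(1)
x <- seq(0.1, 10, length.out = 50)

# Polynomial model: curved in x but linear in the parameters b0, b1, b2,
# so ordinary least squares via lm() applies.
y_poly <- 1 + 2*x - 0.3*x^2 + rnorm(50, sd = 0.5)
fit_lm <- lm(y_poly ~ x + I(x^2))

# Log-logistic model: the derivatives with respect to d, a, b, g involve
# the parameters themselves, so a nonlinear fitter such as nls() is needed.
y_ll <- 1 + (8 - 1)/(1 + exp(2*log(x/5))) + rnorm(50, sd = 0.3)
fit_nls <- nls(y_ll ~ d + (a - d)/(1 + exp(b*log(x/g))),
               start = list(d = 1, a = 8, b = 2, g = 5))

coef(fit_lm)
coef(fit_nls)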