rnmixGibbs {bayesm}    R Documentation

Gibbs Sampler for Normal Mixtures

Description

rnmixGibbs implements a Gibbs Sampler for normal mixtures.

Usage

rnmixGibbs(Data, Prior, Mcmc)

Arguments

Data list(y)
Prior list(Mubar,A,nu,V,a,ncomp) (only ncomp required)
Mcmc list(R,keep) (R required)

Details

Model:
y_i ~ N(mu_{ind_i}, Sigma_{ind_i}).
ind ~ iid multinomial(p). p is an ncomp x 1 vector of probs.

Priors:
mu_j ~ N(mubar, Sigma_j (x) A^{-1}), where mubar = vec(Mubar).
Sigma_j ~ IW(nu, V).
note: this is the natural conjugate prior, a special case of multivariate regression.
p ~ Dirichlet(a).
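
A minimal sketch of one draw from these priors, assuming k-dimensional data, a scalar precision A, and a given Sigma_j (the IW draw for Sigma_j is omitted); the Dirichlet draw uses the standard Gamma-normalization construction:

k = 2; ncomp = 3
A = 0.01; a = rep(1,ncomp); mubar = rep(0,k)
Sigma_j = diag(k)                              # placeholder for an IW(nu,V) draw
g = rgamma(ncomp,shape=a)                      # p ~ Dirichlet(a) via normalized Gammas
p = g/sum(g)
mu_j = mubar + t(chol(Sigma_j/A))%*%rnorm(k)   # mu_j | Sigma_j ~ N(mubar,Sigma_j/A)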

Output of the components is in the form of a list of lists:
compdraw[[i]] is the ith draw, a list of ncomp lists.
compdraw[[i]][[j]] is the list of parms for the jth normal component.
If jcomp=compdraw[[i]][[j]], then the jth comp ~ N(jcomp[[1]],Sigma), Sigma = t(R)%*%R, R^{-1} = jcomp[[2]].
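
For example, to recover the mean and covariance matrix of the jth component from the ith kept draw (a sketch; out denotes the list returned by rnmixGibbs, and i, j are placeholder indices):

jcomp = out$compdraw[[i]][[j]]
mu = jcomp[[1]]                  # component mean
Rinv = jcomp[[2]]                # inverse of the upper-triangular Cholesky root R
Rchol = solve(Rinv)              # R such that Sigma = t(R)%*%R
Sigma = t(Rchol)%*%Rchol         # component covariance matrix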

List arguments contain:

y: n x k matrix of k-dimensional observations
Mubar: 1 x k prior mean for the normal component means (mubar = vec(Mubar))
A: 1 x 1 prior precision parameter
nu: d.f. parameter for the IW prior on Sigma_j
V: location matrix for the IW prior on Sigma_j
a: ncomp x 1 vector of Dirichlet prior parameters
ncomp: number of normal components to be fit
R: number of MCMC draws
keep: MCMC thinning parameter, keep every keepth draw

Value

a list containing:

probdraw  R/keep x ncomp array of mixture probability draws
zdraw     R/keep x nobs array of indicators of mixture component identity for each obs
compdraw  list of R/keep lists of lists of component parameter draws
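
For instance, posterior means of the mixture probabilities and the most frequent component assignment of a given observation can be read off these arrays (a sketch, with out = rnmixGibbs(Data,Prior,Mcmc)):

pmean = apply(out$probdraw,2,mean)       # posterior means of mixture probs
z1 = out$zdraw[,1]                       # draws of the indicator for obs 1
as.integer(names(which.max(table(z1))))  # its most frequent component label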

Note

In this model, the component normal parameters are not identified due to label-switching. However, the fitted mixture-of-normals density is identified, as it is invariant to label-switching. See Allenby, McCulloch, and Rossi, Chapter 5, for details. Use eMixMargDen or momMix to compute the posterior expectation or distribution of various identified parameters.
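
As an illustration, the fitted marginal densities can be averaged over post burn-in draws with eMixMargDen (a sketch continuing the Examples below, where out and begin are defined; the grid is arbitrary, and the call assumes the eMixMargDen(grid,probdraw,comps) interface with one column of grid ordinates per dimension):

grid = matrix(seq(-5,20,length=100),ncol=1)[,rep(1,5)]   # 100 ordinates for each of 5 dims
mden = eMixMargDen(grid,out$probdraw[begin:R,],out$compdraw[begin:R])
plot(grid[,1],mden[,1],type="l")         # fitted marginal density, first dimension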

Author(s)

Peter Rossi, Graduate School of Business, University of Chicago, Peter.Rossi@ChicagoGsb.edu.

References

For further discussion, see Bayesian Statistics and Marketing by Allenby, McCulloch, and Rossi, Chapter 5.
http://gsbwww.uchicago.edu/fac/peter.rossi/research/bsm.html

See Also

rmixture, rmixGibbs, eMixMargDen, momMix

Examples

## simulate data from a mixture of normals and recover it with rnmixGibbs
if(nchar(Sys.getenv("LONG_TEST")) != 0) {R=2000} else {R=10}   # short run unless LONG_TEST is set

set.seed(66)
dim=5;  k=3   # dimension of simulated data and number of "true" components
sigma = matrix(rep(0.5,dim^2),nrow=dim);diag(sigma)=1
sigfac = c(1,1,1);mufac=c(1,2,3); compsmv=list()
for(i in 1:k) compsmv[[i]] = list(mu=mufac[i]*1:dim,sigma=sigfac[i]*sigma)
comps = list() # change to "rooti" scale
for(i in 1:k) comps[[i]] = list(mu=compsmv[[i]][[1]],rooti=solve(chol(compsmv[[i]][[2]])))
pvec=(1:k)/sum(1:k)

nobs=5000
dm = rmixture(nobs,pvec,comps)   # draw nobs observations from the true mixture

Data=list(y=dm$x)
ncomp=9                          # fit more components than the true k=3
Prior=list(ncomp=ncomp,a=c(rep(1,ncomp)))
Mcmc=list(R=R,keep=1)
out=rnmixGibbs(Data=Data,Prior=Prior,Mcmc=Mcmc)

tmom=momMix(matrix(pvec,nrow=1),list(comps))   # moments implied by the true mixture
if(R < 1000) {begin=1} else {begin=500}        # discard burn-in on long runs
pmom=momMix(out$probdraw[begin:R,],out$compdraw[begin:R])   # posterior expectation of moments
mat=rbind(tmom$mu,pmom$mu)
rownames(mat)=c("true","post expect")
cat(" mu and posterior expectation of mu",fill=TRUE)
print(mat)
mat=rbind(tmom$sd,pmom$sd)
rownames(mat)=c("true","post expect")
cat(" std dev and posterior expectation of sd",fill=TRUE)
print(mat)
mat=rbind(as.vector(tmom$corr),as.vector(pmom$corr))
rownames(mat)=c("true","post expect")
cat(" corr and posterior expectation of corr",fill=TRUE)
print(t(mat))


[Package bayesm version 1.1-1 Index]