Markov chain Monte Carlo (MCMC) simulations, such as the Gibbs sampler and the Metropolis algorithm, are widely used techniques for modelling stochastic problems in decision making. Like all other Monte Carlo approaches, MCMC exploits the law of large numbers via repeated random sampling. Samples are generated by running a Markov chain constructed so that its stationary distribution closely matches the target distribution, with candidate states drawn from a proposal distribution. This paper discusses the fundamentals of MCMC methods, including algorithm selection, optimizations, and efficient approaches for fitting generalized linear mixed models. A further aim is to highlight the use of the EM algorithm to obtain accurate maximum likelihood estimates in the context of generalized linear mixed models.
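As a minimal illustration of the sampling mechanism summarized above, the sketch below (not taken from the paper) implements a random-walk Metropolis sampler in Python; the function names, the Gaussian proposal, and the standard-normal example target are assumptions chosen only for brevity.

```python
import numpy as np

def metropolis(log_target, initial, proposal_scale=1.0, n_samples=10_000, rng=None):
    """Random-walk Metropolis sampler for an unnormalized log target density."""
    rng = np.random.default_rng() if rng is None else rng
    x = float(initial)
    samples = np.empty(n_samples)
    log_p = log_target(x)
    for i in range(n_samples):
        # Draw a candidate from a symmetric (Gaussian) proposal distribution.
        candidate = x + proposal_scale * rng.standard_normal()
        log_p_candidate = log_target(candidate)
        # Accept with probability min(1, target(candidate) / target(x)).
        if np.log(rng.random()) < log_p_candidate - log_p:
            x, log_p = candidate, log_p_candidate
        samples[i] = x
    return samples

# Example: sample a standard normal target; after discarding burn-in, the
# empirical mean and variance approach 0 and 1 by the law of large numbers.
chain = metropolis(lambda x: -0.5 * x**2, initial=0.0, proposal_scale=2.0)
print(chain[1000:].mean(), chain[1000:].var())
```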