I'm fitting a longitudinal model (93,000 observations within 24,700 persons) using an overdispersed Poisson distribution, as this fits my data best. Following advice in another topic on this forum, I added a quasi-level to a multilevel Poisson model to account for the extra variance.
When I switch to MCMC estimation to refine my estimates, they change drastically. I just want to make sure I'm not making a mistake somewhere, so any feedback on possible errors or misspecifications would be very welcome. I also assume the fixed-part estimates are on the log scale (log expected counts), given the Poisson log link? Otherwise some things definitely do not make sense.
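For what it's worth, here is how I'm currently reading the intercepts, assuming the fixed part is on the log scale (standard for a Poisson log link). The numbers are the ones from my results below, and Python is just used as a calculator:

```python
import math

# Intercepts from the empty model (log scale, assuming a Poisson log link)
b0_mql1 = 0.440   # RIGLS/MQL1 estimate
b0_mcmc = -0.108  # MCMC estimate

# Exponentiating gives the implied expected count for an observation
# whose random effects are all zero
print(round(math.exp(b0_mql1), 3))  # about 1.553
print(round(math.exp(b0_mcmc), 3))  # about 0.898
```

So the two methods imply rather different baseline counts, which is part of why I'm puzzled.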
Below is the code for the empty model, together with the estimates from each estimation method. Increasing the MCMC burn-in length and number of iterations does not change the results much, although the effective sample size (ESS) is quite small for the estimated parameters. In the full model, the substantive results are largely the same for the larger estimates.
Thanks for your time,
Bram
* Empty model, RIGLS/MQL1 estimation
runmlwin C cons, ///
    level3(id: cons) ///   person level
    level2(wave: cons) ///   quasi-level to absorb overdispersion
    level1(wave:) ///
    discrete(distribution(poisson)) rigls nopause

* Same model re-estimated by MCMC, using the RIGLS estimates as starting values
runmlwin C cons, ///
    level3(id: cons) ///
    level2(wave: cons) ///
    level1(wave:) ///
    discrete(distribution(poisson)) mcmc(on) initsprevious nopause
                                    RIGLS/MQL1      MCMC
Fixed part
  cons                             .440 (.007)    -.108 (.008)
Random part
  L3 (person level): var(cons)     .873 (.010)    1.181 (.016)
  L2 (overdispersion): var(cons)   .712 (.004)     .096 (.003)
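In case it helps to check my understanding of the quasi-level: as I read it, it adds a normally distributed residual on the log scale to every observation, so the counts end up more variable than a plain Poisson would allow. Here is a quick sketch of that data-generating process in plain Python (the parameter values are just my MQL1 estimates, and this is my reading of the model, not runmlwin's actual algorithm):

```python
import math
import random

def poisson(lam, rng):
    # Knuth's method: multiply uniforms until the product drops below exp(-lam)
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(2024)
b0 = 0.440                  # intercept on the log scale (MQL1 value)
sd_person = 0.873 ** 0.5    # level-3 (person) SD
sd_overdisp = 0.712 ** 0.5  # quasi-level (overdispersion) SD

counts = []
for person in range(2000):
    u = rng.gauss(0, sd_person)           # person random intercept
    for wave in range(4):
        e = rng.gauss(0, sd_overdisp)     # per-observation overdispersion term
        counts.append(poisson(math.exp(b0 + u + e), rng))

n = len(counts)
mean = sum(counts) / n
var = sum((y - mean) ** 2 for y in counts) / (n - 1)
print(mean, var)  # variance comes out well above the mean, i.e. overdispersed
```

If that sketch misrepresents what the quasi-level is doing in runmlwin, that might already explain my confusion.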