question on MCMC in ML overdispersed poisson

Welcome to the forum for runmlwin users. Feel free to post your question about runmlwin here. The Centre for Multilevel Modelling takes no responsibility for the accuracy of these posts, as we are unable to monitor them closely. Do go ahead and post your question, and thank you in advance if you find the time to post any answers!

Go to runmlwin: Running MLwiN from within Stata >> http://www.bristol.ac.uk/cmm/software/runmlwin/
BramVanhoutte
Posts: 3
Joined: Wed Sep 26, 2012 11:59 am

question on MCMC in ML overdispersed poisson

Post by BramVanhoutte »

Dear,

I'm fitting a longitudinal model (93,000 observations within 24,700 persons) using an overdispersed Poisson distribution, as this fits my data best. To account for the extra variance, I added a quasi-level to a multilevel Poisson model, as suggested in another topic on this forum.
When I switch to MCMC estimation to refine my estimates, they change drastically. I just want to make sure I'm not making a mistake somewhere, so it would be great if you could give me some feedback on possible mistakes or misspecifications. I also assume the fixed-part estimates are reported in log odds? Otherwise some things definitely do not make sense.

Below is the code for the empty model, together with the estimates from each estimation method. Increasing the burn-in and the number of iterations does not alter things much in MCMC, although the ESS is quite small for the estimated parameters. In the full model the substantive conclusions are largely the same for the larger estimates.
Thanks for your time,
Bram

Code:

// RIGLS (MQL1) estimation of the empty overdispersed Poisson model
runmlwin C cons, ///
    level3(id: cons) ///
    level2(wave: cons) ///
    level1(wave:) discrete(distribution(poisson)) rigls nopause

// Refit by MCMC, starting from the RIGLS estimates
runmlwin C cons, ///
    level3(id: cons) ///
    level2(wave: cons) ///
    level1(wave:) discrete(distribution(poisson)) mcmc(on) initsprevious nopause
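
When I say I increased the burn-in and iterations, I mean a longer run along these lines (the burnin() and chain() suboptions follow my reading of the runmlwin help for mcmc(); the values here are just illustrative, not the ones I settled on):

Code:

// Longer MCMC run: burn-in and chain lengths below are illustrative assumptions
runmlwin C cons, ///
    level3(id: cons) ///
    level2(wave: cons) ///
    level1(wave:) discrete(distribution(poisson)) ///
    mcmc(burnin(5000) chain(50000)) initsprevious nopause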


                                        RIGLS/MQL1      MCMC
Fixed part
  cons                                  .440 (.007)     -.108 (.008)

Random part
  Level 3 (person): var(cons)           .873 (.010)     1.181 (.016)
  Level 2 (overdispersion): var(cons)   .712 (.004)     .096 (.003)
GeorgeLeckie
Site Admin
Posts: 432
Joined: Fri Apr 01, 2011 2:14 pm

Re: question on MCMC in ML overdispersed poisson

Post by GeorgeLeckie »

Hi,

Yes, as far as I can see, this all looks fine.

MQL1 is approximate and is known to give parameter estimates that are biased towards zero (especially when you have small clusters, which you do).
MCMC does not suffer from the same problem.
So we expect to see the MCMC estimates further away from zero than the MQL1 estimates.
Your level-3 variance is much bigger when fitted by MCMC than by MQL1, as expected.
It would also be worth fitting the model by PQL2 (I would expect those estimates to lie in between the MQL1 and MCMC ones).
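
In case it is useful, a sketch of the PQL2 version of your empty model (the pql2 suboption within discrete() is as I recall it from the runmlwin documentation; do check help runmlwin for the exact syntax):

Code:

// Second-order PQL fit of the same empty model
// (pql2 suboption name assumed from the runmlwin help - please verify)
runmlwin C cons, ///
    level3(id: cons) ///
    level2(wave: cons) ///
    level1(wave:) discrete(distribution(poisson) pql2) nopause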

Yes, the fixed-part parameter estimates are on the log scale. Exponentiating them gives IRRs. You can use the irr option in runmlwin to report these.
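
For example, taking the MCMC intercept you reported (this is just arithmetic on that estimate):

Code:

// Expected count on the original scale for the intercept,
// conditional on the random effects being zero
display exp(-.108)    // approximately 0.898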

Also, it might be worth simulating a dataset of the same dimensions with known parameter values and then fitting the model to it by each method.
That way you can get a better sense of the likely biases in your own data.
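
A minimal sketch of what I mean, with made-up parameter values and an assumed four waves per person (none of these numbers come from your data, so substitute your own):

Code:

// Simulate ~24,700 persons with a person-level random intercept and an
// observation-level overdispersion effect; all values below are assumptions
clear
set seed 20121003
set obs 24700
generate id = _n
generate u = rnormal(0, 1)                // person-level random intercept (SD 1 assumed)
expand 4                                  // four waves per person (assumption)
bysort id: generate wave = _n
generate e = rnormal(0, 0.7)              // overdispersion effect (SD 0.7 assumed)
generate C = rpoisson(exp(0.4 + u + e))   // assumed fixed intercept of 0.4

You can then run the same runmlwin commands as above on this simulated dataset and compare the MQL1, PQL2, and MCMC estimates with the values used to generate it.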

Best wishes

George
BramVanhoutte
Posts: 3
Joined: Wed Sep 26, 2012 11:59 am

Re: question on MCMC in ML overdispersed poisson

Post by BramVanhoutte »

Great, thanks!
The fact that the overdispersion diminished so much made me a bit worried, but thanks for the confirmation.
Bram