
Step 4 of 2LevelImpute

Posted: Thu Jul 31, 2014 6:53 pm
by shakespeare
I'm trying to translate what I did in Realcom to Stat-JR using the 2LevelImpute template. My model is a two-level model with all the variables at level one and location as the clustering variable. The outcome is binary and there are about 10-15 predictors in the model of interest (MOI), depending on how I set things up. In the imputation model I want to treat everything as a response, so for each variable I'll have something like y = B + u + e.
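
To be concrete (my own notation, not the template's), the MOI is roughly a random-intercept logistic regression and the imputation model treats every variable, outcome and predictors alike, as a response with its own level-two residual:

$$\text{MOI:}\qquad \operatorname{logit}\Pr(y_{ij}=1) = \mathbf{x}_{ij}'\boldsymbol{\beta} + u_j, \qquad u_j \sim N(0,\sigma^2_u)$$

$$\text{Imputation model, each variable } k \text{ as a response:}\qquad y^{(k)}_{ij} = \beta^{(k)}_0 + u^{(k)}_j + e^{(k)}_{ij}$$

where $i$ indexes individuals and $j$ indexes locations.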

In Realcom I used a burn-in of 2000 iterations and 1000 iterations between imputations. That burn-in was probably excessive, but I wasn't sure how much was enough, so I erred on the side of caution based on my reading of the literature. In Stat-JR there does not seem to be a burn-in for the imputation model, but there is a burn-in and a number of iterations for the MOI. How is the burn-in for the imputation model chosen? And what recommendations can be made about the burn-in and iterations for the MOI in my case?

Re: Step 4 of 2LevelImpute

Posted: Fri Aug 01, 2014 12:52 pm
by ChrisCharlton
The burn-in for the imputation model is hardcoded to 1000 (see the line

Code:

estinputs['burnin'] = '1000'
in 2LevelImpute.py), although this comes after an adaptation phase of 5000 iterations. To choose suitable values for the burn-in and number of iterations for the MOI you will need to look at the model diagnostics after the model has been run. As the model-running process may take a while, it is probably better to err on the side of caution to reduce the likelihood that you will need to run it again.
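
As a rough illustration of the kind of diagnostic check I mean (this is not part of Stat-JR, just a generic sketch in Python; the chain below is a placeholder for a column of MCMC draws for one MOI parameter), you could look at the effective sample size implied by the chain's autocorrelations and lengthen the burn-in/iterations if it is small:

Code:

import numpy as np

def effective_sample_size(chain):
    # Crude ESS estimate from the initial positive autocorrelations;
    # purely illustrative, not Stat-JR's own diagnostic.
    chain = np.asarray(chain, dtype=float)
    n = len(chain)
    x = chain - chain.mean()
    acf = np.correlate(x, x, mode="full")[n - 1:] / (x.var() * n)
    rho_sum = 0.0
    for lag in range(1, n):
        if acf[lag] <= 0:  # stop at the first non-positive autocorrelation
            break
        rho_sum += acf[lag]
    return n / (1.0 + 2.0 * rho_sum)

# Placeholder: substitute a real parameter chain exported from the MOI run.
chain = np.random.normal(size=5000)
print("Effective sample size:", round(effective_sample_size(chain)))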

Re: Step 4 of 2LevelImpute

Posted: Fri Aug 01, 2014 1:28 pm
by shakespeare
OK, that makes sense. A couple of other questions: since I'm primarily a SAS and Stata programmer, I'm used to generating command files that I can save and run again at a later date. Is it possible to do the same here by saving the input string?

I see in the 2LevelImpute example that there is a procedure to recover the imputed data files. It's not clear whether the procedure copies the files or moves them (I haven't tried it yet). If the originals are left behind, it would be nice to know where those files are physically saved, since after a few models have been run I might want to clean up my disk and delete unwanted files.

Re: Step 4 of 2LevelImpute

Posted: Fri Aug 01, 2014 1:54 pm
by ChrisCharlton
Yes, if you copy the input string from the bottom of the page then, as long as you have the same template and data loaded, you can paste it into the input box at a later date to fill in the answers to the questions automatically.
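
For example (the string below is made up purely for illustration; a real one will contain the template's actual question names and your answers), you could keep the copied string in a small text file alongside your other command files so that it can be pasted back in later:

Code:

# Hypothetical example of storing a copied Stat-JR input string for reuse.
input_string = "{'burnin': '2000', 'iterations': '1000'}"  # illustrative only
with open("2LevelImpute_inputs.txt", "w") as f:
    f.write(input_string)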

The imputed data files are currently only stored in memory and will disappear when you close the application. You can download them by either clicking the download button after a model run, or by selecting them using Choose in the Dataset menu and then using Download, also in the Dataset menu. If you want to clear them from memory the Dataset menu has a Drop option where you can choose which data to discard.
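
Stat-JR's datasets are usually held in Stata (.dta) format, so assuming the download gives you .dta files, the imputed datasets can then be read into Stata directly or, for example, into Python with pandas (the file name below is just a placeholder, not the actual name Stat-JR will use):

Code:

import pandas as pd

# Hypothetical file name standing in for whatever the Download option saved.
imputed = pd.read_stata("imputed_dataset_1.dta")
print(imputed.head())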

Re: Step 4 of 2LevelImpute

Posted: Fri Aug 01, 2014 2:34 pm
by shakespeare
Understood. Appreciate your help.