Hello,
I'm running cross-classified models using the 'runmlwin' command. My outcome variable is continuous and the predictor variable of interest is binary (IMD_25 coded 0/1).
All was going well until I noticed I'd made a silly error in my Stata code to dichotomise the predictor variable (NB there's good logic for dichotomising this, and I also use the full continuous scale in a sensitivity analysis)... I had accidentally not retained the system missing values (N = 3,324) as missing; they'd become coded as '0' instead. I corrected this and re-ran my cross-classified models expecting the N to change as well as the model results, but everything stayed EXACTLY the same as before.
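For context, the sort of recode that keeps system missing as missing would look something like this (the continuous score name 'imd_score' and the cut-point of 25 are placeholders here, not my exact code):

* Placeholder names: imd_score = continuous IMD measure, 25 = illustrative cut-point
generate IMD_25 = .
replace IMD_25 = 0 if imd_score < 25                        // missing imd_score stays missing (missing sorts above 25)
replace IMD_25 = 1 if imd_score >= 25 & !missing(imd_score) // only recode non-missing scores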
I've checked multiple times and I have definitely corrected the mistake - the predictor variable has been updated to include the missing values - but I don't understand why the models don't change. Surely they should, as Stata treats 'zero' and 'system missing' differently, so I assume 'runmlwin' isn't doing anything odd?
Any thoughts on what might be going on here would be much appreciated, as I'm struggling to understand it.
Thank you very much
---------
For info:
*Ns for predictor variable 'IMD_25'
Coded 0: N = 11,584
Coded 1: N = 4,323
System missing: N = 3,324
The code for my cross-classified model looks like this, in case helpful to see:
*Unadjusted model
generate cons = 1
matrix b_1 = (0, 0, .33, .33, .33, .33, .33, .33, .33) // start values for unadjusted model
xi: runmlwin GEBTOT_C cons i.IMD_25, ///
    level7(neighbourhood_w1: cons) level6(neighbourhood_w2: cons) ///
    level5(neighbourhood_w3: cons) level4(neighbourhood_w4: cons) ///
    level3(neighbourhood_w5: cons) level2(neighbourhood_w6: cons) ///
    level1(ID: cons) mcmc(cc) initsb(b_1) nopause
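(For reference, a generic way to count how many rows have complete data on all of the model variables - just a sketch using the variable names above:)

* Complete-case count across the outcome, predictor and classification IDs
count if !missing(GEBTOT_C, IMD_25, neighbourhood_w1, neighbourhood_w2, ///
    neighbourhood_w3, neighbourhood_w4, neighbourhood_w5, neighbourhood_w6, ID)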
Re: system missing in cross-classified models
Which version of MLwiN were you using to run these models?
MLwiN should treat zero as a valid ID (unless you are running a multiple membership model) so I am not sure of the cause for what you are seeing. One thing that you can do is to remove the nopause option from your runmlwin command and then look in either the Model>Equations or the Model>Hierarchy Viewer windows to check that it is displaying the expected number of units for each classification. You could also try dropping the rows containing missing from within Stata and comparing the results to the model that includes them.
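Something along these lines would do that comparison (just a sketch, reusing the command and the b_1 start values from your post):

preserve
drop if missing(IMD_25)   // removes the 3,324 system-missing rows before fitting
xi: runmlwin GEBTOT_C cons i.IMD_25, ///
    level7(neighbourhood_w1: cons) level6(neighbourhood_w2: cons) ///
    level5(neighbourhood_w3: cons) level4(neighbourhood_w4: cons) ///
    level3(neighbourhood_w5: cons) level2(neighbourhood_w6: cons) ///
    level1(ID: cons) mcmc(cc) initsb(b_1) nopause
restore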
Re: system missing in cross-classified models
Since you mentioned using the full continuous scale of the predictor variable in a sensitivity analysis, consider running the model using the continuous version of 'IMD_25' to see if the results differ.
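For example, keeping the rest of your command the same and assuming the continuous score is called 'imd_score' (the name is a placeholder), just a sketch:

runmlwin GEBTOT_C cons imd_score, ///
    level7(neighbourhood_w1: cons) level6(neighbourhood_w2: cons) ///
    level5(neighbourhood_w3: cons) level4(neighbourhood_w4: cons) ///
    level3(neighbourhood_w5: cons) level2(neighbourhood_w6: cons) ///
    level1(ID: cons) mcmc(cc) initsb(b_1) nopause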
Re: system missing in cross-classified models
Hi Rachel,
It is possible that the model results have not changed because the number of observations used in the estimation has not changed. If you updated IMD_25 to retain the missing values but the estimation sample stayed the same, the model will not reflect the change. Double check that the observations previously coded '0' really have been replaced with missing values, and make sure the data are clean and nothing else is going wrong during the model run.
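For example, a couple of generic Stata checks (using the counts from your post):

tabulate IMD_25, missing   // expect 11,584 zeros, 4,323 ones, 3,324 missing
count if missing(IMD_25)   // expect 3,324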
Hope this helps!