when RMark and MARK results (seem to) differ | read first



Postby egc » Mon Feb 27, 2012 4:01 pm

The following is quoted from a recent posting by Jeff Laake, creator and maintainer of the RMark package. Please read and consider it before posting anything that relates to 'differences in results between MARK and RMark':

If you get a difference between models constructed with RMark and with the MARK interface, the reason is one of the following:

1) data are different
2) design matrices are different
3) links are different

Even though it has been said many times, I'll repeat it: RMark is just a different interface and uses the same numerical software (mark.exe) to fit the models. If there are differences, check through the three reasons above; it will be one of them. A difference in links is quite common when running pre-defined models with the MARK interface and comparing to the same models in RMark. In particular, parameters like pent and Psi have a default of an mlogit link in RMark but will get a sin link with the dot model in the pre-defined models in the MARK interface. A dot model can be used with those parameters, but the PIMs must be specified so they differ across the set of parameters that sum to 1.
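The mlogit constraint can be illustrated without MARK at all. The sketch below is plain R with a hypothetical `mlogit` helper (not an RMark function): it shows how an mlogit link maps unconstrained betas to a set of probabilities that sum to 1, which is why parameters like pent and Psi need it, whereas a sin link transforms each parameter independently.

```r
# Hypothetical helper illustrating the mlogit transformation: the last
# cell acts as the reference category with its beta fixed at 0.
mlogit = function(beta) {
  e = exp(c(beta, 0))
  e / sum(e)
}

pent = mlogit(c(0.5, -1.2))
pent       # three probabilities on the unit interval
sum(pent)  # sums to 1 by construction
```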


You can avoid reason #1 above by using export.MARK within R and then using the RMark import in the MARK interface. This ensures that the data (including time intervals) and its structure are the same in the MARK interface. You should not get a difference between the RMark run and re-running in MARK once you export into the MARK interface. If you do, that means there is a bug in the RMark interface, like the one discovered recently.

-- jeff

When RMark and MARK differ - revisited

Postby jlaake » Sun May 26, 2013 11:10 am

One of the most frequent questions I get is "why do the results from MARK and RMark differ?" Initially it was so frequent that Evan created the locked posting "when RMark and MARK results (seem to) differ | read first".

The content of that posting is correct, but I want to revisit and elaborate some. First of all, I'll never discourage anyone from comparing the results of MARK and RMark; just remember that RMark doesn't fit models, MARK does, and if there are differences it is for one of the reasons listed in that posting. In fact, I encourage you to compare the results, because you'll become more comfortable with RMark and it helps me find any bugs that may still be lurking in RMark. My recent exchange with Kristin helped me discover a bug in model.average.list in which the non-revised model-average variance formula was being used if the full v-c matrix was not specified. She was using it to model average derived parameters and noted the difference.

The other differences she noted were quite small (1e-5 to 1e-8) and resulted from differences in the design matrix formulation. Although in theory it shouldn't make a difference, you may get slight differences when the design matrices differ, because the model-fitting algorithm is numerical. This has been noted several times on the list, and it primarily occurs with small data sets.

One of the primary DM differences involves factor variables like time. RMark uses the R contrast setting, which by default is the treatment contrast, using the first factor level as the intercept; in MARK, Gary uses the SAS convention of the last factor level being the intercept. In the example below, I show how you can modify the level that is used as the intercept with the relevel function on the design data. Also, in the example I use I(1-c) instead of c in the formula to match the DM Kristin created in MARK.

BOTTOM LINE: the same model can be specified with many different DMs, and these can result in very small differences in the values, but if you construct the same DM with a formula that you did in the MARK interface, you WILL get the same answer, because it is the same code fitting the model. The best way to assure yourself of this is to use export.MARK to export the models you fitted into MARK and then re-run them in the MARK interface.
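The intercept-level difference can be seen in plain R with no MARK involved. This is just an illustrative sketch using base R's model.matrix and relevel; the factor and its levels are made up:

```r
# A time factor with three levels, as in a three-occasion design
time = factor(c(1, 2, 3, 1, 2, 3))

# Default R treatment contrast: the FIRST level is the intercept
dm.first = model.matrix(~ time)
colnames(dm.first)   # "(Intercept)" "time2" "time3"

# Relevel so the LAST level is the reference, matching MARK's SAS-style DM
time.last = relevel(time, ref = "3")
dm.last = model.matrix(~ time.last)
colnames(dm.last)    # "(Intercept)" "time.last1" "time.last2"

# Both matrices span the same column space, so the fitted model is the
# same; only the parameterization (and tiny numerical details) differs.
qr(dm.first)$rank == qr(dm.last)$rank   # TRUE
```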

The deviance value put in the marklist model table is the residual deviance. If you are comparing models with and without individual covariates, the deviance values will be quite different, because the null deviance with covariates is set to 0 by MARK. You can use a switch in model.table to use -2lnL instead of deviance.
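A toy calculation (all numbers made up for illustration) shows why the deviance gap is misleading across models with and without individual covariates, while -2lnL values remain comparable. Deviance is -2lnL of the model minus -2lnL of the null/saturated reference, and MARK sets that reference to 0 when individual covariates are present:

```r
# Made-up -2lnL values for two models fit to the same data
neg2lnl = c(no.cov = 250.3, with.cov = 248.7)

# -2lnL of the null/saturated reference; MARK sets it to 0 when
# individual covariates are present
null.neg2lnl = c(no.cov = 240.0, with.cov = 0.0)

dev = neg2lnl - null.neg2lnl
dev            # 10.3 vs 248.7 -- a misleading gap between the models
diff(neg2lnl)  # -1.6 -- the meaningful comparison
```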


Code:
  # Convert the INP file
  sq = convert.inp("AG2011AUG.inp", use.comments = TRUE)
  # Step one: process.data (model type assumed here; use whichever
  # closed-captures type matches your analysis, e.g. "Closed", or
  # "FullHet" for the mixture models)
  sq.proc = process.data(sq, model = "FullHet")
  # Step two: make.design.data
  sq.ddl = make.design.data(sq.proc)
  # Change the level used as the intercept for the time and mixture
  # factor variables (the level shown is illustrative)
  sq.ddl$p$time = relevel(sq.ddl$p$time, "5")
  # The closed-captures model set, in Otis et al. notation:
  # p(~1)c()             or M0
  # p(t)c()              or Mt
  # p(~1)c(~1)           or Mb
  # p(~1)c(), mixture    or Mh
  # p(t)c(t)             or Mtb
  # p(~1)c(~1), mixture  or Mbh
  # p(t)c(), mixture     or Mth
  # p(t)c(t), mixture    or Mtbh
  # For example, M0 with p and c shared:
  m0 = mark(sq.proc, sq.ddl,
            model.parameters = list(p = list(formula = ~1, share = TRUE)))
  # Check the design matrix to see if it is the same design as in the
  # MARK interface
  m0$design.matrix
  ### SEE DERIVED N-hat RESULTS ###
  m0$results$derived
  # If you want to see more digits, use print with digits=; I think
  # Gary's output is truncating and R is rounding
  print(m0$results$derived, digits = 10)
