Bryan Hamilton wrote: I'm interested in estimating time-dependent recruitment (f) with Pradel models. I used this model:
Phi(t), f(t), p(loc+sex). Then I ran that model through the variance components process to get the shrinkage estimates by year. The annual estimates of f are quite different between the two methods.
- Code:
Year    f(t) ML estimate (SE)    Shrinkage estimate (SE)
----    ---------------------    -----------------------
2005        0.74 (0.28)              0.23 (0.15)
2006        0.00 (0.00)              0.23 (0.00)
2007        0.26 (0.11)              0.26 (0.09)
2008        0.02 (0.08)              0.69 (0.09)
2009        0.22 (0.13)              0.06 (0.12)
2010        0.12 (0.09)              0.24 (0.08)
2011        0.24 (0.10)              0.05 (0.09)
2012        0.06 (0.07)              0.21 (0.06)
2013        0.23 (0.11)              0.13 (0.09)
2014        0.23 (0.13)              0.23 (0.10)
2015        0.24 (0.14)              0.10 (0.11)
What in the world is going on? There is little relationship between the model estimates and the variance components estimates; 2006 and 2008 are particularly different. The first-year estimate of f from the model (0.74) seems way too high, and the second-year estimate (7E-26, shown rounded to 0 in the table) far too low. Could these be pushing the variance components estimates up and down?
I might be making this too complicated. Is there any reason to use variance components with a time-only model?
The first and (often) the last estimates from time-dependent Pradel models are usually unreliable for various reasons -- the general recommendation is to ignore at least the first estimate (this point is made in a couple of places in Chapter 13). As such, if you include the first, or the first and last, estimates in the random effects approach, this can cause problems for all of the other estimates. To be clean about things, the approach I usually follow is to use only the second through the last estimate, or (if I'm feeling extra cautious) the second through the penultimate.
If you do this, for example, using the moth data (as presented in Appendix D), you'll see (below) that the estimates are pretty comparable (with the overall RMSE of the shrunk estimates being somewhat smaller than that of the ML estimates, as expected):
- Code:
Par. Num S-hat S-tilde
-------- ---------- ----------
19 0.177190 0.197301
20 0.587425 0.582724
21 0.027951 0.059953
22 2.563176 1.934976
23 0.379700 0.407058
24 0.546357 0.544375
25 0.220692 0.236796
26 1.142249 1.108735
27 0.000000 0.000000
28 2.607649 1.999485
29 0.387275 0.437767
30 0.235322 0.237598
31 0.329767 0.332896
32 0.231284 0.234873
33 0.000000 0.000000
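For what it's worth, the shrinkage ("S-tilde") step amounts to pulling each ML estimate toward the overall mean, with the pull weighted by how noisy that estimate is. Here is a minimal method-of-moments sketch of the idea in Python -- the numbers are made up for illustration, not MARK output, and this is the textbook moment-based shrinkage formula, not a claim about MARK's exact internals:

```python
import numpy as np

# Hypothetical ML estimates (f_hat) and their SEs, with the first and
# last occasions already dropped, per the advice above.
f_hat = np.array([0.30, 0.10, 0.55, 0.20, 0.40])
se    = np.array([0.08, 0.10, 0.12, 0.07, 0.09])

# Method-of-moments estimate of the process (between-occasion) variance:
# observed variance of the estimates minus the average sampling variance,
# floored at zero.
mu_hat = f_hat.mean()
sigma2_proc = max(f_hat.var(ddof=1) - np.mean(se**2), 0.0)

# Shrinkage weights: noisy estimates (large SE) get pulled hardest
# toward the overall mean.
w = sigma2_proc / (sigma2_proc + se**2)
f_tilde = mu_hat + w * (f_hat - mu_hat)
```

Since every shrunk estimate is anchored to mu_hat, one wildly inflated first-occasion estimate drags the mean -- and hence every shrunk value -- around, which is the mechanism behind the advice to drop it.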
On a somewhat related note, with only 11-12 estimates to play with, fitting a random effects model probably isn't going to work very well anyway. I've generally found that an MCMC approach to the problem (Appendix E) tends to be a bit more 'stable' for smaller numbers of occasions, so it is perhaps worth a try (for 15 or more occasions, the VC and MCMC approaches generally agree very closely -- often to several decimal places).
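To give a feel for the MCMC route, the random-effects idea can be sketched as a toy normal-normal hierarchical model fit with a short Gibbs sampler. This is purely illustrative -- made-up numbers, and a deliberately simplified model rather than MARK's actual MCMC implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Made-up ML estimates (y) and SEs (s) -- stand-ins for the
# time-dependent estimates fed into the random-effects step.
y = np.array([0.30, 0.10, 0.55, 0.20, 0.40])
s = np.array([0.08, 0.10, 0.12, 0.07, 0.09])
s2, k = s**2, len(y)

# Hierarchical model: theta_i ~ N(mu, tau2), y_i ~ N(theta_i, s2_i),
# with flat priors on mu and tau2. Gibbs sample (theta, mu, tau2).
n_iter, burn = 6000, 1000
mu, tau2 = y.mean(), y.var(ddof=1)
draws = np.empty((n_iter - burn, k))

for it in range(n_iter):
    # theta_i | mu, tau2, y_i : conjugate normal-normal update
    v = 1.0 / (1.0 / tau2 + 1.0 / s2)
    theta = rng.normal(v * (mu / tau2 + y / s2), np.sqrt(v))
    # mu | theta, tau2 : normal around the mean of the thetas
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / k))
    # tau2 | theta, mu : inverse-gamma(k/2 - 1, SS/2) under a flat prior
    ss = np.sum((theta - mu) ** 2)
    tau2 = 0.5 * ss / rng.gamma(0.5 * k - 1.0)
    if it >= burn:
        draws[it - burn] = theta

theta_post = draws.mean(axis=0)  # posterior means = 'shrunk' estimates
```

The posterior means play the same role as the shrinkage estimates, but with so few occasions the advantage of MCMC is that uncertainty in the process variance is integrated over rather than plugged in.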