Memory limits/simulation

Questions concerning analysis/theory using program DENSITY and the R package secr. Focus on spatially explicit analysis.


Postby JDJC » Fri May 17, 2013 10:43 pm

Hi folks,

Two separate topics here. First, I've been successfully fitting null detection models and covariate-specific density models to a reasonably sized (but pooled!) data set (64 individuals, ~42 occasions, 315 trap sites). I'd now like to consider seasonal variation and landscape connectivity as detection covariates (trap covariates, specifically). However, I'm running into memory errors (R's "cannot allocate vector of size ...", for some ridiculous size), and was wondering how I might work around them. The end deliverable is an extrapolated population size based on habitat, so I don't want to fit the detector arrays completely independently if I can avoid it; I'm hoping this is an issue with mask size, or that I can compare multi-session vs. single-session models within an AIC framework, or at least obtain comparable region.N estimates. I suppose my question is mostly: which part of the input or data structure is the hang-up?
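
For concreteness, here's a rough sketch of the kind of workaround I'm imagining (object names like mytraps and ch, the habitat covariate, and the buffer/spacing values are all placeholders, not my real inputs). My understanding is that the big arrays scale roughly with mask points x detectors x occasions, so coarsening the mask and pooling occasions look like the obvious knobs:

library(secr)

## coarser mask: doubling the spacing cuts the number of mask
## points roughly 4-fold for the same buffer
mask.fine   <- make.mask(mytraps, buffer = 4000, spacing = 100)
mask.coarse <- make.mask(mytraps, buffer = 4000, spacing = 200)
nrow(mask.fine); nrow(mask.coarse)   # a mask is a data frame of points

## pooling occasions (e.g., days into weeks) shrinks the capture
## histories; 'by' groups consecutive old occasions into one new one
ch.wk <- reduce(ch, by = 7)

## refit with the smaller structures; 'habitat' stands in for a
## real mask covariate
fit0 <- secr.fit(ch.wk, mask = mask.coarse,
                 model = list(D ~ habitat, g0 ~ 1, sigma ~ 1))

(Pooling occasions would of course coarsen any occasion-level covariates, so mask spacing may be the safer lever in my case.)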

Secondly, I'm interested in looking at the effects of mark ambiguity and data loss within both traditional and secr frameworks. The former is easy enough to simulate, but I was wondering whether there is some way to simulate a "full" secr capture history and then randomly drop detections at a fixed rate, to reflect an inability to ID individuals, in a more convenient fashion than my limited R skills can manage...
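
To make that concrete, here's the sort of thing I'm picturing, as far as my R allows (everything here is invented: the grid, density, and detection parameters are placeholders, and I'm assuming a binary proximity capthist, i.e. a 3-D animals x occasions x detectors array):

library(secr)

## simulate a 'full' capture history on a toy grid
tr  <- make.grid(nx = 8, ny = 8, spacing = 500, detector = "proximity")
pop <- sim.popn(D = 2, core = tr, buffer = 2000)
ch  <- sim.capthist(tr, popn = pop, noccasions = 42,
                    detectpar = list(g0 = 0.1, sigma = 400))

## delete each detection independently with probability (1 - keep),
## mimicking a fixed rate of unidentifiable captures
thin.capthist <- function(ch, keep = 0.8) {
    det <- which(ch > 0)                      # linear indices of detections
    ch[det[runif(length(det)) > keep]] <- 0   # drop ~(1 - keep) of them
    subset(ch, apply(ch, 1, sum) > 0)         # discard animals never detected
}

ch.thin <- thin.capthist(ch, keep = 0.8)

Then I'd fit the same model to ch and ch.thin and compare the estimates. Is there a canned way to do this, or a reason it's wrong-headed?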

Thanks for any feedback,

John
JDJC
 
Posts: 10
Joined: Fri Sep 28, 2012 4:05 pm
