by horchard » Tue Mar 11, 2025 4:20 pm
Hi Murray,
Thanks for your prompt reply! I'll try setting those starting values.
My mask has a 30 km buffer around the study area, as the study species is woodland caribou and many captures were at the edge of the area; some individuals moved 20 km in a day. I imagine this mask size is slowing things down, as you mentioned, but I also don't want it to be too small. Would increasing the mask spacing help runtime as well?
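For the spacing question, here is the kind of comparison I was planning to run (just a sketch using the objects from my script below; fitting time scales roughly with the number of mask points, so doubling the spacing should cut the point count about fourfold):

```r
# Compare mask resolutions: nrow() on a mask gives the number of points,
# which is what drives the cost of each likelihood evaluation.
mask500  <- make.mask(traps(caps), buffer = 30000, type = "polygon",
                      spacing = 500,  poly = maskarea, poly.habitat = TRUE)
mask1000 <- make.mask(traps(caps), buffer = 30000, type = "polygon",
                      spacing = 1000, poly = maskarea, poly.habitat = TRUE)
nrow(mask500)   # points at 500-m spacing
nrow(mask1000)  # roughly a quarter as many at 1000-m spacing
```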
Some further information: there are 245 individuals sampled, with a total of 35 spatial recaptures, over 3 sampling occasions. The study area is 21,300 km^2. Flight transects were flown every 3 km, so detectors are spaced on a 3 x 3 km grid, for a total of 2379 detectors. Many detectors had no captures.
If you have any other suggestions for how I could speed up runtime, I would really appreciate it!
Here is what my script looks like:
library(secr)
library(sf)

maskarea <- st_read(dsn = "/home/hsorchar/scratch/Churchill/", layer = "ChurchillRange_30kmMask")

traps <- read.traps(file = "/home/hsorchar/scratch/Churchill/detectors_final.txt",
                    detector = "proximity", header = TRUE, sep = "\t")

caps <- read.capthist(captfile = "/home/hsorchar/scratch/Churchill/detection_history.txt",
                      trapfile = "/home/hsorchar/scratch/Churchill/detectors_final.txt",
                      fmt = "trapID", detector = "proximity", sep = "\t", header = TRUE)

mask <- make.mask(traps(caps), buffer = 30000, type = "polygon",
                  spacing = 500, poly = maskarea, poly.habitat = TRUE)

model_nocovariates.1 <- secr.fit(capthist = caps,  # pass the object itself, not the string 'caps'
                                 model = list(D ~ 1, sigma ~ K, lambda0 ~ 1),  # HEX is a hazard detectfn, so lambda0 rather than g0
                                 detectfn = 'HEX',
                                 mask = mask,  # buffer is ignored when a mask is supplied
                                 details = list(fastproximity = FALSE),
                                 method = 'BFGS', CL = FALSE,
                                 trace = TRUE, verify = FALSE, ncores = 4)
Thank you again!