Interpretation of 'occasion' with sustained data collection

questions concerning analysis/theory using program DENSITY and R package secr. Focus on spatially-explicit analysis.

Interpretation of 'occasion' with sustained data collection

Postby tyroneSawyer » Wed Oct 11, 2023 1:16 am

Hi there! I'm working on a project using secr in R to model the results of a cat population intervention on a South Australian island using camera trap data. The central difficulty I'm grappling with is how to delineate 'occasions'. The data were collected continuously over many weeks, and each detection carries a date and time. Given that, I'm unsure how to reconcile this with the classical discrete-occasion model built into secr.

Basically, I see three options:

- throw out occasions entirely, treating the whole collection period as a single occasion (I /may/ have seen something similar suggested for audio sampling, but I didn't look into it, and it didn't seem designed for long periods). Obviously information is lost here, and it seems deeply inelegant, so it's not my preference

- artificially break the data into daily occasions (or, probably more accurately, splitting at midday given that cats are nocturnal): seems good for many purposes, if a little approximate?

- break the data into very fine-grained occasions, on the order of minutes, so that no two detections fall on the same occasion. I don't know enough about the internals of the models to judge how statistically sound this is, but it intuitively makes sense to me, with the parameters simply scaled very differently with time. I could also see this massively zero-inflating the model and being quite improper, as well as very slow.
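For what it's worth, here is a minimal base-R sketch of the midday-split idea in the second option. The timestamps and variable names are purely illustrative, and this is only the timestamp-to-occasion step, not a full secr workflow:

```r
# Hypothetical detection timestamps (POSIXct); values are illustrative only
ts <- as.POSIXct(c("2023-06-01 23:40:00", "2023-06-02 02:15:00",
                   "2023-06-02 13:05:00", "2023-06-03 01:30:00"),
                 tz = "UTC")

# Shift each timestamp back 12 hours so a "day" runs midday-to-midday,
# then number the resulting dates consecutively to get occasion indices.
# Detections either side of midnight then share one nocturnal occasion.
shifted <- ts - 12 * 3600
occasion <- as.integer(as.Date(shifted) - min(as.Date(shifted))) + 1L
occasion
```

Here the first two detections (late on 1 June and early on 2 June) land in the same occasion, while the midday detection starts a new one.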

Any help that could be offered would be greatly appreciated! I won't lie and say that I haven't noticed the author of all the documentation I've been reading answering every thread I've perused on here...

Re: Interpretation of 'occasion' with sustained data collection

Postby murray.efford » Wed Oct 11, 2023 6:55 pm

I think this is less of a problem than you expect. Whether to use a single occasion or many matters only if you need to model temporal effects, including learned responses. Otherwise the number of observations of animal x at location y can be treated as a Poisson or binomial count. The default in secr is to collapse certain data types (proximity, count) to a single occasion, unless the model has a temporal component as mentioned. This gives (almost) identical estimates of density, and no useful information is lost. I think you'll find this confirmed in the literature on continuous-time modelling.

So I would start with e.g. daily occasions, and cheerfully collapse to a single occasion for analysis (or rely on the 'details = list(fastproximity = TRUE)' default). The estimates of g0/lambda0 relate to the original duration (1 day) because the number of collapsed occasions is 'remembered' in the usage attribute of the collapsed dataset.
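To illustrate, a minimal sketch of that default in action (assuming 'ch' is your capthist object with daily occasions and 'msk' a habitat mask; the hazard half-normal detection function is an arbitrary choice for the example):

```r
library(secr)

# 'ch' = capthist with daily occasions; 'msk' = habitat mask (both assumed
# to exist already). With fastproximity = TRUE (the default) secr.fit
# collapses the occasions internally; the original number of occasions is
# retained in the usage attribute, so lambda0 still refers to a single day.
fit <- secr.fit(ch, mask = msk, detectfn = "HHN",
                details = list(fastproximity = TRUE))
```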

Continuous-time data do confront you with the need to weed out repeat detections separated by only a few seconds or minutes, as it's unrealistic to model these as independent events. It's up to you where to draw the line.
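One way to draw that line in code: a self-contained base-R sketch that drops any detection falling within a chosen gap (here 10 minutes) of the previously retained detection of the same animal at the same detector. All names and data are illustrative:

```r
# Hypothetical detections; column names and values are illustrative only
det <- data.frame(
  animal   = c("A", "A", "A", "B"),
  detector = c("T1", "T1", "T1", "T1"),
  time     = as.POSIXct(c("2023-06-01 20:00:00", "2023-06-01 20:03:00",
                          "2023-06-01 21:00:00", "2023-06-01 20:01:00"),
                        tz = "UTC")
)

# Keep a record only if more than 'gap' seconds have passed since the
# previously retained record for the same animal at the same detector
thin <- function(d, gap = 600) {
  d <- d[order(d$animal, d$detector, d$time), ]
  keep <- rep(TRUE, nrow(d))
  last <- list()  # last retained time, keyed by animal-detector pair
  for (i in seq_len(nrow(d))) {
    key <- paste(d$animal[i], d$detector[i])
    t <- as.numeric(d$time[i])
    if (!is.null(last[[key]]) && t - last[[key]] <= gap) {
      keep[i] <- FALSE           # repeat within the gap: discard
    } else {
      last[[key]] <- t           # genuine new event: retain and reset clock
    }
  }
  d[keep, ]
}

thinned <- thin(det)
```

With these data the 20:03 record for animal A is dropped (3 minutes after the retained 20:00 record), while the 21:00 record and animal B's detection survive.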

Other contributions welcome!
