Polygon Search Detectors

Questions concerning analysis/theory using program DENSITY and R package secr. Focus on spatially explicit analysis.

Polygon Search Detectors

Postby dgreen » Thu Feb 16, 2012 9:39 am

I'm having some trouble getting my data into the correct format for a polygon detector in SECR (in R). I'm hoping to use this package as a way to estimate current densities of lions and cheetahs in a protected area in East Africa. We regularly conduct area searches throughout regions of the reserve and identify these animals based on individual markings. From this data collection I have (up to) twice-daily observations of animals seen in these search areas. Additionally, there are times when we go out but do not see any of our "marked" animals.

My question is two-fold, but both parts relate to the data formatting for polygon search detectors. I have the polygon file working well, but cannot seem to get my capture file into the correct format. The correct columns should be "Session", "ID", "Occasion", "X", and "Y". However, I'm not exactly sure what the difference between Session and Occasion is in this case. If I have 30 different times at which we searched the polygon, does that mean I'd have 30 sessions or 30 occasions? Similarly, is effort taken into account when a polygon was searched but no animals were sighted?

Thank you very much for your help. I really appreciate it!
dgreen
 
Posts: 2
Joined: Thu Feb 16, 2012 9:11 am

Re: Polygon Search Detectors

Postby murray.efford » Thu Feb 16, 2012 4:30 pm

Sounds like an interesting dataset...

Daily searches are definitely 'occasions' not 'sessions'. 'Sessions' are used for independent blocks of data, maybe different years or areas. Try ?session for more.

Occasions are numbered consecutively whether or not animals were seen, and by default the number of occasions is taken as the maximum occasion number in the capture file. Of course it's quite possible that no animal was seen on the last occasion: in that case, make sure you specify the argument 'noccasions' in read.capthist. This will ensure the correct modelling of total effort.
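For example, with 30 searches (counting those on which nothing was seen) the call might look something like this; the file names here are just placeholders:
Code: Select all
lionCH <- read.capthist('captures.txt', 'searchpoly.txt', fmt = "XY",
    detector = "polygon", noccasions = 30)   # noccasions includes occasions with no detections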

It's not necessarily virtuous to use a large number of occasions: if models are really slow to fit then you may want to aggregate the daily search data into, say, weekly blocks using 'reduce' with the 'by' argument. See ?reduce.capthist.
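As a rough sketch, assuming a capthist object called lionCH as above:
Code: Select all
weeklyCH <- reduce(lionCH, by = 7)   # pool each block of 7 daily occasions into one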

Murray

P.S. Remember that R is case sensitive: the package name is 'secr' while 'SECR' refers to the general methodology.
murray.efford
 
Posts: 686
Joined: Mon Sep 29, 2008 7:11 pm
Location: Dunedin, New Zealand

Re: Polygon Search Detectors

Postby dgreen » Mon Feb 20, 2012 1:06 pm

Thanks for all of your help, I really appreciate it.

I believe I may be running into some additional problems in formatting my capture history because there are multiple occasions in which no animals are sighted. Here are the errors I get when I try to use the function "read.capthist":

Error in dimnames(w) <- list(1:nID, 1:nocc, 1:ndetector(traps)) :
length of 'dimnames' [1] not equal to array extent
In addition: Warning messages:
1: In make.capthist(capt, trps, fmt = fmt, noccasions = noccasions, :
detections with coordinates outside polygon(s) were dropped
2: In max(abs(captures[, 3])) :
no non-missing arguments to max; returning -Inf

I know warning message 1 arises because there are locations that do not match up with my polygon search area, but I can't seem to troubleshoot the other problems. Any help is greatly appreciated.

Thanks a lot!

David
dgreen
 
Posts: 2
Joined: Thu Feb 16, 2012 9:11 am

Re: Polygon Search Detectors

Postby murray.efford » Mon Feb 20, 2012 3:19 pm

Difficult to say what is going wrong. I would start by addressing the first warning: detections outside the searched polygons must reflect errors in the detection coordinates or in the polygon definition. If that doesn't fix it (I don't expect it will!), please post some example data or send it to me offline.
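As a quick check, one rough sketch (file names are placeholders) is to plot the searched polygons and overlay the raw capture coordinates:
Code: Select all
searchpolys <- read.traps('searchpoly.txt', detector = "polygon")
plot(searchpolys)
capt <- read.table('captures.txt')        # columns: Session, ID, Occasion, X, Y
points(capt[, 4], capt[, 5], pch = 16)    # capture locations should fall inside the polygons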
Murray
murray.efford
 
Posts: 686
Joined: Mon Sep 29, 2008 7:11 pm
Location: Dunedin, New Zealand

Re: Polygon Search Detectors

Postby dlm122 » Thu Nov 01, 2012 11:57 pm

I seem to be having a similar problem to David's, or at least I am receiving the same error message. I am trying to use polygon detectors, and my captures are collected chimpanzee feces. My search areas, or polygons, are quadrats of a larger search grid, and so share vertices with the adjacent quadrats of that grid.

My detector file also has a covariate column with a proxy for effort, and I'm using this in lieu of "usage" (it is an integer covariate, which I entered as "/545", for example). For my capture file, I added two columns, "x" and "y", to hold the capture locations (UTM coordinates).

So, my script is this:

ugalla <- read.capthist('captXY.txt', 'grid.txt', fmt = "XY", detector = "polygon", covnames = TRUE, noccasions = TRUE)

and my error message is this:

Error in dimnames(w) <- list(1:nID, 1:nocc, 1:ndetector(traps)) :
length of 'dimnames' [1] not equal to array extent
In addition: Warning messages:
1: In make.capthist(capt, trps, fmt = fmt, noccasions = noccasions, :
detections with coordinates outside polygon(s) were dropped
2: In max(abs(captures[, 3])) :
no non-missing arguments to max; returning -Inf

As per your advice to David, I checked that the captures were inside the polygons by mapping both the polygon vertices and the capture coordinates, and all were inside. I have spent hours trying to troubleshoot, and am out of ideas!

Thanks so much for your help!

Deborah
dlm122
 
Posts: 8
Joined: Wed Oct 31, 2012 12:08 pm

Re: Polygon Search Detectors

Postby murray.efford » Fri Nov 02, 2012 12:16 am

Deborah

You have provided the logical value TRUE for the argument 'covnames', which expects a character vector, and for 'noccasions', which should be a number (see Arguments in ?read.capthist). If you have more than one occasion I suspect the latter is the more critical problem, as it will force the number of occasions to the numeric equivalent of TRUE (1).
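For example, if you actually searched on six occasions the call would be along these lines (the '6' is just for illustration; 'covnames' is only needed for individual covariates in the capture file, so it can simply be dropped here):
Code: Select all
ugalla <- read.capthist('captXY.txt', 'grid.txt', fmt = "XY",
    detector = "polygon", noccasions = 6)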

If this fails to solve the problem then please post a clip of each data file. The messages from read.capthist are not specific, so your problem is likely to be quite different to David's.

Note that you don't 'add' x and y columns; you replace the detectorID column with them. An integer polygon covariate should be fine, although models will fit slowly if the covariate has many levels.
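In other words, for fmt = "XY" with a polygon detector the capture file has exactly five columns, e.g. (made-up values):
Code: Select all
# Session  ID  Occasion       X        Y
  ugalla  F01         1  228512  9391870
  ugalla  F01         3  228640  9392115
  ugalla  F02         2  225960  9393450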

Murray
murray.efford
 
Posts: 686
Joined: Mon Sep 29, 2008 7:11 pm
Location: Dunedin, New Zealand

Re: Polygon Search Detectors

Postby dlm122 » Sun Nov 04, 2012 11:30 am

Thank you! The problem was my capture data file, in which I had kept the "detector" column and added the "x" and "y" columns. Once I removed the "detector" column, everything worked beautifully. I also changed the other arguments.

Regarding the covariate in my trap layout (polygon) file: I have included a column called "effort", which is the distance travelled in each polygon. I wanted to "weight" my captures (collections of feces) by the effort I spent. Is this possible? I don't really understand what the model is doing with my covariate.

Also, I'm using the default buffer zone of 100 metres after obtaining the following results:

mask.check(secr0)
Computing log likelihoods...
            628.125    471.09375   314.0625    (mask spacing, m)
100       -1673.754  -1673.488   -1673.195
150       -1673.711  -1673.470   -1673.684
200       -1673.710  -1673.455   -1673.686
(row labels are buffer widths, m)

I'm not sure if this is OK, because the area around my searched polygons is suitable habitat for the chimpanzees.

Thanks so much for your help
dlm122
 
Posts: 8
Joined: Wed Oct 31, 2012 12:08 pm

Re: Polygon Search Detectors

Postby murray.efford » Sun Nov 04, 2012 4:11 pm

I'm glad you got over that hurdle!
It does make sense to consider using effort as a covariate in models for detection. This is a detector-level (polygon-level) covariate. If you provide a name for it in the 'trapcovnames' argument of read.capthist then that name will be available for use in model formulae, e.g.,
Code: Select all
read.capthist(..., trapcovnames = "effort")
secr.fit(..., model = g0 ~ effort)

Certainly the default buffer width is not appropriate for chimpanzees if you have surrounding habitat. You did well to use mask.check(), but those likelihoods are varying quite a lot (they need to be nearly identical). A more definitive check uses mask.check() with LLonly = FALSE, but that fits the model 9 times: you won't want to do that often! I guess you need a buffer of several thousand metres.
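As a sketch, refitting with a wider buffer and repeating the check might look like this (the 5000-m buffer is just a starting guess):
Code: Select all
secr5000 <- secr.fit(ugalla, buffer = 5000, trace = FALSE)
mask.check(secr5000)                    # quick likelihood comparison (LLonly = TRUE)
# mask.check(secr5000, LLonly = FALSE)  # refits the model for each mask - slow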
Murray
murray.efford
 
Posts: 686
Joined: Mon Sep 29, 2008 7:11 pm
Location: Dunedin, New Zealand

Re: Polygon Search Detectors

Postby dlm122 » Mon Nov 05, 2012 5:20 pm

Thanks so much for the response. And now the next hurdle...
I created my read.capthist object with this script:

ugalla <- read.capthist('captXY.txt', 'grid_search.txt', fmt = "XY", detector = "polygon", trapcovnames = "effort", noccasions = "6")

and my trapfile looks like this:

# polyID X Y effort
8 225000 9391000 /168
8 225000 9395000 /168
8 229000 9395000 /168
8 229000 9391000 /168
15 229000 9383000 /294


No problems, so I wrote this script to fit the model:

ugalladensity_search <-secr.fit(ugalla, model = g0 ~ error, buffer = 5000, trace = TRUE)

and I get this error message:

Checking data
Preparing detection design matrices
Error in secr.design.MS(capthist, model, timecov, sessioncov, groups, :
covariate(s) error not found


I tried taking the forward slash out of the "effort" column, but then I could not make a read.capthist object. I'm sorry for the ongoing issues!

Deborah
dlm122
 
Posts: 8
Joined: Wed Oct 31, 2012 12:08 pm

Re: Polygon Search Detectors

Postby murray.efford » Mon Nov 05, 2012 9:30 pm

Typo!
g0 ~ effort
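That is, with your earlier object names the fitting call becomes:
Code: Select all
ugalladensity_search <- secr.fit(ugalla, model = g0 ~ effort, buffer = 5000, trace = TRUE)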
murray.efford
 
Posts: 686
Joined: Mon Sep 29, 2008 7:11 pm
Location: Dunedin, New Zealand
