Polygon Search Detectors

Questions concerning analysis/theory using program DENSITY and the R package secr. Focus on spatially explicit analysis.

Re: Polygon Search Detectors

Postby dlm122 » Mon Nov 05, 2012 10:01 pm

:oops:

If you had any idea how many times I reread that script!!

Sorry - and thanks!

Re: Polygon Search Detectors

Postby dlm122 » Wed Nov 21, 2012 12:55 pm

OK - I ran the script, and one of the warning messages I received was that the variances could not be calculated. I tried method = "BFGS", which resulted in the same warning message, and I tried method = "Nelder-Mead", for which I received this warning message:

"could not invert Hessian to compute variance-covariance matrix"

I see another option is to include "details = list(hessian = "fdhess")". Do you recommend I try this? And do I still specify a method with this item included in the script?

Or do you think there is something inherently wrong with my covariate "effort"? It is simply an integer, which represents the metres traveled for each detector (polygon).
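
For reference, this is roughly the kind of call I mean (myCH stands in for my actual capthist object; the spelling of the hessian option is copied from the help page):

# look at the effort covariate attached to the polygon detectors
covariates(traps(myCH))

# refit with an alternative maximizer and a finite-difference Hessian;
# 'method' and 'details' are separate arguments, so both can be supplied
fit <- secr.fit(myCH, method = "Nelder-Mead",
                details = list(hessian = "fdhess"))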

Thank you very much for your help!

Re: Polygon Search Detectors

Postby murray.efford » Fri Nov 23, 2012 2:38 am

Hi
This is a fairly non-specific indication that there is some problem fitting the model. I can only guess what it might be - if you would like to send your data files to me offline I can have a look and may be able to suggest something.
Murray

Re: Polygon Search Detectors

Postby murray.efford » Sat Dec 01, 2012 5:03 pm

Rounding off this thread (hopefully!): we resolved the problem offline by
(i) removing a large number of unsearched polygons from the detector data,
(ii) collapsing a sequence of occasions with non-overlapping search areas to one occasion (function 'reduce'),
(iii) recoding to reduce the memory demands of the C code underlying secr.fit for polygon detectors (for the next release of secr, or contact MGE if you need it sooner), and
(iv) selecting options (e.g. conditional likelihood, fewer levels of the effort covariate) that reduce runtime.
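
In code, steps (i), (ii) and (iv) look roughly like this (object names and the occasion grouping are placeholders; see ?subset.capthist, ?reduce.capthist and ?secr.fit for the exact arguments):

## (i) keep only the polygons that were actually searched
CH2 <- subset(CH, traps = searchedpolys)          # 'searchedpolys' = names or indices of the searched detectors

## (ii) pool the occasions with non-overlapping search areas into one
CH3 <- reduce(CH2, newoccasions = list(1:4, 5))   # e.g. occasions 1-4 collapse to a single occasion

## (iv) maximize the conditional likelihood to cut runtime;
##      density is then obtained with derived(fit)
fit <- secr.fit(CH3, CL = TRUE)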
Murray

Re: Polygon Search Detectors

Postby dlm122 » Fri Dec 21, 2012 12:44 pm

The problem was indeed resolved! Thank you very much. I do have a rather general question. My data are from a large area (~625 sq km), which presumably contains more than one community of chimpanzees (they are unhabituated) - meaning more than one home range (and home range centre). Is the model assuming a home range centre for each polygon, or one for the entire area surveyed? And how would you expect this discrepancy to affect the estimate?

Thank you very much!!

Re: Polygon Search Detectors

Postby murray.efford » Fri Dec 21, 2012 3:02 pm

Each animal has a home range. In the model we assume these are located independently of each other, but violations of this assumption are not usually critical.
Murray

Re: Polygon Search Detectors

Postby RJDan » Sat Mar 08, 2014 5:07 pm

Hi

I have what I hope will be a quick question.
I've started using secr after working in RMark for a while, and I'm using the same data in both.
In MARK you would generally provide an interval between occasions, for example when the intervals between occasions are irregular.
I performed daily area searches in a single polygon over 23 days, but on rainy days I did not search, so there are four days with no 'usage'. I coded binary usage ('1' for days surveyed, '0' for days unsurveyed), but when I attempt to run secr.fit() I get an error that there are days without use.

#####
> aprilfit.constant<- secr.fit(aprilcapthist,timecov=apriltimecov, model=list(D~1,g0~1,sigma~1))
Checking data
Session April
Errors in traps
Occasions when no detectors 'used'
7 11 15 16 20
Error in secr.fit(aprilcapthist, timecov = apriltimecov, model = list(D ~ :
'verify' found errors in 'capthist' argument
######

I've worked around this by removing the unsurveyed days with the subset function.
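
For reference, the workaround was roughly this (dropping the occasions flagged by verify from the 23 days):

aprilcapthist2 <- subset(aprilcapthist, occasions = (1:23)[-c(7, 11, 15, 16, 20)])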

I'm assuming irregular intervals between sampling occasions are irrelevant here? Or have I made a mistake in the way I've set up my analysis in secr?

Re: Polygon Search Detectors

Postby murray.efford » Sun Mar 09, 2014 9:34 pm

1. Please start a new thread for a new topic
2. The interval between occasions is used to scale survival in MARK - that's irrelevant in 'secr'. The only consideration is whether the population was closed over the entire span of sampling, and I don't think that's a problem for you.
3. secr is kindly pointing out that you have included phoney search occasions - times when in fact you did no sampling. You can override the check by setting verify = FALSE in secr.fit (I do not promise the analysis will work in that case and it is better to code the data correctly).
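
That is, something like your original call with the check turned off:

aprilfit.constant <- secr.fit(aprilcapthist, timecov = apriltimecov,
    model = list(D ~ 1, g0 ~ 1, sigma ~ 1), verify = FALSE)

but dropping the unsampled days (e.g. with subset) and re-fitting is the better approach.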
Murray

Re: Polygon Search Detectors

Postby Luciano » Sun Jun 09, 2019 9:59 pm

I have similar formatting issues using transect detectors. I have looked for these errors in other threads and tried to apply the suggested solutions, unfortunately without success.

My transects consist of three vertices each (start, middle, end points), and the animals' captures have been added as extra vertices along the corresponding transect.
I assume there is some problem with the format of my files, even though they appear to be in the correct form.

When I use the function read.capthist, RStudio gives me this message:

> all.data <- read.capthist(captfile= "capfile190606.txt",
+ trapfile="trapfile190607.txt",
+ detector="transect", fmt="XY", trapcovnames=
+ c("rugg_100","rugg_500","rugg_1000",
+ "m_sl100","m_sl500",
+ "m_sl1000"))
Error in rep(NA, nfield - nvar) : invalid 'times' argument


I tried to create a traps object directly, but again I get this error:
Error in read.traps(file = "trapfile190607.txt", detector = "transect", :
requires 3 fields (detectorID, x, y)


Or using make.capthist

Error in captures[, 1] : incorrect number of dimensions


Using a *.csv file as input, or creating a data frame from either the *.txt or *.csv file, does not solve the problem.
I thought I might have forgotten to add some animals as vertices in the trap file, but I checked several times and they have all been included.

I can send the files for a quick check if someone has time to have a look.

Thanks for any help

Luciano

Re: Polygon Search Detectors

Postby murray.efford » Mon Jun 10, 2019 7:00 pm

Hi
The transect detector option is not one that gets used much, and I have no serious experience with it, but
1. I don't see why you are adding vertices for detection locations.
2. I guess there is an error in your detector file. Look at the examples and try without covariates until you have the basics sorted out. Covariates probably apply at the level of the transect, not the vertex, so add them later (covariates(tr) <- df, where df is a data frame) - see the sketch below.
3. After 5 years, please start a new thread, especially in view of the previous post.
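
For example, the transect file needs just one row per vertex, with the transect identifier repeated (coordinates invented here), and transect-level covariates can be attached to the traps object afterwards:

T1  305.0  1192.5
T1  512.3  1250.0
T1  710.8  1301.2
(and so on for the other transects)

tr <- read.traps(file = "trapfile190607.txt", detector = "transect")
covariates(tr) <- df   # df = data frame of transect-level covariates (rugg_100, m_sl100, etc.)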
Murray
