Any solution to memory limitation problems?

Questions concerning analysis/theory using program DENSITY and R package secr. Focus on spatially-explicit analysis.

Any solution to memory limitation problems?

Postby deb » Sun Feb 20, 2011 11:47 pm

Hi Murray
These problems relate to my skink capture data you looked at a few months ago. This computer has 3 GB of memory and I'd just like to check if there is anything I can do to get some fairly simple models running. E.g. I can run
mtb.hab.sig <- secr.fit(caps09, model = list(g0 ~ t+b+hab, sigma~hab), mask = M09_1_12, CL=TRUE, detectfn = 0)

but not
mtb.sp.sig <- secr.fit(caps09, model = list(g0 ~ t+b+sp, sigma~sp), mask = M09_1_12, CL=TRUE, detectfn = 0)

The latter tries to allocate a 200 Mb vector; sp has 3 levels whereas hab in the previous model has only 2 levels.

I get very different AICc values and density estimates when I swap hab for siz (also 2 levels), for example, so I am keen to try other variables too.

Also, when using a continuous variable svl, secr wants a 6GB block even when I just use part of my dataset, e.g.
mtb.svl.sig <- secr.fit(caps09$In, model = list(g0 ~ t+b+svl, sigma~svl), mask = In09_1m_12m, CL=TRUE, detectfn = 0)

I've checked that svl is numeric and not a factor. You helped me with this variable once before and I'm wondering if every measurement is still being treated as a different level?
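
(For what it's worth, this is roughly how I checked, assuming the right place to look is the individual covariates attached to the capthist, e.g. for the In session:)

sapply(covariates(caps09$In), class)   # svl is listed as "numeric", not "factor"
str(covariates(caps09$In))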

Happy to send you the files etc...
Cheers
Deb
deb
 
Posts: 1
Joined: Sun Oct 24, 2010 7:50 pm

Re: Any solution to memory limitation problems?

Postby murray.efford » Mon Feb 21, 2011 1:19 am


There are no miracle cures, I'm sorry, just compromises. Using individual covariates can be demanding because of the way secr covers all possible combinations of levels, and I haven't had pressure to refine that part of the code - your report may help change that, but I can't tackle it for a month or two.

I uploaded a new secr version 2.0 to CRAN today (binaries probably available there later in the week); I don't remember making any changes that would help with this problem, but it might be worth a try. I found I could run your 'mtb.hab.sig' and 'mtb.sp.sig' examples (at least the first few likelihood evaluations) using masks of about 6000 points/session (probably many more than are needed) in both 32-bit and 64-bit R 2.12.1 on 64-bit Windows 7. Also, I seriously doubt the value of complex models even if that is where AIC leads you - maybe check whether modelling time variation really affects the estimates?
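
If memory stays tight, the cheapest saving is usually a smaller or coarser mask. The following is only a sketch - the buffer and spacing values are placeholders guessed from your mask names, and I've used your 'In' session as the example:

M_coarse <- make.mask(traps(caps09$In), buffer = 12, spacing = 2, type = "trapbuffer")
nrow(M_coarse)   # fewer mask points means smaller arrays inside secr.fit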

You are right about continuous individual covariates: at one point in the code an array is allocated using a multiple of the number of unique values, even when the covariate is continuous, so every distinct measurement does inflate memory. Discretizing to, say, 5 classes instead of 50 should help, with no serious loss of information. Your tack of treating sessions separately seems a good idea - I would think you have enough data to estimate the detection function each time without needing to pool data, even if pooling might be more elegant.
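
For the discretizing, something like this should do it (a rough sketch, not tested; 'capsIn', 'svl5' and the choice of 5 equal-width classes are just placeholders):

capsIn <- caps09$In                                                  # single session, as in your svl example
covariates(capsIn)$svl5 <- cut(covariates(capsIn)$svl, breaks = 5)   # numeric svl -> factor with 5 classes
mtb.svl5.sig <- secr.fit(capsIn, model = list(g0 ~ t+b+svl5, sigma~svl5), mask = In09_1m_12m, CL=TRUE, detectfn = 0)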

Another option available in principle, but maybe not in practice, is to pool data from grids of the same shape and divide the resulting density (and its SE) by the number of pooled grids. That way you drastically reduce the number of unique trap sites.
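
Crudely, the last step of that would be something like the following (sketch only; 'fit.pooled' and the number of grids are placeholders, and I'm assuming you would use derived() to get density from a CL fit):

n.grids <- 3                        # however many identical grids were pooled
d <- derived(fit.pooled)            # rows 'esa' and 'D'; columns include estimate and SE.estimate
d["D", "estimate"] / n.grids        # density per original grid
d["D", "SE.estimate"] / n.grids     # and its SE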

Murray
murray.efford
 
Posts: 686
Joined: Mon Sep 29, 2008 7:11 pm
Location: Dunedin, New Zealand

Re: Any solution to memory limitation problems?

Postby howeer » Mon Feb 28, 2011 4:03 pm

Hi Deb,
I often use the R function "memory.limit" to increase the amount of RAM allocated to R when fitting secr models.
You should be able to allocate vectors much larger than 200 Mb.

e.g.
memory.limit(3000) # request that all 3 GB of RAM be made available to R; if you get an error message, try
memory.limit(2000)
then execute secr.fit.

It may help to close other programs before setting memory.limit. You can re-start them afterwards.
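
To see where you are before and after changing the limit (note both functions are Windows-only):

memory.limit()            # current limit, in MB
memory.size(max = TRUE)   # most memory R has obtained from Windows so far, in MB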

Cheers,
Eric
howeer
 
Posts: 39
Joined: Wed Jun 21, 2006 10:49 am

