Memory problems in R
The following phenomenon happens to me quite often: I try to manipulate some sort of big data, for example

    a <- matrix( rnorm( 1e4 * 200 ), ncol= 1e4 )
    gr <- factor( rep( 1:2, each= 100 ) )
    l <- lm( a ~ gr )
    covs <- estVar( l )
    cors <- cov2cor( covs )
Quite often, the following error is reported:

    Error: cannot allocate vector of size 509.5 Mb
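For context, with the dimensions above the objects genuinely are that large. A quick check with object.size(), assuming the example gets far enough to create them:

    format( object.size( a ), units = "Mb" )     # ~15 Mb: 200 x 10000 doubles
    format( object.size( covs ), units = "Mb" )  # ~763 Mb: the 10000 x 10000 covariance matrix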
Fine. I remove the variables I don't need any more and call the garbage collector:

    rm( a, l )
    gc( TRUE )
However, the error persists. I save the R session, quit, and start R again, and then a miracle happens: the memory is available. Why? If there was not enough memory for R to allocate before, why is there enough now? What has changed? Can I force R to somehow clean up without saving the data to disk and waiting until it loads it again? I don't get it.
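For what it's worth, gc() also returns a summary of how much memory R itself thinks it is holding, so the effect of the clean-up can at least be inspected:

    mem <- gc()   # a matrix with "used" and "(Mb)" columns for Ncells / Vcells
    mem           # the "(Mb)" figures should drop after rm() if memory was really freed
    ls()          # confirm the removed objects are gone from the workspace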
My sessionInfo():

    > sessionInfo()
    R version 3.0.1 (2013-05-16)
    Platform: i486-pc-linux-gnu (32-bit)

    locale:
     [1] LC_CTYPE=en_US.utf8       LC_NUMERIC=C              LC_TIME=en_US.utf8        LC_COLLATE=en_US.utf8     LC_MONETARY=en_US.utf8
     [6] LC_MESSAGES=en_US.utf8    LC_PAPER=C                LC_NAME=C                 LC_ADDRESS=C              LC_TELEPHONE=C
    [11] LC_MEASUREMENT=en_US.utf8 LC_IDENTIFICATION=C

    attached base packages:
    [1] graphics  utils     datasets  grDevices stats     methods   base
P.S.: The system appears to have plenty of unused memory left, as reported by free, and top reports the R process (before the error) using ~2 GB out of 8, with plenty more still available.
Install and use 64-bit R to take advantage of more RAM. See ?"Memory-limits":

    Unix:
    The address-space limit is system-specific: 32-bit OSes impose a limit of no more than 4Gb: it is often 3Gb. Running 32-bit executables on a 64-bit OS will have similar limits: 64-bit executables will have an essentially infinite system-specific limit (e.g. 128Tb for Linux on x86_64 CPUs).
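A quick way to confirm which build you are running, using only base R:

    R.version$arch            # "i486" here; a 64-bit build reports e.g. "x86_64"
    .Machine$sizeof.pointer   # 4 on a 32-bit build, 8 on a 64-bit build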