Complete R Session Size
I personally use this function to get the available memory:
getAvailMem <- function(format = TRUE) {
  gc()
  if (Sys.info()[["sysname"]] == "Windows") {
    # Windows: memory.limit() and memory.size() report in MB
    memfree <- 1024^2 * (utils::memory.limit() - utils::memory.size())
  } else {
    # Linux: read the free memory (in kB) from /proc/meminfo
    # http://stackoverflow.com/a/6457769/6103040
    memfree <- 1024 * as.numeric(
      system("awk '/MemFree/ {print $2}' /proc/meminfo", intern = TRUE))
  }
  `if`(format, format(structure(memfree, class = "object_size"),
                      units = "auto"), memfree)
}
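Call it with format = TRUE (the default) for a human-readable value, or with format = FALSE for the raw number of bytes; the output is of course machine-dependent:

getAvailMem()               # formatted, e.g. something like "6.2 Gb"
getAvailMem(format = FALSE) # the same amount as a plain number of bytes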
To get the total memory used by R, you may try mem_used() from the pryr package. Unlike memory.size(), it is not OS-dependent, because it relies on the R function gc() underneath. Have a look at its body, and also at pryr:::node_size and pryr:::show_bytes:
pryr::mem_used()
The help file ?pryr::mem_used describes it as follows:

R breaks down memory usage into Vcells (memory used by vectors) and Ncells (memory used by everything else). However, neither this distinction nor the "gc trigger" and "max used" columns are typically important. What we're usually most interested in is the first column: the total memory used. This function wraps around gc() to return the total amount of memory (in megabytes) currently used by R.
You can also use pryr::mem_change to track how the memory used by R changes as code runs. Try the example on its documentation page.
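For instance, something along these lines (the exact figures reported will vary between sessions):

pryr::mem_change(x <- numeric(1e6))  # allocates roughly 8 MB
pryr::mem_change(rm(x))              # releases it again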
The numbers 28L and 56L used for the node size in pryr:::node_size come from the help file ?gc, which describes:

gc returns a matrix with rows "Ncells" (cons cells), usually 28 bytes each on 32-bit systems and 56 bytes on 64-bit systems, and "Vcells" (vector cells, 8 bytes each),
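Putting the two quotes together, a rough equivalent of pryr::mem_used() can be sketched directly from gc(); this is only an illustration (mem_used_approx is not part of pryr), assuming the usual gc() column layout and the standard node sizes:

mem_used_approx <- function() {
  node_size <- if (.Machine$sizeof.pointer == 8) 56 else 28  # bytes per Ncell
  used <- gc()[, 1]                     # first column: Ncells and Vcells in use
  bytes <- sum(used * c(node_size, 8))  # Ncells * node size + Vcells * 8 bytes
  format(structure(bytes, class = "object_size"), units = "auto")
}
mem_used_approx()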
After removing a large object, run gc() to free the memory it used.
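For example (the object here is purely illustrative):

x <- numeric(1e8)  # about 800 MB of doubles
rm(x)              # removes the binding, but the memory is not reclaimed yet
gc()               # triggers garbage collection so the space can be reused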