R on macOS Error: vector memory exhausted (limit reached?)
I had the same problem. Increasing "R_MAX_VSIZE" did not help in my case; instead, removing the variables that were no longer needed solved it. Hope this helps those who are struggling here.
rm(large_df, large_list, large_vector, temp_variables)
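Before resorting to `rm()`, it can help to see which objects are actually large. A minimal sketch (the object names here are placeholders created for the demo, not from the original answer):

```r
# Stand-ins for whatever large objects your own session holds
large_df  <- data.frame(x = rnorm(1e6))
large_vec <- numeric(1e6)

# List workspace objects by size, largest first
sizes <- sapply(ls(), function(nm) object.size(get(nm)))
sort(sizes, decreasing = TRUE)

# Drop what is no longer needed, then run the garbage collector so the
# freed memory is actually reclaimed
rm(large_df, large_vec)
gc()
```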
For those using RStudio: I've found that setting Sys.setenv('R_MAX_VSIZE'=32000000000)
, as has been suggested in multiple Stack Overflow posts, only works when running R from the command line, and that setting that parameter from within RStudio does not prevent this error:
Error: vector memory exhausted (limit reached?)
After doing some more reading, I found this thread, which clarifies the problem with RStudio and identifies a solution, shown below:
Step 1: Open Terminal.
Step 2:
cd ~
touch .Renviron
open .Renviron
Step 3: Save the following as the first line of .Renviron:
R_MAX_VSIZE=100Gb
Note: This limit includes both physical and virtual memory, so setting R_MAX_VSIZE=16Gb on a machine with 16 GB of physical memory may not prevent this error. You may have to tune this parameter, depending on the specs of your machine.
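If you prefer a one-liner, the same .Renviron edit can be scripted from Terminal. This sketch writes to a scratch file in the current directory so it can be run safely; point it at ~/.Renviron for real use:

```shell
# Append the limit to a scratch copy of .Renviron (use ~/.Renviron for real)
RENVIRON=./.Renviron.demo
echo 'R_MAX_VSIZE=100Gb' >> "$RENVIRON"
cat "$RENVIRON"
```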
This can be done through RStudio as well.
library(usethis)
usethis::edit_r_environ()
When the tab opens up in RStudio, add this as the first line: R_MAX_VSIZE=100Gb
(or however much memory you wish to allow).
Restart R (and/or restart your computer) and re-run the command that gave you the memory error.
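As a quick sanity check (not part of the original answers), you can confirm from within the restarted session that the new value was picked up:

```r
# Prints the value saved in .Renviron, e.g. "100Gb";
# an empty string means the file was not read at startup
Sys.getenv("R_MAX_VSIZE")
```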