Shell variable capacity
IIRC, bash itself does not impose a limit on how much data a variable can store; in practice you are limited by the memory available to the bash process (and, for exported variables, by what the kernel will pass in a child's environment). See this answer for a more comprehensive explanation.
As a data point, I tried the following script in OS X 10.10.5, using the built-in bash on a MacBook Pro (Retina) with a 2.8 GHz Intel Core i7:
#!/bin/bash
humombo="X"
while true; do
    humombo="$humombo$humombo"
    echo "Time $(date "+%H:%M:%S"), chars $(echo "$humombo" | wc -c)"
done
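As an aside, `$(echo "$humombo" | wc -c)` forks a pipeline on every pass and counts the newline echo appends; bash's `${#var}` expansion returns the length with no fork (and no +1). A bounded sketch of the same doubling loop, capped at ten iterations so it terminates:

```shell
# Doubling loop with a fixed iteration count, using ${#humombo} for the
# length; starting from 1 char, ten doublings give exactly 2^10 chars.
humombo="X"
for i in {1..10}; do
    humombo="$humombo$humombo"
    echo "chars ${#humombo}"
done
```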
Results: the size happily doubled again and again (note that each reported size includes one extra byte for the trailing newline that wc -c counts). Things started to slow down once humombo passed 4 MB; doubling from 256 MB to 512 MB took 48 seconds, and the script exploded after that:
mbpe:~ griscom$ ./delme.sh
Time 16:00:04, chars 3
Time 16:00:04, chars 5
Time 16:00:04, chars 9
Time 16:00:04, chars 17
Time 16:00:04, chars 33
Time 16:00:04, chars 65
Time 16:00:04, chars 129
Time 16:00:04, chars 257
Time 16:00:04, chars 513
Time 16:00:04, chars 1025
Time 16:00:04, chars 2049
Time 16:00:04, chars 4097
Time 16:00:04, chars 8193
Time 16:00:04, chars 16385
Time 16:00:04, chars 32769
Time 16:00:04, chars 65537
Time 16:00:04, chars 131073
Time 16:00:04, chars 262145
Time 16:00:04, chars 524289
Time 16:00:04, chars 1048577
Time 16:00:04, chars 2097153
Time 16:00:05, chars 4194305
Time 16:00:05, chars 8388609
Time 16:00:07, chars 16777217
Time 16:00:09, chars 33554433
Time 16:00:15, chars 67108865
Time 16:00:27, chars 134217729
Time 16:00:51, chars 268435457
Time 16:01:39, chars 536870913
bash(80722,0x7fff77bff300) malloc: *** mach_vm_map(size=18446744071562072064) failed (error code=3)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
./delme.sh: xrealloc: cannot allocate 18446744071562068096 bytes
mbpe:~ griscom$
Two notes:
I suspect the crash happened because the whole process ran out of memory, not because a single variable hit some built-in capacity limit.
While playing with this, I ran the same commands interactively, and when the loop exited, bash was broken; I had to open a new terminal window to do anything. So exhausting memory breaks bash in unknown ways; my guess is that running it inside a script lets everything get cleaned up when the script exits.
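One way to keep experiments like this from taking the interactive shell down with them is to cap memory in a throwaway subshell (a sketch; `ulimit -v` takes KiB and is enforced on Linux, but macOS does not reliably honor it, hence the guard):

```shell
# Cap the subshell's address space at 64 MiB so the runaway doubling kills
# only the subshell, not the login shell; bail out if -v is unsupported.
(
    ulimit -v 65536 2>/dev/null || exit 1
    humombo="X"
    while true; do
        humombo="$humombo$humombo"
    done
) 2>/dev/null
echo "parent shell survived (subshell exit status $?)"
```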
Edit: I just tried the same code on a high-powered Ubuntu 18 system:
Time 18:03:02, chars 3
Time 18:03:02, chars 5
Time 18:03:02, chars 9
Time 18:03:02, chars 17
Time 18:03:02, chars 33
Time 18:03:02, chars 65
Time 18:03:02, chars 129
Time 18:03:02, chars 257
Time 18:03:02, chars 513
Time 18:03:02, chars 1025
Time 18:03:02, chars 2049
Time 18:03:02, chars 4097
Time 18:03:02, chars 8193
Time 18:03:02, chars 16385
Time 18:03:02, chars 32769
Time 18:03:02, chars 65537
Time 18:03:02, chars 131073
Time 18:03:02, chars 262145
Time 18:03:02, chars 524289
Time 18:03:02, chars 1048577
Time 18:03:02, chars 2097153
Time 18:03:02, chars 4194305
Time 18:03:02, chars 8388609
Time 18:03:03, chars 16777217
Time 18:03:04, chars 33554433
Time 18:03:07, chars 67108865
Time 18:03:12, chars 134217729
Time 18:03:23, chars 268435457
Time 18:03:43, chars 536870913
./delme.sh: xrealloc: cannot allocate 18446744071562068096 bytes
It took less than half the time, and died a bit more cleanly, but at the same character size. (BTW, the number in the error message, decimal 18446744071562068096, is 0xffffffff80000080, i.e. -(2^31) + 128 sign-extended to 64 bits, so some size calculation clearly overflowed a signed 32-bit integer.)
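That sign-extension reading can be checked with bash's own (64-bit, signed) arithmetic; where exactly the overflow happens inside bash's allocator is my guess, but the bit pattern matches the error message exactly:

```shell
# -(2^31) + 128 is what a 32-bit signed value holds after a size just over
# 2 GiB wraps; printf's %u reinterprets the sign-extended 64-bit negative
# number as unsigned, reproducing the byte count from the xrealloc error.
printf '%u\n' "$(( -(2**31) + 128 ))"   # 18446744071562068096
```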