Python deep getsizeof list with contents?
I wrote a tool called RememberMe
exactly for this. Basic usage:
from rememberme import memory
a = [1, 2, 3]
b = [a, a, a]
print(memory(a)) # 172 bytes
print(memory(b)) # 260 bytes. Duplication counted only once.
Hope it helps.
10000 * [x] will produce a list of 10000 references to the same object, so the sizeof is actually closer to correct than you think. However, a deep sizeof is very problematic because it's impossible to tell Python when you want the measurement to stop. Every object references a type object. Should the type object be counted? What if the reference to the type object is the last one, so that deleting the object would make the type object go away as well? What if multiple (different) objects in the list refer to the same string object? Should it be counted once, or multiple times?
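To illustrate the shared-reference point, here is a small sketch (the exact byte counts vary by Python version and platform):

import sys

x = "some string"
lst = 10000 * [x]

# Every slot holds a reference to the very same object.
print(all(item is x for item in lst))   # True

# getsizeof() measures only the list object itself (its header plus
# the array of 10000 pointers), not the single shared string.
print(sys.getsizeof(lst))
print(sys.getsizeof(x))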
In short, getting the size of a data structure is very complicated, and sys.getsizeof() should never have been added :S
If your list only holds objects of the same size, you can get a more accurate estimate by doing this:
import sys

def getSize(array):
    # list overhead + one element's size per entry (assumes uniform element size)
    return sys.getsizeof(array) + len(array) * sys.getsizeof(array[0])
Obviously it's not going to work as well for strings of variable length.
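For instance (the numbers are illustrative, and nums/words are just sample inputs):

nums = [1, 2, 3, 4]
print(getSize(nums))    # reasonable estimate: every int is about the same size

words = ["a", "a much longer string", "mid"]
print(getSize(words))   # poor estimate: only the first string's size is used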
If you only want to calculate the size for debugging or during development and you don't care about performance, you could iterate over all items recursively and calculate the total size, as sketched below. Note that this solution is not going to handle multiple references to the same object correctly.
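A minimal sketch of that recursive approach (the deep_getsizeof name and the set of container types handled are my own choices; as noted above, shared objects get counted once per reference, so it overestimates structures with internal sharing):

import sys

def deep_getsizeof(obj):
    # Naive recursive total: the container's own size plus the deep size of each item.
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_getsizeof(k) + deep_getsizeof(v) for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_getsizeof(item) for item in obj)
    return size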
Have a look at guppy/heapy; I haven't played around with it too much myself, but a few of my co-workers have used it for memory profiling with good results.
The documentation could be better, but this howto does a decent job of explaining the basic concepts.
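If you want to try it, the usual entry point is hpy() (a minimal sketch based on guppy/guppy3's documented usage; treat it as a starting point rather than a full profiling workflow):

from guppy import hpy   # guppy3 on Python 3

hp = hpy()
heap = hp.heap()        # snapshot of objects currently on the heap
print(heap)             # summary grouped by type, with counts and total sizes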