In-memory size of a Python structure

The recommendation from an earlier question on this was to use sys.getsizeof(), quoting:

>>> import sys
>>> x = 2
>>> sys.getsizeof(x)
14
>>> sys.getsizeof(sys.getsizeof)
32
>>> sys.getsizeof('this')
38
>>> sys.getsizeof('this also')
48
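
Note that the exact figures depend on the interpreter version and build (the numbers above come from a 32-bit Python 2). A small sketch, assuming CPython 3 on a typical 64-bit build; sys.getsizeof() also accepts an optional default that is returned when an object's type doesn't report its size:

import sys

# Sizes are implementation details: they vary between Python versions,
# 32- vs 64-bit builds, and even the value of the object itself.
print(sys.getsizeof(2))        # e.g. 28 on a 64-bit CPython 3.x
print(sys.getsizeof('this'))   # e.g. 53 on a 64-bit CPython 3.x

# The optional second argument is a fallback used when the object's type
# does not implement __sizeof__ (mostly third-party extension types);
# without it, getsizeof() raises TypeError for such objects.
print(sys.getsizeof(object(), 0))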

You could take this approach:

>>> import sys
>>> import decimal
>>> 
>>> d = {
...     "int": 0,
...     "float": 0.0,
...     "dict": dict(),
...     "set": set(),
...     "tuple": tuple(),
...     "list": list(),
...     "str": "a",
...     "unicode": u"a",
...     "decimal": decimal.Decimal(0),
...     "object": object(),
... }
>>> for k, v in sorted(d.items()):
...     print(k, sys.getsizeof(v))
...
decimal 40
dict 140
float 16
int 12
list 36
object 8
set 116
str 25
tuple 28
unicode 28

2012-09-30

python 2.7 (linux, 32-bit):

decimal 36
dict 136
float 16
int 12
list 32
object 8
set 112
str 22
tuple 24
unicode 32

python 3.3 (linux, 32-bit):

decimal 52
dict 144
float 16
int 14
list 32
object 8
set 112
str 26
tuple 24
unicode 26

2016-08-01

OSX, Python 2.7.10 (default, Oct 23 2015, 19:19:21) [GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.0.59.5)] on darwin:

decimal 80
dict 280
float 24
int 24
list 72
object 16
set 232
str 38
tuple 56
unicode 52

These answers all collect shallow size information. I suspect that visitors to this question are really here to answer, "How big is this complex object in memory?"
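
For example (a quick sketch; the exact numbers vary by platform), sys.getsizeof() only counts the container itself, not the objects it references:

import sys

data = ["x" * 1000, "y" * 1000, "z" * 1000]

# Only the list object (its header plus the array of pointers) is counted;
# the three 1000-character strings are not included.
print(sys.getsizeof(data))                    # a few dozen bytes
print(sum(sys.getsizeof(s) for s in data))    # roughly 3 KB of string data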

There's a great answer here: https://goshippo.com/blog/measure-real-size-any-python-object/

The punchline:

import sys

def get_size(obj, seen=None):
    """Recursively finds size of objects"""
    size = sys.getsizeof(obj)
    if seen is None:
        seen = set()
    obj_id = id(obj)
    if obj_id in seen:
        return 0
    # Important: mark as seen *before* entering recursion to gracefully handle
    # self-referential objects
    seen.add(obj_id)
    if isinstance(obj, dict):
        size += sum([get_size(v, seen) for v in obj.values()])
        size += sum([get_size(k, seen) for k in obj.keys()])
    elif hasattr(obj, '__dict__'):
        size += get_size(obj.__dict__, seen)
    elif hasattr(obj, '__iter__') and not isinstance(obj, (str, bytes, bytearray)):
        size += sum([get_size(i, seen) for i in obj])
    return size

Used like so:

In [1]: get_size(1)
Out[1]: 24

In [2]: get_size([1])
Out[2]: 104

In [3]: get_size([[1]])
Out[3]: 184
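
Because of the __dict__ branch, it also follows instance attributes, so it works for simple user-defined objects as well (Point here is just a hypothetical class for illustration):

class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point([1, 2, 3], "hello")

# get_size(p) adds the size of p itself, of its __dict__, and
# (recursively) of the list and string stored as attributes.
print(get_size(p))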

If you want to understand Python's memory model more deeply, there's a great article here that includes a similar "total size" snippet of code as part of a longer explanation: https://code.tutsplus.com/tutorials/understand-how-much-memory-your-python-objects-use--cms-25609


I've been happily using pympler for such tasks. It's compatible with many versions of Python -- the asizeof module in particular goes back to 2.2!

For example, using hughdbrown's example but with from pympler import asizeof at the top and asizeof.asizeof(v) in place of sys.getsizeof(v), I see (system Python 2.5 on Mac OS X 10.5):

$ python pymp.py 
set 120
unicode 32
tuple 32
int 16
decimal 152
float 16
list 40
object 0
dict 144
str 32

Clearly there is some approximation here, but I've found it very useful for footprint analysis and tuning.
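
For reference, the same idea on a current Python 3 might look like the sketch below (pympler installed from PyPI; the reported numbers will differ by platform and pympler version):

from pympler import asizeof

nested = {"points": [(i, i * 2) for i in range(100)], "label": "demo"}

# asizeof() walks referents recursively, so unlike sys.getsizeof()
# it includes the tuples, the ints and the string inside the dict.
print(asizeof.asizeof(nested))

# asized() returns a breakdown of where the bytes go; detail controls
# how deep the per-referent report goes.
print(asizeof.asized(nested, detail=1).format())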