How much memory is used by a numpy ndarray?

The array is simply stored in one consecutive block in memory. Assuming by "float" you mean standard double precision floating point numbers, then the array will need 8 bytes per element.

In general, you can simply query the nbytes attribute for the total memory requirement of an array, and itemsize for the size of a single element in bytes:

>>> a = numpy.arange(1000.0)
>>> a.nbytes
8000
>>> a.itemsize
8

In addition to the actual array data, there is also a small data structure holding the meta-information about the array (shape, strides, dtype, and so on). Especially for large arrays, the size of this structure is negligible.


To get the total memory footprint of the NumPy array in bytes, including the metadata, you can use Python's sys.getsizeof() function:

import sys
import numpy as np

a = np.arange(1000.0)

sys.getsizeof(a)

The result is 8104 bytes: the 8000 bytes of data plus the overhead of the array object itself.

sys.getsizeof() works for any Python object. It reports the internal memory allocation, not necessarily the memory footprint of the object once it is written out to some file format. Sometimes it is wildly misleading: for example, it doesn't dig into the memory footprint of data that an array merely references rather than owns.
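One way this can mislead (a minimal sketch): a slice of an array is a view that shares its base array's buffer, so sys.getsizeof() only counts the small view object, not the data it refers to:

```python
import sys
import numpy as np

a = np.arange(1000.0)   # owns an 8000-byte data buffer
v = a[::2]              # a view: no data is copied

print(a.nbytes)          # 8000 -- logical size of a's data
print(v.nbytes)          # 4000 -- logical size of the view's elements
print(sys.getsizeof(v))  # small: just the view object, not the buffer
print(v.base is a)       # True -- the data actually lives in a
```

Here v.nbytes reports the logical size of the elements the view covers, while sys.getsizeof(v) only measures the view object itself; neither alone tells you how much memory is really allocated.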

See the docs here. Ned Batchelder shares caveats with using sys.getsizeof() here.