Google Cloud Storage: Where do I get per-bucket statistics

(updated answer 2014/09/23 to reflect changes in the gsutil command)

gsutil du displays the amount of space (in bytes) used by the objects in the hierarchy under a given URL.

  • -s gives a summary total instead of the size of each object.
  • -h prints human-readable sizes instead of bytes.

So:

$ gsutil du -sh gs://BUCKET_NAME
261.46 GB   gs://BUCKET_NAME

... gives the total size of the objects in the bucket. Note that the total is computed on request by listing every object, so it can take a long time for buckets with many objects.

For production use, enable Access Logs & Storage Data for the bucket. The storage data logs record, for each logged bucket, the total byte-hours of storage used over the previous day, from which you can derive the bucket's average size.
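As a sketch of that derivation (the two-field CSV layout below follows the documented storage log format, but the bucket name and byte-hour figure are made up):

```shell
# Storage data logs are CSV files with two fields: "bucket","storage_byte_hours".
# storage_byte_hours is the total byte-hours used over a 24-hour window, so
# dividing by 24 gives the average bucket size in bytes for that day.
echo '"BUCKET_NAME","25678316800"' |
  awk -F',' '{ gsub(/"/, ""); printf "%s: %.0f bytes (avg)\n", $1, $2 / 24 }'
```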

The access logs give details about each request to your logged buckets.
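For example, a quick request count per HTTP method can be pulled from a downloaded log file with awk. The excerpt below keeps only four of the documented access-log fields and uses made-up values; real log lines contain many more columns, which is why the script locates the cs_method column from the header row instead of hardcoding its position:

```shell
# Reduced access-log excerpt (real logs have many more fields).
cat > usage_log.csv <<'EOF'
"time_micros","c_ip","cs_method","sc_status"
"1411500000000000","10.0.0.1","GET","200"
"1411500001000000","10.0.0.2","PUT","200"
"1411500002000000","10.0.0.3","GET","404"
EOF

# Find the cs_method column from the header, then count requests per method.
awk -F',' '
  NR == 1 { for (i = 1; i <= NF; i++) if ($i == "\"cs_method\"") m = i; next }
  { gsub(/"/, "", $m); count[$m]++ }
  END { for (k in count) print k, count[k] }
' usage_log.csv | sort
```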

There is also information on loading the logs into BigQuery for analysis.


Delivery of access logs can be enabled per bucket as documented. When bucket logging is enabled, log files are written to a user-defined logging bucket on a best-effort hourly basis. You can pull the log files from there and parse and count them with your tool of choice. If you don't want to run analytics on the raw logs yourself, you can use a service such as Qloudstat. (Disclaimer: I work for the company behind it.)
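For reference, enabling delivery with a recent gsutil looks roughly like the following (both bucket names are placeholders; check the logging documentation for your gsutil version, as the exact syntax has changed over time):

```shell
# Let the Cloud Storage analytics account write logs into the log bucket.
gsutil acl ch -g cloud-storage-analytics@google.com:W gs://LOG_BUCKET_NAME

# Turn on logging for the bucket, delivering into the log bucket.
gsutil logging set on -b gs://LOG_BUCKET_NAME gs://BUCKET_NAME
```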