Quickly finding the size of an S3 'folder'

The AWS CLI's ls command can do this:

aws s3 ls --summarize --human-readable --recursive s3://$BUCKETNAME/$PREFIX --region $REGION

I prefer using the AWS CLI; I find that the web console often times out when there are too many objects.

  • Replace s3://bucket/ with the bucket (or prefix) you want to start from.
  • Relies on awscli, awk, tail, and a bash-like shell.
start=s3://bucket/ && \
for prefix in $(aws s3 ls "$start" | awk '{print $2}'); do
  echo ">>> $prefix <<<"
  aws s3 ls "$start$prefix" --recursive --summarize | tail -n2
done

or as a one-liner:

start=s3://bucket/ && for prefix in $(aws s3 ls "$start" | awk '{print $2}'); do echo ">>> $prefix <<<"; aws s3 ls "$start$prefix" --recursive --summarize | tail -n2; done
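One caveat: the for-loop word-splits the prefix list, so a prefix containing spaces gets mangled by awk '{print $2}'. Splitting on the literal "PRE " marker that aws s3 ls prints keeps the whole prefix intact. A minimal sketch, with sample lines standing in for the real listing output:

```shell
# Sample lines mimicking `aws s3 ls s3://bucket/` output; in practice, pipe
# the real command in and feed the result to a `while read -r prefix` loop.
printf '%s\n' \
  '                           PRE extracts/' \
  '                           PRE my reports/' |
awk -F'PRE ' '/ PRE /{print $2}'
```

This prints extracts/ and my reports/ as whole lines, which a while read -r loop can consume safely.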

Output looks something like:

$ start=s3://bucket/ && for prefix in $(aws s3 ls "$start" | awk '{print $2}'); do echo ">>> $prefix <<<"; aws s3 ls "$start$prefix" --recursive --summarize | tail -n2; done
>>> extracts/ <<<
Total Objects: 23
   Total Size: 10633858646
>>> hackathon/ <<<
Total Objects: 2
   Total Size: 10004
>>> home/ <<<
Total Objects: 102
   Total Size: 1421736087
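The per-prefix byte counts above can be rolled into a grand total with a small awk filter. A minimal sketch, with a heredoc standing in for the loop's real output (in practice, pipe the loop into the same awk command):

```shell
# Sum every "Total Size" line and report the total in bytes and GiB.
# The heredoc reproduces the sample output shown above.
awk '/Total Size/ {sum += $3} END {printf "%.0f bytes (%.2f GiB) total\n", sum, sum / 1024^3}' <<'EOF'
>>> extracts/ <<<
Total Objects: 23
   Total Size: 10633858646
>>> hackathon/ <<<
Total Objects: 2
   Total Size: 10004
>>> home/ <<<
Total Objects: 102
   Total Size: 1421736087
EOF
```

For the sample data this prints 12055604737 bytes (11.23 GiB) total.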

It looks like AWS has since added a console menu item that shows the size:

[screenshot: size of an S3 folder shown in the console]