Retrieving Automated Google Cloud SQL Backups

The gcloud SDK now provides import/export functionality:

gcloud sql export sql <DATABASE_INSTANCE> \
    gs://<CLOUD_STORAGE_BUCKET>/cloudsql/export.sql.gz \
    --database <DATABASE_NAME>

This export can be downloaded using gsutil. Since it is a plain SQL dump, it can also be imported with the mysql client or with gcloud sql import sql.
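
For example, a minimal sketch of the download and import, using the same placeholders as above (the mysql option assumes a local MySQL server you can load the dump into):

# Download the export from Cloud Storage
gsutil cp gs://<CLOUD_STORAGE_BUCKET>/cloudsql/export.sql.gz ./export.sql.gz

# Load it into a local MySQL server...
gunzip -c export.sql.gz | mysql -u root -p <DATABASE_NAME>

# ...or import it back into a Cloud SQL instance
gcloud sql import sql <DATABASE_INSTANCE> \
    gs://<CLOUD_STORAGE_BUCKET>/cloudsql/export.sql.gz \
    --database <DATABASE_NAME>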


That's the problem I encountered, and my solution was:

  1. Go to IAM Service Accounts management
  2. Create a new service account (I called it sql-backuper) and download its access key as JSON
  3. Grant it the Viewer and Storage Object Creator roles on the main IAM page (currently GCloud doesn't have a separate read-only role for SQL)
  4. Set it up on the machine that will do the backups: gcloud auth activate-service-account sql-backuper@[project-id].iam.gserviceaccount.com --key-file /home/backuper/gcloud-service-account.json (gcloud auth documentation)
  5. Create a new bucket in the GCloud Storage Browser
  6. Now, on your backup machine, run: gcloud sql instances export [sql-instance-name] gs://[bucket-name]/[file-name].gz --database [your-db-name] (gcloud sql documentation), then gsutil cp gs://[bucket-name]/[file-name].gz [local-file-name].gz (gsutil cp documentation); a combined sketch of steps 4-6 follows this list
  7. You now have a local copy of the DB which you can use however you want

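Put together, steps 4 to 6 can be wrapped in a small script and run from cron on the backup machine. This is only a minimal sketch: the bracketed names are placeholders from the steps above, and the key-file path assumes the layout from step 4.

#!/bin/bash
set -e

# Step 4: authenticate as the backup service account
gcloud auth activate-service-account \
    sql-backuper@[project-id].iam.gserviceaccount.com \
    --key-file /home/backuper/gcloud-service-account.json

# Step 6: export the database into the Cloud Storage bucket
gcloud sql instances export [sql-instance-name] \
    gs://[bucket-name]/[file-name].gz \
    --database [your-db-name]

# Step 6: download the export to the backup machine
gsutil cp gs://[bucket-name]/[file-name].gz [local-file-name].gz
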
Note that you can now trigger an Export operation using the Cloud SQL REST API.

So your admin script can trigger the export and then download the backup from Cloud Storage (you'll need to wait until the export operation finishes, though).
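
A rough sketch of that flow, assuming the Cloud SQL Admin API (v1) is enabled for the project; the bracketed names are placeholders, and python3 is used here only as a convenient way to pull fields out of the JSON responses:

# Get a short-lived access token for the active gcloud account
TOKEN=$(gcloud auth print-access-token)

# Ask the Cloud SQL Admin API to export the database to Cloud Storage
OPERATION=$(curl -s -X POST \
    -H "Authorization: Bearer ${TOKEN}" \
    -H "Content-Type: application/json" \
    -d '{"exportContext": {"fileType": "SQL", "uri": "gs://[bucket-name]/[file-name].gz", "databases": ["[your-db-name]"]}}' \
    "https://sqladmin.googleapis.com/v1/projects/[project-id]/instances/[sql-instance-name]/export" \
    | python3 -c 'import sys, json; print(json.load(sys.stdin)["name"])')

# Poll the operation until it reports DONE
while true; do
    STATUS=$(curl -s -H "Authorization: Bearer ${TOKEN}" \
        "https://sqladmin.googleapis.com/v1/projects/[project-id]/operations/${OPERATION}" \
        | python3 -c 'import sys, json; print(json.load(sys.stdin)["status"])')
    [ "${STATUS}" = "DONE" ] && break
    sleep 10
done

# The export has finished; download it from Cloud Storage
gsutil cp gs://[bucket-name]/[file-name].gz [local-file-name].gz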