Retrieving Automated Google Cloud SQL Backups
The gcloud SDK now provides import/export commands:
gcloud sql export sql <DATABASE_INSTANCE> \
gs://<CLOUD_STORAGE_BUCKET>/cloudsql/export.sql.gz \
--database <DATABASE_NAME>
This export can be downloaded using gsutil. Since the file is a plain SQL dump, it can then be imported with the mysql client (mysqlimport is for tab-delimited data, not SQL dumps).
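For example, a minimal sketch of downloading and restoring the dump locally (the bucket and database names are placeholder assumptions):

```shell
BUCKET="my-backup-bucket"   # assumption: your Cloud Storage bucket
DB="my_database"            # assumption: your local database name

# Download the export from Cloud Storage, then stream it into MySQL.
gsutil cp "gs://${BUCKET}/cloudsql/export.sql.gz" .
zcat export.sql.gz | mysql -u root -p "${DB}"
```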
That's the problem I encountered, and my solution was:
- Go to the IAM Service Accounts management page
- Create a new service account (I called it sql-backuper) and download its access key as JSON
- On the main IAM page, grant it the Viewer and Storage Object Creator roles (currently GCloud doesn't have a separate read-only role for SQL)
- Set it up on the machine that will perform the backups:
  gcloud auth activate-service-account [email protected] --key-file /home/backuper/gcloud-service-account.json
  (gcloud auth documentation)
- Create a new bucket in the GCloud Storage Browser
- Now on your backup machine you can run:
  gcloud sql instances export [sql-instance-name] gs://[bucket-name]/[file-name].gz --database [your-db-name]
  (gcloud sql documentation) and then
  gsutil cp gs://[bucket-name]/[file-name].gz [local-file-name].gz
  (gsutil cp documentation)
- You now have a local DB copy which you can use as you wish
Note that you can now trigger an export operation using the Cloud SQL REST API, so your admin script can start the export and then download the backup from Cloud Storage (you'll need to wait until the export operation finishes, though).
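As a rough sketch, the export can be triggered with curl against the Cloud SQL Admin API's instances.export method (the project, instance, bucket, and database names below are assumptions, and you should check the current API reference for the exact request shape):

```shell
PROJECT="my-project"            # assumption: your GCP project
INSTANCE="sql-instance-name"    # assumption: your Cloud SQL instance
BUCKET="bucket-name"            # assumption: your Storage bucket
DB="your-db-name"               # assumption: your database

# Use the active (service) account's OAuth token to authorize the call.
TOKEN="$(gcloud auth print-access-token)"

# Request an SQL export into the bucket; the response is an operation
# resource that you poll until it reports DONE before downloading.
curl -s -X POST \
  -H "Authorization: Bearer ${TOKEN}" \
  -H "Content-Type: application/json" \
  -d "{\"exportContext\": {\"fileType\": \"SQL\", \"uri\": \"gs://${BUCKET}/export.sql.gz\", \"databases\": [\"${DB}\"]}}" \
  "https://sqladmin.googleapis.com/sql/v1beta4/projects/${PROJECT}/instances/${INSTANCE}/export"
```

You can watch the pending operation with gcloud sql operations list --instance "${INSTANCE}" and download the file once it completes.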