Cloud Run job to back up a Cloud SQL instance to a GCS bucket.
Commands:
After running gcloud builds submit to build and deploy the container, run the scheduler command below to schedule the backup job. Remember to change the --location, --oauth-service-account-email and --uri flags to match your environment. Also make sure the service account has the correct permissions to run the job.
- Create the workflow with
gcloud workflows deploy sql-backup --source workflow.yaml --location europe-north1 --project $(gcloud config get-value project)
NOTE: This uses the default service account. You can, and SHOULD, change it with the --service-account flag: the default service account has more permissions than the principle of least privilege allows.
- Execute the workflow. Remember to change the --data values to match your environment.
gcloud workflows execute sql-backup --location europe-north1 --project $(gcloud config get-value project) --data='{"bucket":"BUCKET_NAME","database":"DB_NAME","instance":"SQL_INSTANCE_NAME"}'
- Schedule the workflow to run periodically with Cloud Scheduler from the Cloud Console.
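The deployed workflow reads bucket, database and instance from the --data argument above. The actual workflow.yaml is not shown here; as a hypothetical sketch only, a minimal workflow for this kind of backup could call the Cloud SQL Admin export connector like this (the object naming is an assumption):

```yaml
# Hypothetical sketch of workflow.yaml — the real file may differ.
main:
  params: [args]
  steps:
    - export:
        call: googleapis.sqladmin.v1beta4.instances.export
        args:
          project: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          instance: ${args.instance}
          body:
            exportContext:
              # Assumed naming scheme: one object per database.
              uri: ${"gs://" + args.bucket + "/" + args.database + ".sql"}
              databases:
                - ${args.database}
              fileType: SQL
```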
- Log in to the gcloud CLI with
gcloud auth application-default login
- Set the project ID with
gcloud config set project <project-id>
- Download the latest Terraform
- cd into the terraform folder
- Run
terraform apply
with the freshly installed Terraform and enter the requested variables
- Enjoy deploying (this takes a while)
- Grant permissions to Cloud Build at https://console.cloud.google.com/cloud-build/settings/service-account?project= by enabling the Cloud Run and Service Accounts roles
- Run
gcloud builds submit
in the project root to build and deploy the container
- Remember to give the SQL instance's service account permission to access the bucket:
SA_EMAIL="serviceAccount:"$(gcloud sql instances describe CHANGE_ME_TO_INSTANCE_NAME --format='value(serviceAccountEmailAddress)' --project $(gcloud config get-value project))
gcloud projects add-iam-policy-binding $(gcloud config get-value project) --member=$SA_EMAIL --role roles/storage.objectUser
gcloud scheduler jobs create http cloudsqlbackup-schedule \
--location europe-west1 \
--schedule "0 0 * * *" \
--uri https://europe-north1-run.googleapis.com/apis/run.googleapis.com/v1/namespaces/taikuri/jobs/cloudsqlbackup:run \
--oauth-service-account-email X@developer.gserviceaccount.com \
--oauth-token-scope https://www.googleapis.com/auth/cloud-platform
gcloud storage buckets create gs://tietokanta-backups \
--enable-autoclass \
--location europe-north1 \
--public-access-prevention
Lifecycle management deletes old backups once there are more than 365 backups in the bucket or once a backup is older than 365 days; this keeps storage costs in check.
gcloud storage buckets update gs://tietokanta-backups --versioning --lifecycle-file bucket-lifecycle-config.json
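For reference, a bucket-lifecycle-config.json matching the rules described above could look roughly like the sketch below. Note that the numNewerVersions condition counts noncurrent versions of the same object name, so it only caps the backup count if versioning is enabled (as it is in the command above) and backups overwrite the same object; treat this as an assumption, not the exact file.

```json
{
  "rule": [
    {
      "action": { "type": "Delete" },
      "condition": { "age": 365 }
    },
    {
      "action": { "type": "Delete" },
      "condition": { "numNewerVersions": 365 }
    }
  ]
}
```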
gcloud alpha monitoring channels create \
--display-name "Email notif for maansiirto" \
--description "Job that does backups of db" \
--type email \
--channel-labels email_address=maansiirto@hiondigital.com
Remember to change monitoring-alert-policy.json to match your environment, especially the notificationChannels array. Use the channel name returned by the previous command as the array value.
gcloud alpha monitoring policies create --policy-from-file=monitoring-alert-policy.json
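The actual monitoring-alert-policy.json is not reproduced here. As a starting point only, a policy that fires when a Cloud Run job execution fails could look roughly like this sketch; the metric filter is an assumption, and PROJECT_ID / CHANNEL_ID must be replaced with the values returned by the channel-create command above:

```json
{
  "displayName": "Cloud SQL backup job failed",
  "combiner": "OR",
  "notificationChannels": [
    "projects/PROJECT_ID/notificationChannels/CHANNEL_ID"
  ],
  "conditions": [
    {
      "displayName": "Failed backup job execution",
      "conditionThreshold": {
        "filter": "resource.type = \"cloud_run_job\" AND metric.type = \"run.googleapis.com/job/completed_task_attempt_count\" AND metric.labels.result = \"failed\"",
        "comparison": "COMPARISON_GT",
        "thresholdValue": 0,
        "duration": "0s",
        "aggregations": [
          {
            "alignmentPeriod": "300s",
            "perSeriesAligner": "ALIGN_SUM"
          }
        ]
      }
    }
  ]
}
```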
Bad developer experience: types and tooling don't work well with VS Code or WebStorm. Next time, use regular TypeScript.
Error:
The client is not authorized to make this request
Cause: the Cloud SQL instance does not exist, i.e. the request references the name of a non-existent instance.