Deploying Django To App Engine With GitHub Actions
May 10, 2022
Deploying to Google Cloud Platform (GCP) from GitHub Actions has not been a straightforward process. The blog posts online are incomplete and hard to follow, so hopefully I can help a little bit by sharing what I found that makes it easy to deploy this website to GCP App Engine and run migrations against a Cloud SQL database.
Deployment
From this blog post I was able to get the general structure of how to deploy from GitHub Actions: https://www.ayobamiadewole.com/Blog/Github-Actions. My app.yaml varies a bit because I do not want to store my secrets in source control. This leads to some pretty interesting problems later on... but anyway, here is the app.yaml file that I am currently using:
runtime: python39

handlers:
# This configures Google App Engine to serve the files in the app's static
# directory.
- url: /static
  static_dir: static/

# This handler routes all requests not caught above to your main app. It is
# required when static routes are defined, but can be omitted (along with
# the entire handlers section) when there are no static files defined.
- url: /.*
  script: auto

includes:
  - env_variables.yaml
The app.yaml is pretty basic and mostly follows the app.yaml found in GCP's own App Engine tutorial for Django. The only difference is the following part:
includes:
  - env_variables.yaml
This allows me to avoid storing my environment variables in source control. However, it also means I now depend on sourcing this env_variables.yaml file into the rest of the pipeline, so that I do not have to track too many duplicated secrets in GitHub Actions and maintain their values in two places.
Initially, I was not sure how to get the contents of this env_variables.yaml file into the environment variables. So, I posted a question on the very scary website that is every programmer's favorite to copy from: Stack Overflow!
https://stackoverflow.com/questions/72093178/how-do-i-read-a-yaml-file-and-outputting-the-yaml-file-values-as-environment-var
Basically, the solution someone suggested to me was to run this one-liner:
python -c 'from pathlib import Path;from ruamel.yaml import YAML; print( "".join( [f"{k}={v!r}\n" for k, v in YAML().load(Path("env_variables.yaml"))["env_variables"].items() if not k.__eq__("DATABASE_CONNECTION_ADDRESS")] ) )'
This reads the environment variable keys and values, prints them in the format KEY=VALUE, and the output then gets sourced into the environment using set -a; eval $(python -c ...); set +a.
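If that one-liner is hard to read (it was for me), here is a more readable sketch of the same logic. The env_variables.yaml layout shown in the comment is just an assumed example, not my real file:

# readable_env.py - a longhand sketch of the one-liner above.
# Assumes env_variables.yaml looks roughly like this (example values only):
#
#   env_variables:
#     SECRET_KEY: "changeme"
#     DATABASE_NAME: "cms"
#     DATABASE_CONNECTION_ADDRESS: "/cloudsql/some-project:us-central1:cms-db"
#
from pathlib import Path
from ruamel.yaml import YAML

SKIP = {"DATABASE_CONNECTION_ADDRESS"}  # dropped so the proxy address can win later

env_vars = YAML().load(Path("env_variables.yaml"))["env_variables"]
for key, value in env_vars.items():
    if key not in SKIP:
        # prints KEY='value' lines that `eval` can turn into shell variables
        print(f"{key}={value!r}")

Running set -a; eval "$(python readable_env.py)"; set +a would export every pair, just like the one-liner in the workflow does.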
So great! We can now turn a YAML file into environment variables! Next we need to base64 encode our env_variables.yaml file and store that value in a GitHub secret (I named mine ENV_FILE). Running base64 env_variables.yaml | pbcopy in your terminal will base64 encode the YAML file and copy the result to your clipboard (assuming you are using a Mac). We can then follow GitHub's tutorial on how to store this base64 value as a secret: https://docs.github.com/en/actions/security-guides/encrypted-secrets
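If you are not on a Mac and do not have pbcopy, a quick Python equivalent that simply prints the base64 string works fine too; paste the output into the GitHub secret by hand:

# encode_env.py - print a base64 copy of env_variables.yaml to stdout
import base64
from pathlib import Path

encoded = base64.b64encode(Path("env_variables.yaml").read_bytes())
print(encoded.decode("ascii"))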
Then we can start with this YAML file as a base and put it at the following path in your project: .github/workflows/deploy.yaml
# .github/workflows/deploy

name: deploy-app-to-gcp
on:
  push:
    branches: [ main ]
    paths:
      - '**'
jobs:
  deploy:
    name: Deploy
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v1
      - uses: actions-hub/gcloud@master
        env:
          PROJECT_ID: ${{secrets.GCLOUD_PROJECT_PROD_ID}}
          APPLICATION_CREDENTIALS: ${{secrets.GCLOUD_GITHUB_CREDENTIALS}}
        with:
          args: app deploy app.yaml
Now that we have a base for our GitHub Actions workflow, we need to create two more secrets:
- GCLOUD_PROJECT_PROD_ID
- GCLOUD_GITHUB_CREDENTIALS
The GCLOUD_PROJECT_PROD_ID value should be the ID of your project in the GCP console.
The GCLOUD_GITHUB_CREDENTIALS value is the base64-encoded JSON key of a GCP service account with the correct permissions to run a deployment.
Read more about the GCP action here: https://github.com/actions-hub/gcloud
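A mangled key is annoying to debug from inside a workflow, so it is worth sanity-checking the value before saving it as a secret. Here is a small sketch, assuming the key you downloaded from GCP is at service-account.json (the path is just an example):

# check_key.py - base64-encode a service account key and confirm it round-trips
# to valid JSON before pasting it into GCLOUD_GITHUB_CREDENTIALS.
import base64
import json
from pathlib import Path

raw = Path("service-account.json").read_bytes()  # example path, use your own
encoded = base64.b64encode(raw).decode("ascii")

# decode it back and make sure it still parses as JSON
key = json.loads(base64.b64decode(encoded))
print("client_email:", key.get("client_email"))
print(encoded)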
Cool, cool! Now that we have those secrets, we can add the following step so that the Django app gets deployed with the environment variables it needs to work:
- name: get env file
  run: |
    echo "${{secrets.ENV_FILE}}" | base64 --decode > ./env_variables.yaml
And boom! You have a working deployment:
# .github/workflows/deploy

name: deploy-app-to-gcp
on:
  push:
    branches: [ main ]
    paths:
      - '**'
jobs:
  deploy:
    name: Deploy
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: get env file
        run: |
          echo "${{secrets.ENV_FILE}}" | base64 --decode > ./env_variables.yaml
      - uses: actions-hub/gcloud@master
        env:
          PROJECT_ID: ${{secrets.GCLOUD_PROJECT_PROD_ID}}
          APPLICATION_CREDENTIALS: ${{secrets.GCLOUD_GITHUB_CREDENTIALS}}
        with:
          args: app deploy app.yaml
Migration
Now that we can successfully deploy our Django app, we need a way to make sure our migrations get applied. We can add another job to our YAML file and call it migrate.
migrate:
  name: Migrate Database
  runs-on: ubuntu-latest
  needs: deploy
  steps:
    - uses: actions/checkout@v1
We will load our env_variables.yaml from our secrets again, as well as our GCLOUD_GITHUB_CREDENTIALS, and write both to disk so they are accessible in the rest of this job.
- name: get env file
  run: |
    echo "${{secrets.ENV_FILE}}" | base64 --decode > ./env_variables.yaml
    echo "${{secrets.GCLOUD_GITHUB_CREDENTIALS}}" | base64 --decode > ./secrets.json
To run our Django migration we will need to install our Python dependencies.
- name: Set up Python 3.9
  uses: actions/setup-python@v2
  with:
    python-version: 3.9
- name: Install dependencies
  run: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
After we install our dependencies, we need a way to connect to our Cloud SQL database. We can connect to the database using the Cloud SQL Proxy. We will just download it for now and use it in a later step.
- name: Get Cloud SQL Proxy
  run: |
    wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
    chmod +x cloud_sql_proxy
Now that we have everything set up and downloaded, we can perform the actual migration. I am going to show the code and then go through what each part means.
- name: migrate Database
  env:
    DATABASE_CONNECTION_ADDRESS: 127.0.0.1
  run: |
    pip install ruamel.yaml
    set -a; eval $(python -c 'from pathlib import Path;from ruamel.yaml import YAML; print( "".join( [f"{k}={v!r}\n" for k, v in YAML().load(Path("env_variables.yaml"))["env_variables"].items() if not k.__eq__("DATABASE_CONNECTION_ADDRESS")] ) )'); set +a
    ./cloud_sql_proxy -instances=${{secrets.GCLOUD_PROJECT_PROD_ID}}:us-central1:cms-db=tcp:5432 -credential_file secrets.json &
    python manage.py migrate
    exit 0;
The two lines below read the env_variables.yaml file and source it into the current environment so it is accessible to Django when it runs its migration. We deliberately exclude DATABASE_CONNECTION_ADDRESS from env_variables.yaml because we want to use the IP of the Cloud SQL Proxy instead, which we have specified in the env of this step (see the settings sketch after this snippet).
pip install ruamel.yaml
set -a; eval $(python -c 'from pathlib import Path;from ruamel.yaml import YAML; print( "".join( [f"{k}={v!r}\n" for k, v in YAML().load(Path("env_variables.yaml"))["env_variables"].items() if not k.__eq__("DATABASE_CONNECTION_ADDRESS")] ) )'); set +a
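For context, this only works because the Django settings read the database host from that environment variable. I have not shown my settings.py in this post, so the snippet below is only an assumed sketch of what that wiring might look like, not my actual configuration:

# settings.py (sketch) - names and defaults here are assumptions, not my real settings.
# The important bit is that HOST comes from DATABASE_CONNECTION_ADDRESS, so the
# workflow can point it at the Cloud SQL Proxy (127.0.0.1) during migrations
# while App Engine keeps using the value from env_variables.yaml.
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "HOST": os.environ["DATABASE_CONNECTION_ADDRESS"],
        "PORT": os.environ.get("DATABASE_PORT", "5432"),
        "NAME": os.environ.get("DATABASE_NAME", ""),
        "USER": os.environ.get("DATABASE_USER", ""),
        "PASSWORD": os.environ.get("DATABASE_PASSWORD", ""),
    }
}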
In this next step we run the Cloud SQL Proxy and give it the credential file we wrote to disk earlier in the job. We also run the proxy in the background using the & symbol so that it does not block the shell and we can continue on with the migration.
./cloud_sql_proxy -instances=${{secrets.GCLOUD_PROJECT_PROD_ID}}:us-central1:cms-db=tcp:5432 -credential_file secrets.json &
Then finally we run the migration and exit!
python manage.py migrate
exit 0;
And wow, our migration step is done!
Final GitHub Actions File
name: deploy-app-to-gcp
on:
  push:
    branches: [ main ]
    paths:
      - '**'
jobs:
  deploy:
    name: Deploy
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v1
      - name: get env file
        run: |
          echo "${{secrets.ENV_FILE}}" | base64 --decode > ./env_variables.yaml
          echo "${{secrets.SECRET_JSON}}" | base64 --decode > ./secrets.json
      - uses: actions-hub/gcloud@master
        env:
          PROJECT_ID: ${{secrets.GCLOUD_PROJECT_PROD_ID}}
          APPLICATION_CREDENTIALS: ${{secrets.GCLOUD_GITHUB_CREDENTIALS}}
        with:
          args: app deploy app.yaml
  migrate:
    name: Migrate Database
    runs-on: ubuntu-latest
    needs: deploy
    steps:
      - uses: actions/checkout@v1
      - name: get env file
        run: |
          echo "${{secrets.ENV_FILE}}" | base64 --decode > ./env_variables.yaml
          echo "${{secrets.GCLOUD_GITHUB_CREDENTIALS}}" | base64 --decode > ./secrets.json
      - name: Set up Python 3.9
        uses: actions/setup-python@v2
        with:
          python-version: 3.9
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Get Cloud SQL Proxy
        run: |
          wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
          chmod +x cloud_sql_proxy
      - name: migrate Database
        env:
          DATABASE_CONNECTION_ADDRESS: 127.0.0.1
        run: |
          pip install ruamel.yaml
          set -a; eval $(python -c 'from pathlib import Path;from ruamel.yaml import YAML; print( "".join( [f"{k}={v!r}\n" for k, v in YAML().load(Path("env_variables.yaml"))["env_variables"].items() if not k.__eq__("DATABASE_CONNECTION_ADDRESS")] ) )'); set +a
          ./cloud_sql_proxy -instances=${{secrets.GCLOUD_PROJECT_PROD_ID}}:us-central1:cms-db=tcp:5432 -credential_file secrets.json &
          python manage.py migrate
          exit 0;