Deploying the frontend and backend of our Guestbook single-page application required several manual steps. Manual deployment of code is repetitive and error-prone. "Infrastructure as Code" seeks to eliminate this by allowing you to deploy resources into your cloud project programmatically. Examples include platform-specific tools such as AWS CloudFormation and GCP's Cloud Deployment Manager, as well as platform-agnostic tools such as HashiCorp's Terraform and Kubernetes.

In this lab we will repeat the deployment of the API version of our Guestbook, but deploy it to GCP using a Python program that leverages the Deployment Manager API.

Note that this lab assumes the guestbook service account has already been created. We would typically create this service account within the Deployment Manager specification to make it self-contained; however, for the sake of brevity, we assume it was set up in the previous labs.

Important

If the entries and entry Cloud Functions from the previous labs still exist, delete them first so they do not conflict with the functions Deployment Manager is about to create:

gcloud functions delete entries
gcloud functions delete entry

We will now examine the Deployment Manager code for deploying the REST API backend. In Cloud Shell, change into its source directory:

cd cs430-src/06_gcp_restapi_cloudfunctions/dm

View the restapi-deployment.py file.

Declarative infrastructure specification

As with many Infrastructure-as-Code approaches, Deployment Manager centers on a declarative specification for defining and instantiating platform resources. The specification is written in YAML. For simple deployments, one can create the YAML file and deploy it directly from the command line as shown below:

# Example only (do not run).
gcloud deployment-manager deployments create my-deployment --config my-deployment.yaml

More sophisticated deployments will use Python to both build the YAML file programmatically from a template and perform the deployment via the Deployment Manager API. While Jinja templates are often used to generate the YAML with run-time parameters, for this lab we will build the template using a Python f-string. The template f-string is shown below. It resides in a function that takes as parameters the current project, the location (region) in which the function will reside, the URL that contains the source code of the function, and the name of the function. The YAML template specifies two things: the Cloud Function resource itself, and a setIamPolicy action that grants allUsers the Cloud Functions Invoker role so the function can be invoked without authentication.

restapi-deployment.py

def generateDeploymentYaml(project_id, location_name, upload_url, function_name):
    yaml = f'''
    - name: {function_name}
      type: gcp-types/cloudfunctions-v1:projects.locations.functions
      properties:
        function: {function_name}
        parent: projects/{project_id}/locations/{location_name}
        sourceUploadUrl: {upload_url}
        entryPoint: {function_name}
        runtime: python37
        httpsTrigger: {{}}
        serviceAccountEmail: guestbook@{project_id}.iam.gserviceaccount.com

    - name: {function_name}-iam
      action: gcp-types/cloudfunctions-v1:cloudfunctions.projects.locations.functions.setIamPolicy
      properties:
        resource: $(ref.{function_name}.name)
        policy:
          bindings:
          - role: roles/cloudfunctions.invoker
            members:
            - allUsers
    '''
    return yaml
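
As a quick sanity check (not part of the lab code), the generated template can be fed to PyYAML to confirm it is well-formed. The values below are hypothetical and PyYAML is assumed to be installed; it is imported under an alias to avoid clashing with the yaml variable used later in the script.

# Hypothetical sanity check of the generated template (not in restapi-deployment.py)
import yaml as yaml_parser

spec = 'resources:\n' + generateDeploymentYaml(
    'my-project', 'us-central1',
    'https://storage.googleapis.com/example-uploads/function.zip', 'entries')

parsed = yaml_parser.safe_load(spec)
print(parsed['resources'][0]['name'])   # entries
print(parsed['resources'][1]['name'])   # entries-iam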

While YAML is helpful for declaratively specifying the platform resources to instantiate, deployments typically also require some programmatic access in order to work with the code assets being deployed. This is done imperatively in Python code.

Imperative code

Prior labs have shown how to use the supported Google Cloud Platform Python packages to access its main services, such as Cloud Datastore. Many services, however, do not have supported Python packages in PyPI. To access these services from within Python, we can either call their APIs directly via HTTP requests or use standard API discovery services (OpenAPI, Google's API Discovery) to generate client bindings for them.

To begin with, we import the API discovery package as well as the platform's auth package in order to handle the credentials we need to perform the deployment. Note that when we run this program from Cloud Shell, the deployment is carried out under the Google APIs service agent's identity. That account's default level of access is the project-level "Editor" role, which we will need to add to shortly.

import apiclient.discovery
import google.auth

The main code is below. It takes the name of the deployment as a command-line argument and sets the region for the deployment. It then obtains the GCP project ID and a set of credentials for performing the deployment. From these, it constructs the email of the service account the Cloud Functions will run as (assuming it was created in the prior labs). It then calls the internal function uploadCloudFunction(), which creates a zip file of the code implementing the Guestbook, uses the Cloud Functions API to upload it to a temporary bucket, and returns the URL of the zip file in that bucket.

restapi-deployment.py

location_name = 'us-central1'
deployment_name = sys.argv[1]

# Obtain default credentials and the project ID for the deployment
credentials, project_id = google.auth.default()
service_account = f"guestbook@{project_id}.iam.gserviceaccount.com"

# Upload Cloud Function code
upload_url = uploadCloudFunction(credentials)
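
The uploadCloudFunction() helper itself is not reproduced above. Below is a minimal sketch of how such a helper can work; it assumes the requests package is available, uses a hypothetical parameter list and file list (main.py and requirements.txt in source_dir), and the actual helper in restapi-deployment.py may differ in detail. It relies on the Cloud Functions v1 API's generateUploadUrl method, which returns a signed URL that the zipped source is then uploaded to.

import io
import zipfile

import requests
import apiclient.discovery

def upload_cloud_function_sketch(credentials, project_id, location_name, source_dir):
    # Build the Cloud Functions API client via API discovery
    functions_api = apiclient.discovery.build(
        'cloudfunctions', 'v1', credentials=credentials)

    # Ask the API for a signed URL that the source archive can be uploaded to
    parent = f'projects/{project_id}/locations/{location_name}'
    upload_url = functions_api.projects().locations().functions().generateUploadUrl(
        parent=parent, body={}).execute()['uploadUrl']

    # Zip the function source in memory (hypothetical file list)
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, 'w', zipfile.ZIP_DEFLATED) as archive:
        for path in ['main.py', 'requirements.txt']:
            archive.write(f'{source_dir}/{path}', arcname=path)

    # Upload the archive; these headers are required by the signed URL
    requests.put(upload_url,
                 data=buffer.getvalue(),
                 headers={'content-type': 'application/zip',
                          'x-goog-content-length-range': '0,104857600'})
    return upload_url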

The program then performs the main deployment. It uses the Discovery package to build the Deployment Manager API object, then generates the YAML for both REST API endpoints (/entries and /entry). It then creates a request payload that includes the name of the deployment and its YAML specification. Finally, it executes the deployment by calling the Deployment Manager API's insert() method.

deployment_api = apiclient.discovery.build(
                       'deploymentmanager',
                       'v2', credentials=credentials)

yaml = f'''resources:
{generateDeploymentYaml(project_id, location_name, upload_url, 'entries')}
{generateDeploymentYaml(project_id, location_name, upload_url, 'entry')}'''

request_body = {
    "name": deployment_name,
    "target": {
        "config": {
            "content": yaml
        },
    }
}
operation = deployment_api.deployments().insert(
                  project=project_id,
                  body=request_body).execute()
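
The insert() call returns immediately with a long-running operation. Below is a sketch of how the script can wait for the deployment to finish and then report the base URL of the endpoints; the exact reporting logic in restapi-deployment.py may differ.

import time

# Poll the long-running operation until Deployment Manager reports it is done
while operation['status'] != 'DONE':
    time.sleep(10)
    operation = deployment_api.operations().get(
        project=project_id,
        operation=operation['name']).execute()

if 'error' in operation:
    print(operation['error'])
else:
    # HTTP-triggered Cloud Functions share a per-project, per-region base URL
    print(f'Endpoint: https://{location_name}-{project_id}.cloudfunctions.net/')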

Before deploying our API, we need to grant additional privileges to the Google APIs service account used to deploy our code. It has been given "Editor" access, which lets it create the functions, but that role does not allow it to change a function's permissions; doing so requires administrator privileges. We therefore attach the appropriate role (Cloud Functions Admin) to the service account. In the web console, visit IAM and locate the service account. It has the format <acct_number>@cloudservices.gserviceaccount.com as shown below.

Click on the pencil icon to edit this service account. Then add the Cloud Functions Admin role to it as shown below:

Click "Save" and verify that the role has been added to the service account.

As before, these steps may also be done via the command line in Cloud Shell. The steps below place the project number into an environment variable (to identify the service account), then add a policy binding that grants the account the Cloud Functions Admin role.

export PROJECT_NUMBER=$(gcloud projects describe $GOOGLE_CLOUD_PROJECT --format='value(projectNumber)')

gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
  --member serviceAccount:${PROJECT_NUMBER}@cloudservices.gserviceaccount.com \
  --role roles/cloudfunctions.admin
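
If you prefer to stay in Python, below is a sketch of the same binding via the Cloud Resource Manager API, using the same discovery-based client as the deployment scripts; the gcloud commands above are the approach the lab expects.

import apiclient.discovery
import google.auth

credentials, project_id = google.auth.default()
crm_api = apiclient.discovery.build(
    'cloudresourcemanager', 'v1', credentials=credentials)

# Look up the project number that appears in the service account's email
project_number = crm_api.projects().get(projectId=project_id).execute()['projectNumber']
member = f'serviceAccount:{project_number}@cloudservices.gserviceaccount.com'

# Read-modify-write the project's IAM policy to add the Cloud Functions Admin role
policy = crm_api.projects().getIamPolicy(resource=project_id, body={}).execute()
policy['bindings'].append({'role': 'roles/cloudfunctions.admin', 'members': [member]})
crm_api.projects().setIamPolicy(
    resource=project_id, body={'policy': policy}).execute()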

We will now deploy our API. In Cloud Shell, create a Python virtual environment that includes all of the packages needed for deployment (google-api-python-client and google-auth).

cd cs430-src/06_gcp_restapi_cloudfunctions/dm
virtualenv -p python3 env
source env/bin/activate
pip install -r requirements.txt

Before deploying, ensure you have unset the GOOGLE_APPLICATION_CREDENTIALS environment variable in Cloud Shell if it has been set in previous labs.

unset GOOGLE_APPLICATION_CREDENTIALS

Then, deploy the REST APIs using your OdinID to ensure a uniquely named deployment.

python3 restapi-deployment.py <OdinID>-restapi

After initiating the deployment, visit the Deployment Manager console, find the deployment, and click on it. When the deployment is complete (after about 5 minutes), take a screenshot of its success as below:

Back in Cloud Shell, the Python script will report an endpoint for the REST APIs. Copy this endpoint and save it somewhere. We will use it to configure the client in an upcoming portion of the lab.

Finally, visit the web console for Cloud Functions and take a screenshot that includes the two functions that were created in the deployment.

As before, our goal is to deploy the web app to a storage bucket with static website hosting enabled and public access allowed. Go back to the Deployment Manager directory.

cd cs430-src/06_gcp_restapi_cloudfunctions/dm

The deployment script for the frontend is similar to that of the REST API. The main difference is with the YAML specification. For the frontend, the only platform resource we want to create is the storage bucket to hold our static web assets. As shown below, we use the name of the deployment given by the user for the name of the bucket. We also specify that the permissions on the bucket be set to allow public access for both the bucket and its objects.

frontend-deployment.py

yaml = f''' resources:
  - type: gcp-types/storage-v1:buckets
    name: {deployment_name}
    properties:
      region: {location_name}
      storageClass: STANDARD
      acl:
        - role: READER
          entity: allUsers
      defaultObjectAcl:
        - entity: allUsers
          role: READER
'''

After deploying the bucket, we simply populate it with the content of the site, as shown in the code below. Note that since Google maintains a Python package for accessing its Cloud Storage service (google-cloud-storage), we can use that instead of the API discovery package.

from google.cloud import storage
storage_client = storage.Client()
bucket = storage_client.bucket(deployment_name)
for object_file in ['index.html', 'static/guestbook.js', 'static/style.css']:
    blob = bucket.blob(object_file)
    blob.upload_from_filename(f'../frontend-src/{object_file}')
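
After the loop above runs, a quick way to confirm the objects landed in the bucket is to list its blobs and print their public URLs. This check is not part of frontend-deployment.py.

# List the uploaded objects and their public URLs as a sanity check
for blob in storage_client.list_blobs(deployment_name):
    print(blob.name, blob.public_url)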

We will now deploy the frontend using a separate Deployment Manager program. Before doing so, however, we need to edit our guestbook.js code to point to the endpoints we deployed in the previous step. Visit the source directory containing the application:

cd cs430-src/06_gcp_restapi_cloudfunctions/frontend-src

Change the baseApiUrl to the endpoint URL returned by the REST API deployment (e.g. https://<region_name>-<project_id>.cloudfunctions.net/).

frontend-src/static/guestbook.js

const baseApiUrl = "<FMI>";

Change back into the Deployment Manager directory:

cd cs430-src/06_gcp_restapi_cloudfunctions/dm

Then, deploy the frontend. Make sure the name you use for the deployment contains your OdinID so that the bucket it creates has a unique name.

python frontend-deployment.py <OdinID>-frontend

The frontend will be deployed in a bucket. As with the prior codelab that used a storage bucket to host the site's static assets, you can access its content via the following URL:

https://storage.googleapis.com/<BucketName>

Visit the index.html file in this bucket:

https://storage.googleapis.com/<BucketName>/index.html

Add an entry with the message "Hello Deployment Manager!".

Delete the storage bucket and the Cloud Functions either via the web console UI or from Cloud Shell via the CLI.

gsutil rm -r gs://<OdinId>-frontend

Then, list the two deployments via the command line.

gcloud deployment-manager deployments list

Using Deployment Manager's command-line interface, delete both deployments:

gcloud deployment-manager deployments delete <NameOfRestAPIDeployment> <NameOfFrontendDeployment>

Note that you may also delete the deployments via the web console.