Terraforming the Google Cloud

This article will help you start using HashiCorp's Terraform to create infrastructure resources in Google Cloud.

Prerequisites

You need an account on the Google Cloud Platform (GCP); for new users Google offers a starter credit. You also need Terraform installed somewhere.
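
If you want to verify that both tools are available, a quick check like this (assuming gcloud and Terraform are already on your PATH) should print their version information:

gcloud version  
terraform version  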

Granting access

First of all, we need to allow Terraform to access GCP. This is done with a service account and a corresponding access key.
Using the GCP command-line tool gcloud, it looks like this:

gcloud iam service-accounts create terraform  
gcloud iam service-accounts keys create gce-terraform-key.json --iam-account=terraform@<your-project-id>.iam.gserviceaccount.com  

The first line creates a new service account named terraform. The reason for a separate service account is that you can grant and revoke its rights independently of the default service account. The second line creates a new access key and exports it to the file gce-terraform-key.json. With this key Terraform can authenticate to GCP.
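
If you want to double-check what was created, gcloud can list the new account and its keys (assuming the same account name and project placeholder as above):

gcloud iam service-accounts list  
gcloud iam service-accounts keys list --iam-account=terraform@<your-project-id>.iam.gserviceaccount.com  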

However, the newly created service account needs write permissions to be able to create any resources. There are several ways to achieve this; here we bind the new service account terraform@<your-project-id>.iam.gserviceaccount.com to the predefined editor role:

gcloud projects add-iam-policy-binding <your-project-id> --member serviceAccount:terraform@<your-project-id>.iam.gserviceaccount.com --role roles/editor  

This should be sufficient to start; for more background on what happens here, take a look at the Google Cloud documentation.
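
To verify that the binding took effect, you can inspect the project's IAM policy; the terraform service account should show up as a member of roles/editor:

gcloud projects get-iam-policy <your-project-id>  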

Configuring Terraform to access Google Cloud

Now we can start with Terraform. First, let's teach Terraform how to access GCP. The Terraform provider for GCP is called "google". So if we put this

variable "gce_project" {}  
variable "gce_region" { default= "europe-west1"}

provider "google" {  
  project     = "${var.gce_project}"
  region      = "${var.gce_region}"
}

into main.tf, we are nearly there. But I've only provided project and region through Terraform variables; we are still missing the credentials property. Since I care about credentials here and do not want to commit them accidentally, I prefer to provide them via the environment variable GOOGLE_CREDENTIALS. This variable [1] needs to contain the entire content of the gce-terraform-key.json file we created above. One way to handle this on Linux:

export GOOGLE_CREDENTIALS=$(cat ~/.gce/gce-terraform-key.json)  

Now we are ready to test everything. Let's declare a small Cloud SQL instance in main.tf:

resource "google_sql_database_instance" "test-db" {  
  name = "test-instance"
  region      = "${var.gce_region}"

  settings {
    tier = "db-f1-micro"
  }
}
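
Optionally you can expose an attribute of the new instance as a Terraform output. This is just a sketch; the exported attributes may differ between provider versions:

output "test_db_self_link" {  
  # self_link of the Cloud SQL instance declared above
  value = "${google_sql_database_instance.test-db.self_link}"
}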

and then execute

terraform plan  
terraform apply  

You will be asked for the value of the gce_project variable; provide it and continue (see below for how to supply it non-interactively). You will probably also see an error about restricted API access. Not all Google APIs are activated by default, but this kind of error usually points you to a particular URL where you can activate the API once. Even after activation it may take several minutes before it works.
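
If you prefer not to be prompted, the variable can also be supplied via a -var flag or via an environment variable following Terraform's TF_VAR_ naming convention, for example:

export TF_VAR_gce_project=<your-project-id>  
terraform plan -var "gce_project=<your-project-id>"  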

If you like, destroy your resources with:

terraform destroy  

At this point you might think about further improvements. One of them is remote state.

Configuring remote state

Remote state is a great feature when it comes to working in a team. To enable remote state with GCP, you only need to define a backend and initialize the new configuration.

Unfortunately, the backend configuration does not support variable interpolation. Even though it is not as sensitive as an access key, I probably do not want to show it to everyone. Therefore my current solution is to put the backend configuration into a separate file, e.g. local_config.tf, and list it in .gitignore.

terraform {  
  backend "gcs" {
    bucket  = "org.holbreich"
    path    = "terraform/default"
    project = "my_cool_project_id"
  }
}
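
A minimal set of .gitignore entries for this setup could look like the following; local_config.tf is the important one, the rest are the usual Terraform working files:

local_config.tf  
.terraform/  
terraform.tfstate  
terraform.tfstate.backup  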

local_config.tf will then be omitted from commits, but still picked up by Terraform as long as it sits in the same directory as the rest of the resource definitions. Now don't forget to apply the new configuration:

terraform init  

You will be asked if you want to migrate your local state to the new remote location.

However, not everything is shiny with the Google Cloud integration at the moment. The relatively fresh state environments feature is not yet supported by the GCS backend. For now you will receive an error if you try to work with state environments.

$ terraform env list
named states not supported  

State locking is also not supported by the GCS backend, which can become important as well.

Summary

Even if not everything is supported to the degree that AWS is, Terraform has good support for the most important GCP resources and for additional features like remote state. State environments should hopefully be supported by the GCS backend soon as well to complete the picture. Anyway, I would say it is worth starting to provision GCP with Terraform. Existing infrastructure can be integrated with terraform import; I've tested it with several resources and it worked well, even though the import feature is not consistently documented per resource (in contrast to AWS).
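
As an illustration, importing an already existing Cloud SQL instance into the resource definition from above might look roughly like this (the resource address and instance name are of course specific to your setup):

terraform import google_sql_database_instance.test-db test-instance  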

If you like, you can check the Git repo with the code mentioned here (it may evolve in the future). As always, I appreciate your comments.

  1. In fact, not only GOOGLE_CREDENTIALS is checked; the region and project can also be defined via environment variables. Consult the Terraform documentation for more details.