This article will help you start using Hashicorp’s Terraform to create infrastructure resources in the Google Cloud.
Prerequisites
You need an account on the Google Cloud Platform (GCP); Google currently offers a starter credit for new users. You also need terraform installed somewhere.
Granting access
First of all, we need to grant terraform access to GCP. This is done with a service account and a corresponding access key. With the GCP command-line tool gcloud it looks like this:
gcloud iam service-accounts create terraform
gcloud iam service-accounts keys create gce-terraform-key.json --iam-account=terraform@<your-project-id>.iam.gserviceaccount.com
The first line creates a new service account named terraform. The reason for a separate service account is that you can grant and revoke its rights independently of the default service account. The second line creates a new access key and exports it to the file gce-terraform-key.json. With this key, terraform can authenticate to GCP.
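As a quick sanity check (optional; assumes gcloud is installed and the key file is in the current directory), the key can be used to authenticate gcloud itself:

```shell
# authenticate gcloud with the new service account key
gcloud auth activate-service-account --key-file=gce-terraform-key.json
# list credentialed accounts; the terraform account should appear
gcloud auth list
```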
However, the newly created account needs write permissions to be able to create any resources. There are several ways to achieve this; here we bind the new service account terraform@<your-project-id>.iam.gserviceaccount.com
to the predefined editor role:
gcloud projects add-iam-policy-binding <your-project-id> --member serviceAccount:terraform@<your-project-id>.iam.gserviceaccount.com --role roles/editor
This should be sufficient to start; for more background on what happens here, take a look at the Google Cloud documentation.
Configure terraform to access Google Cloud
Now we can start terraforming. First, let’s teach terraform how to access GCP. The terraform provider for GCP is called “google”. So if we put this
variable "gce_project" {}
variable "gce_region" { default= "europe-west1"}
provider "google" {
project = "${var.gce_project}"
region = "${var.gce_region}"
}
into main.tf
we are nearly there. But I’ve only provided project and region through terraform variables; we are still missing the credentials property. Since I care about the credentials and do not want to commit them accidentally, I prefer to provide them via the environment variable GOOGLE_CREDENTIALS. This variable needs to contain everything from the gce-terraform-key.json
file we created above. One way to handle it on Linux:
export GOOGLE_CREDENTIALS=$(cat ~/.gce/gce-terraform-key.json)
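While we are setting environment variables: terraform also reads variable values from environment variables prefixed with TF_VAR_, which avoids being prompted for gce_project on every run. A minimal illustration (the project id below is a placeholder):

```shell
# terraform maps TF_VAR_<name> to the terraform variable <name>;
# "my-project-id" is a placeholder for your real project id
export TF_VAR_gce_project="my-project-id"
# any terraform command run in this shell now sees var.gce_project
echo "gce_project is ${TF_VAR_gce_project}"
```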
Now we are ready to test everything. Let’s declare a small Cloud SQL instance in main.tf
resource "google_sql_database_instance" "test-db" {
name = "test-instance"
region = "${var.gce_region}"
settings {
tier = "db-f1-micro"
}
}
and then execute
terraform plan
terraform apply
You will be asked for the value of the gce_project variable; provide it and continue. You may also see an error about restricted API access: not all Google APIs are activated by default, but this kind of error usually points you to a URL where you can activate the API once. Even after activation it may take several minutes, but then it works.
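As an alternative to visiting the URL from the error message, the required API can usually be enabled from the command line as well (a sketch; assumes a recent gcloud and that the Cloud SQL Admin API is the one being requested):

```shell
# enable the Cloud SQL Admin API for the active project
gcloud services enable sqladmin.googleapis.com
```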
If you like, destroy your resources with:
terraform destroy
At this point, you might think about further improvements. One of them is the remote state.
Configuring remote state
Remote state is a great feature when it comes to working in a team. To enable remote state on GCP you only need to define a backend and initialize the new configuration.
Unfortunately, the backend configuration does not support variable interpolation. Even though it is not information as sensitive as an access key, I still do not want to show it to everyone. My solution for now is therefore to put the backend configuration into a separate file, e.g. local_config.tf
, and list that file in .gitignore.
terraform {
  backend "gcs" {
    bucket  = "org.holbreich"
    path    = "terraform/default"
    project = "my_cool_project_id"
  }
}
This file will be omitted from commits, but picked up by terraform as long as it is in the same directory as the rest of the resource definitions. Now don’t forget to initialize the new configuration:
terraform init
You will be asked whether you want to migrate your local state to the new remote location…
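To verify that the migration worked, the current state can be fetched from the configured backend with a standard, read-only terraform command:

```shell
# print the state as stored in the remote (GCS) backend
terraform state pull
```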
However, not everything is shiny with the Google Cloud integration at the moment. The relatively fresh environment states feature is not yet supported by the GCS backend, so for now you will receive errors if you try to work with environment states.
$ terraform env list
named states not supported
Also, state locking is not supported with GCS, which can become important too.
Summary
Even if not everything is supported to the degree it is for AWS, terraform has good support for the most important GCP resources, plus additional features like remote state. Admittedly, environment states should be supported by the remote state backend soon as well to complete the picture. Anyway, I would say it is worth starting to provision GCP with terraform. Existing infrastructure can be integrated with terraform import
. I’ve tested it with several resources and it worked well, even though the import feature is not systematically documented per resource (in contrast to AWS).
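For example, the Cloud SQL instance from above could have been adopted instead of created (a sketch; the resource address must match a resource block in your configuration, and the last argument is the name of the existing instance):

```shell
# bring the existing instance "test-instance" under terraform management
terraform import google_sql_database_instance.test-db test-instance
```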
If you like, you can check the git repo with the code mentioned here (it may evolve in the future). And as always, I appreciate your comments.