This article is a follow-up to my original blog post, "First Impressions with Terraform Cloud".
In my original article, I had only tried Terraform Cloud for a few hours and detailed my initial impressions. I was recently on an engagement with The Pentad Group at a large insurance company, and during the initial phase of that engagement, the customer wanted to utilize Terraform Enterprise. This later changed and, for budget reasons, we moved forward with Terraform Open Source, but we did a lot of initial planning, research, and proof-of-concept work with Enterprise that broadened my understanding of the product.
Terraform Cloud vs Terraform Enterprise
As you were reading above, you probably noticed that I began referencing Terraform Enterprise (TFE) in a Terraform Cloud article. That is because Terraform Cloud and Terraform Enterprise are essentially the same product, sharing the same codebase, but with a few differences. The biggest difference is that TF Enterprise is self-hosted: you host the service in your datacenter, which means you are responsible for some of the support (ensuring the machines are running, communication is allowed, etc.). Additionally, there are some caveats in the documentation regarding features that may not be available in TF Enterprise. These are usually minor.
Keep in mind that the documentation at terraform.io always references Terraform Cloud. There is no separate documentation for TF Enterprise; instead, the documentation contains callouts wherever something differs or is unavailable in TFE.
Who chooses Terraform Enterprise over TF Cloud? Usually organizations with a restricted security posture, or those not comfortable with their data traversing a hosted environment they do not control. One additional callout: you do not need Terraform Enterprise to execute Terraform code within your on-premises datacenter. Terraform Cloud can execute runs against "Terraform Cloud Agents", which can be hosted on-prem and in isolated environments.
Existing CI/CD Integration
In my original article, I noted that Terraform Cloud is not a full CI/CD tool. It is designed to run Terraform plan and apply, provide an approval gate, perform a cost check, and run Sentinel checks. This is all done in TF Cloud "runs", which are similar to CI/CD pipelines. As stated in the original article, there are three ways to invoke a TF Cloud run:
Version Control Workflow - When you store your TF config files in a git repository.
CLI-driven workflow - You use Terraform's CLI to execute runs in Terraform Cloud.
API-driven workflow - You trigger TF Cloud using the Terraform API.
Version Control Workflow invocation is the quickest and easiest to set up, but it is also the most limited method. If you use it, you cannot integrate TF Cloud into a larger CI/CD pipeline unless your TF code is in a sub-repo and that pipeline is pushing code to that repo. Even then, you will run into visibility issues between the TF Cloud run and the parent pipeline.
CLI-driven Workflow
Overview
This method uses the Terraform binary to invoke a run via the CLI. This mimics how Terraform Open Source is used, but with some slight differences.
In order to invoke the run, you must first have Terraform Cloud and the correct workspace defined in code:
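A minimal sketch of that configuration using the `remote` backend (the organization and workspace names here are placeholders):

```hcl
terraform {
  backend "remote" {
    hostname     = "app.terraform.io"
    organization = "my-org"          # placeholder organization name

    workspaces {
      name = "tfcloud-cli-test"      # placeholder workspace name
    }
  }
}
```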
Then you will need to log in to Terraform Cloud. This can be done with the "terraform login" command, which opens a browser window for you to authenticate in. Once you authenticate, Terraform grabs a token from that session and uses it for the CLI commands. Alternatively, you can configure your credentials in a CLI config file. The credential file generally contains the Terraform Cloud URL (app.terraform.io) or a TF Enterprise URL and a token. The file will look something like this:
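For example (the token value is a placeholder):

```hcl
credentials "app.terraform.io" {
  token = "xxxxxxxxxxxxxx.atlasv1.zzzzzzzzzzzzzzzzz"  # placeholder token
}
```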
After getting your authentication set, you can run "terraform init" and other Terraform commands. An additional difference is that when you run "terraform plan" or "terraform apply", these runs actually occur in Terraform Cloud and the output gets streamed back to your terminal. There will also be a URL you can click to see the run in the Terraform Cloud UI.
This all looks pretty straightforward and similar to how you have probably used Terraform Open Source. One question you may be asking is - can you use the CLI method in a CI/CD pipeline, like you can with Terraform Open Source? The answer is "Yes!", but there are a few tweaks that have to be made, specifically around the authentication process.
CI/CD Integration
In this section, I will demonstrate a simple pipeline invoking a TF Cloud run with the CLI. For this test, I will be using some Pentad Group lab resources. I will need the following:
Development Workstation (MacBook Pro)
Integrated Development Environment (VSCode)
Terraform Cloud Organization (Free Trial)
Azure Cloud Subscription (Pay-as-You-Go Billing)
Terraform Executable
Azure Pipeline (Free Azure DevOps Subscription)
Azure DevOps Repository (Free Azure DevOps Subscription)
Basic Terraform Code to Deploy Resource Groups
Azure Keyvault Resource (To Store the TF Cloud Token)
Service Principal and Service Connection (To Allow the Pipeline to Auth to Azure Cloud)
Configuration Items:
Note: I will not cover the configuration of every item listed above. The steps below will be specific to getting a TF CLI invoked run working from a pipeline using the above resources. If there is interest in additional articles covering the rest of this setup, please comment on this article.
1. Configure a remote Terraform backend pointing at Terraform Cloud, specifying a workspace and organization.
You will also need to create a matching workspace in TF Cloud with the correct invocation method.
2. Go to the Terraform Cloud UI, click on your user name in the top right corner and then click on "User Settings". On the next page, click on "Tokens" on the left. Then click on "Create an API token".
3. In the box that pops up, write a description for the token and click "Create API token". Then copy the token string and place it in Azure Keyvault as a secret.
4. I will assume that you have some basic working Terraform code stored in an Azure DevOps repo, at this point. In this case, I am using some simple code to deploy some Azure Resource Groups within a Subscription.
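As a hypothetical sketch of that kind of code (the resource group names and location are made up):

```hcl
provider "azurerm" {
  features {}
}

# Deploy a couple of example resource groups (names are placeholders)
resource "azurerm_resource_group" "demo" {
  for_each = toset(["rg-demo-dev", "rg-demo-test"])

  name     = each.value
  location = "eastus2"
}
```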
5. Next, create a pipeline yaml file. I will break down the contents of the yaml file into sections so that I can cover each piece.
```yaml
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'
```
This section covers the trigger, which means any changes to the "main" branch of the repo will cause Azure Pipelines to trigger a pipeline run. The pool section tells the pipeline what Azure DevOps hosted machine pool to use. In this case, we will be using an Ubuntu machine pool to run the pipeline tasks.
6. For the first task in the pipeline job, we will be telling it to access the Keyvault (in Azure) where we stored the TF Cloud token. It will retrieve it and store it as a sensitive variable that can be used in the rest of the pipeline.
```yaml
steps:
- task: AzureKeyVault@2
  displayName: Pull keyvault secrets
  inputs:
    azureSubscription: 'mysub'
    KeyVaultName: 'akv'
    SecretsFilter: 'tfcloud-token'
    RunAsPreJob: true
```
7. The last task in the pipeline is a multi-line, in-line bash script.
```yaml
- task: AzureCLI@2
  displayName: Terraform Init and Plan
  inputs:
    azureSubscription: 'mysub'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      echo 'credentials "app.terraform.io" {token = "$(tfcloud-token)"}' > ~/.terraformrc
      terraform init
      terraform apply
    workingDirectory: $(System.DefaultWorkingDirectory)
```
The in-line script consists of three lines. The first line echoes a string into a .terraformrc file created in the home directory. The .terraformrc file is the credential file we referenced earlier. As you can see, part of the string is the Azure Pipelines variable for the secret we pulled from the vault in the preceding step (denoted by the $). This writes the secret out to the credential file in the home directory. After the run, the hosted build machine is purged, so the file will not be hanging around.
The second and third lines are your normal "terraform init" and "terraform apply". You will notice that we are not doing a "terraform plan" first. That is because, with TF Cloud, an apply triggers a remote run, which consists of a Plan step, Approval step, and an Apply step.
8. Assuming you have all of your other pre-requisites configured correctly, you should be able to push this code to the repo, configure an Azure Pipeline (using the yaml file), and then let it run.
Testing the Pipeline:
1. After pushing a change to the repo, the pipeline triggered.
2. The TF Cloud token was pulled successfully from the Azure Keyvault.
3. In the next step, the credential file was written (nothing was written out to the console), Terraform successfully initialized, and a Terraform apply was triggered.
4. The pipeline will now be in a pending state until we review the run in TF Cloud.
5. Jumping over to TF Cloud and reviewing our workspaces, we have a run to review.
6. Looking at the run, we have a Terraform plan that completed and matches what we saw in Azure pipelines.
7. Looks good, so we go ahead and click "Confirm & Apply" at the bottom.
8. Back in Azure Pipelines, the pipeline has updated to show we approved the apply and we can see the apply in action here, as well as in TF Cloud.
With just a few tweaks, we get a very similar experience to how pipelines generally invoke TF Open Source.
API-driven workflow
Overview
This method uses custom scripting/tooling to execute a series of REST calls against the Terraform Cloud API, creating a new configuration version and then triggering a run. The Terraform code is packaged up as a .tar.gz and sent to the API for the run. Documentation on this approach is here.
You must first have Terraform Cloud and the correct workspace defined in code:
Then you will need to log in to Terraform Cloud. This can be done with the "terraform login" command, which opens a browser window for you to authenticate in. Once you authenticate, Terraform grabs a token from that session and uses it for the CLI commands. Alternatively, you can configure your credentials in a CLI config file. The credential file generally contains the Terraform Cloud URL (app.terraform.io) or a TF Enterprise URL and a token. The file will look something like this:
After getting your authentication set, follow the documentation referenced above to use your own script/tooling, or download the sample script to execute the API calls. Once you have the script, you can execute a run by passing the required arguments to the script.
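Under the hood, the workflow boils down to two API calls against the Terraform Cloud API. The following is a hedged sketch in shell of what such a script does, not the sample script itself; the TOKEN variable and workspace ID are assumed to be supplied by the caller, and jq is assumed to be installed:

```shell
#!/bin/sh
# Sketch of an API-driven push: package the config, create a
# configuration version, and upload the archive (which queues a run).
push_config() {
  config_dir="$1"      # directory containing the Terraform code
  workspace_id="$2"    # TF Cloud workspace ID, e.g. ws-XXXXXXXX

  # 1. Package the Terraform code as a tar.gz
  tar -zcf content.tar.gz -C "$config_dir" .

  # 2. Create a configuration version and capture its upload URL
  upload_url=$(curl -s \
    --header "Authorization: Bearer $TOKEN" \
    --header "Content-Type: application/vnd.api+json" \
    --request POST \
    --data '{"data":{"type":"configuration-versions","attributes":{"auto-queue-runs":true}}}' \
    "https://app.terraform.io/api/v2/workspaces/$workspace_id/configuration-versions" \
    | jq -r '.data.attributes."upload-url"')

  # 3. Upload the archive; this automatically queues a run in the workspace
  curl -s \
    --header "Content-Type: application/octet-stream" \
    --request PUT \
    --data-binary @content.tar.gz \
    "$upload_url"
}

# Example invocation (requires a real TOKEN and workspace ID):
# push_config ./terraform ws-XXXXXXXX
```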
CI/CD Integration
In this section, I will demonstrate a simple pipeline invoking a TF Cloud run with the API method. For this test, I will be using some Pentad Group lab resources. I will need the following:
Development Workstation (MacBook Pro)
Integrated Development Environment (VSCode)
Terraform Cloud Organization (Free Trial)
Azure Cloud Subscription (Pay-as-You-Go Billing)
Terraform Executable
Azure Pipeline (Free Azure DevOps Subscription)
Azure DevOps Repository (Free Azure DevOps Subscription)
Basic Terraform Code to Deploy Resource Groups
Azure Keyvault Resource (To Store the TF Cloud Token)
Service Principal and Service Connection (To Allow the Pipeline to Auth to Azure Cloud)
Configuration Items:
Note: I will not cover the configuration of every item listed above. The steps below will be specific to getting a TF API invoked run working from a pipeline using the above resources. If there is interest in additional articles covering the rest of this setup, please comment on this article.
1. Configure a remote Terraform backend pointing at Terraform Cloud, specifying a workspace and organization.
You will also need to create a matching workspace in TF Cloud with the correct invocation method.
2. Go to the Terraform Cloud UI, click on your user name in the top right corner and then click on "User Settings". On the next page, click on "Tokens" on the left. Then click on "Create an API token".
3. In the box that pops up, write a description for the token and click "Create API token". Then copy the token string and place it in Azure Keyvault as a secret.
4. I will assume that you have some basic working Terraform code stored in an Azure DevOps repo, at this point. In this case, I am using some simple code to deploy some Azure Resource Groups within a Subscription. There is a difference here from the previous CLI method. You will want to make sure all of your Terraform code is in a sub-folder within the repo, as that sub-folder will be packaged up by the script in the next step.
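For example, the repo layout might look like this (file names are illustrative):

```
.
├── azure-pipelines.yml
├── terraform-enterprise-push.sh
└── terraform/
    ├── main.tf
    └── variables.tf
```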
5. We will be using a sample script from HashiCorp, located here. It is important to note the disclaimer in that article.
Copy the content of the script from the article and place it in a file, within the repo, called "terraform-enterprise-push.sh". I won't walk through each part of the script because the article already does a good job of that, but in short, the script packages up the Terraform code and submits it to the TF Cloud API as a tar.gz file.
6. Next, create a pipeline yaml file. I will break down the contents of the yaml file into sections so that I can cover each piece.
```yaml
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'
```
This section covers the trigger, which means any changes to the "main" branch of the repo will cause Azure Pipelines to trigger a pipeline run. The pool section tells the pipeline what Azure DevOps hosted machine pool to use. In this case, we will be using an Ubuntu machine pool to run the pipeline tasks.
7. For the first task in the pipeline job, we will be telling it to access the Keyvault (in Azure) where we stored the TF Cloud token. It will retrieve it and store it as a sensitive variable that can be used in the rest of the pipeline.
```yaml
steps:
- task: AzureKeyVault@2
  displayName: Pull keyvault secrets
  inputs:
    azureSubscription: 'mysub'
    KeyVaultName: 'akv'
    SecretsFilter: 'tfcloud-token'
    RunAsPreJob: true
```
8. The last task in the pipeline is a multi-line, in-line bash script.
```yaml
- task: AzureCLI@2
  displayName: Terraform API Push
  inputs:
    azureSubscription: 'mysub'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      export TOKEN=$(tfcloud-token)
      ./terraform-enterprise-push.sh ./terraform org_name/tfcloud-api-test
    workingDirectory: $(System.DefaultWorkingDirectory)
```
The in-line script consists of two lines. The first line sets an environment variable called "TOKEN" on the pipeline build machine. The value of "TOKEN" is set to the Azure Pipelines variable for the secret we pulled from the vault in the preceding step (denoted by the $). This assigns the TF Cloud token to the environment variable so the machine can authenticate to TF Cloud. After the run, the hosted build machine is purged, so the token will not be hanging around.
The second line invokes the TF Cloud API script we copied down earlier, passing two arguments: the directory containing the Terraform code, and the TF Cloud organization name paired with the targeted workspace. The script packages up the Terraform code in the repo and sends it off to the TF Cloud API to trigger a Terraform apply run.
Testing the Pipeline:
1. After pushing a change to the repo, the pipeline triggered.
2. The TF Cloud token was pulled successfully from the Azure Keyvault.
3. In the next step, the Terraform code was packaged up and sent to the TF Cloud API. A TF Cloud run was triggered successfully.
Note: There is noticeably less feedback in the pipeline, and the pipeline itself does not show the status of the TF Cloud run or wait for the run to finish before completing. This is expected behavior with this sample script; the script can be modified to poll the status of the TF Cloud run and report back accordingly.
4. Looking at the run, we have a Terraform plan that completed and needs approval to move to the apply step.
5. Looks good, so we go ahead and click "Confirm & Apply" at the bottom.
6. After the approval, the apply step kicks off in TF Cloud and we can see that it was successful.
Another Note: While this workspace is defined as an API triggered workspace, you can still trigger runs via the CLI from your local machine. However, the run often does not show up in the "runs" tab of the workspace. You generally need to click on the URL displayed in the terminal to see the CLI invoked run.
Sentinel and Testing
I want to close this article with some notes around testing and Sentinel. Sentinel is not available in the free version of TF Cloud, but it can be used for a limited time with a trial of a paid subscription. Sentinel is a great testing and enforcement tool that can be set to run on select (or all) workspaces within a TF Cloud org. This provides a central control point for governance across all teams using Terraform, and a way to centralize and restrict access to testing and compliance policies.
For third-party tooling, some tools can be integrated with TF Cloud and others cannot. The key determining factor is where the tool fits into the normal TF workflow in a pipeline. If the tool requires a Terraform plan output, there is currently no way to integrate it within the TF Cloud run. This rules out tools like "Terraform-Compliance". As a note, though, HashiCorp is working on "Run Tasks", which will resolve this issue, but it is still a private beta feature at the time of this writing.
For tools that do static code analysis, like Checkov, those can be integrated into the parent pipeline as tasks that run before the Terraform Cloud run is invoked. Once "Run Tasks" goes GA, these types of tools can be integrated into the Terraform Cloud run as well.