We all know that infrastructure as code (IaC) is a widely adopted pattern for provisioning and managing infrastructure. Good examples are Terraform, AWS CloudFormation, and Azure ARM templates. Because these tools rely on declarative configuration languages, one could argue that they are mostly designed and built for system administrators and operations folks. But what about developers? If a developer wants to provision resources with any of these tools, they need to learn that configuration language; in other words, they cannot reuse their knowledge of the programming language of their choice.
CDK for Terraform (CDKTF) allows you to provision infrastructure using everyday programming languages. This means developers have less context switching between their day-to-day work, implementing features in the product they are developing, and provisioning the underlying infrastructure for that product. Moreover, using general-purpose programming languages means they can leverage the power of those languages and deliver more maintainable and readable code through object orientation. They can also write unit tests to improve the reliability of their infrastructure.
But this does not mean that operators now have to learn everyday programming languages to provision infrastructure. The goal of this tutorial is to showcase how different teams within an organization can collaborate and be more effective.
For this tutorial we consider two personas. We have a DB admin, who doesn’t have programming skills but knows how to configure their database using Terraform HCL. And we have a development team, who is proficient in TypeScript and would like to avoid context switching between their programming language and Terraform HCL. Above all, we’d like everyone to bring their expertise and choose the best tool for the job, while not being locked into a single cloud provider.
The overall architecture is as follows:
First, create two workspaces in Terraform Cloud: one for the DB admin and one for the development team. We’d like to have a GitOps workflow. For the Terraform HCL code this is straightforward: following this guide, you can configure the Terraform Cloud workspace to use a GitHub repository. https://developer.hashicorp.com/terraform/tutorials/cloud/cloud-run-triggers To enable a GitOps workflow for the CDKTF code, we will leverage GitHub Actions. Use this guide to configure the GitHub Actions workflow. https://developer.hashicorp.com/terraform/tutorials/automation/github-actions
Our DB admin has a lot of expertise with MongoDB and would like to provision a MongoDB instance on MongoDB Atlas. This is a fairly simple process with Terraform HCL. Our developers would like to implement the business logic, let’s say a basic sentiment analysis application, with Lambda functions and API Gateway. They’d like to use TypeScript to implement this application as well as the infrastructure which supports it.
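To make the developers’ side concrete, here is a minimal sketch of what such a sentiment handler could look like. Everything in it is illustrative, not the repository’s actual implementation: the word lists, the toy scoring, and the event shape are stand-ins, and a real service would use a proper NLP library and persist results to MongoDB via the connection string.

```typescript
// Hypothetical sketch of a sentiment-scoring Lambda handler.
// The word lists and scoring below are illustrative stand-ins only.

const POSITIVE = new Set(["good", "great", "love", "excellent"]);
const NEGATIVE = new Set(["bad", "terrible", "hate", "awful"]);

export function scoreSentiment(text: string): number {
  let score = 0;
  for (const word of text.toLowerCase().split(/\W+/)) {
    if (POSITIVE.has(word)) score += 1;
    if (NEGATIVE.has(word)) score -= 1;
  }
  return score;
}

// Loosely shaped like an API Gateway proxy integration event.
export async function handler(event: { body: string }) {
  const { text } = JSON.parse(event.body);
  // Provided by the CDKTF stack as a Lambda environment variable.
  const db = process.env.CONNECTIONSTRING;
  return {
    statusCode: 200,
    body: JSON.stringify({
      text,
      sentiment: scoreSentiment(text),
      dbConfigured: Boolean(db),
    }),
  };
}
```

The point of the sketch is the boundary it draws: the handler only reads its database configuration from the environment, which is exactly the seam where the two teams’ work meets.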
All that said, our developers need to persist data, so they need the connection string from the DB admin. To avoid secret sprawl, we will leverage Terraform Cloud, which gives our organization a single control plane and a medium through which we can properly control access to sensitive information, in our case the connection string of the database.
Clone the following repository to obtain the HCL code for the database.
git clone https://github.com/pedramha/mongodb_hahsitalks.git
Use the following guide to connect this repository to Terraform Cloud.
https://developer.hashicorp.com/terraform/tutorials/cloud/cloud-run-triggers
The code is fairly simple: it creates a MongoDB Atlas cluster and a database user, and outputs the connection string of the database. The connection string is sensitive information, so we will store it in Terraform Cloud. Please note that the code is not production ready; it is for demonstration purposes only. It allows all incoming connections from the internet to the database, which is not a good practice. In a real-world scenario, you might want to use VPC peering to connect your Lambda in AWS to the database on MongoDB Atlas.
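For orientation, HCL along these lines typically follows the usual `mongodbatlas` provider pattern. The sketch below is illustrative, not the repository’s exact code: resource arguments, variable names, and the free-tier sizing are assumptions.

```hcl
resource "mongodbatlas_cluster" "db" {
  project_id                  = var.project_id
  name                        = "sentiment-db"
  provider_name               = "TENANT"
  backing_provider_name       = "AWS"
  provider_region_name        = "US_EAST_1"
  provider_instance_size_name = "M0"
}

resource "mongodbatlas_database_user" "app" {
  project_id         = var.project_id
  username           = var.db_username
  password           = var.db_password
  auth_database_name = "admin"
  roles {
    role_name     = "readWrite"
    database_name = "sentiment"
  }
}

# Open to the internet for demo purposes only -- not for production.
resource "mongodbatlas_project_ip_access_list" "open" {
  project_id = var.project_id
  cidr_block = "0.0.0.0/0"
}

# Marked sensitive so Terraform Cloud masks the value in its UI.
output "connection_strings" {
  value     = mongodbatlas_cluster.db.connection_strings
  sensitive = true
}
```

The `sensitive = true` on the output is what makes the Terraform Cloud workspace a safe hand-off point: the development team can consume the value via remote state without it being displayed in plan output.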
You can follow this guide to get started with CDKTF: https://developer.hashicorp.com/terraform/tutorials/cdktf/cdktf-install. This tutorial builds on top of that.
You can clone the following repository and have a look at the code:
git clone https://github.com/pedramha/cdktf-hashitalks.git
Under the path “.github/workflows/cdktf-deploy.yml” you can find the GitHub Actions workflow. This workflow will deploy the CDKTF code to Terraform Cloud. You can also modify it to run unit tests or any other additional steps your use case might require.
Navigate to line 54 of the “main.ts” file; there you can see how we read the connection string from the database workspace and pass it as an environment variable to our Lambda function (of course, you could use a more secure mechanism to pass it to the Lambda function).
// Read the database workspace's state from Terraform Cloud
const remoteState = new DataTerraformRemoteState(this, "remote-state", {
  organization: "your-organisation",
  hostname: "app.terraform.io",
  workspaces: {
    name: "your-workspace-name",
  },
});

const lambdafunc = new lambdafunction.LambdaFunction(this, "lambdaFunc", {
  functionName: "HelloLambda",
  s3Bucket: assetBucket.bucket,
  s3Key: lambdaArchive.key,
  sourceCodeHash: lambdaArchive.sourceHash,
  handler: "index.handler",
  runtime: "nodejs14.x",
  role: role.arn,
  environment: {
    variables: {
      // The sensitive output exposed by the database workspace
      CONNECTIONSTRING: remoteState.get("connection_strings").toString(),
    },
  },
});