Deploy AWS infrastructure using Terraform and GitHub Actions

Terraform, an Infrastructure as Code tool, helps us manage infrastructure across several platforms in a consistent manner. In this blog, I will walk through deploying Terraform code using GitHub Actions.

Terraform code to create an S3 bucket

I will walk through each block of the Terraform code that creates an S3 bucket in AWS, and then deploy it through an automated workflow using GitHub Actions.

Let's look at the Terraform files. For clarity, I have created a separate .tf file for each purpose, but you can keep everything in a single .tf file if that is more convenient. Any file name with the .tf extension works; I used the following names.
  • versions.tf
    terraform {
      required_version = ">= 0.14"
      required_providers {
        aws = {
          source  = "hashicorp/aws"
          version = "3.36.0"
        }
      }
      backend "s3" {
        bucket         = "prasanna-aws-demo-terraform-tfstate"
        key            = "creates3.tfstate"
        region         = "ap-south-1"
        dynamodb_table = "prasanna-aws-demo-terraform-tfstatelock"
        encrypt        = true
      }
    }
    <required_version> Specifies which versions of Terraform can be used with your configuration. Terraform 0.14 or later is required to use the above code.

    <required_providers> This block specifies all of the providers required by the current module, mapping each local provider name to a source address and a version constraint.

    <backend> This block keeps the state file in a secure remote location. The state file is the blueprint of the current state of your infrastructure. In this case, I have used an existing S3 bucket. To prevent two team members from writing to the state file at the same time, we use a DynamoDB table for state locking: Terraform acquires the lock before each run against the same state, which prevents conflicts, data loss and state file corruption caused by concurrent runs. Note that the state bucket and the lock table must already exist; see the bootstrap sketch after this list.

  • providers.tf
    provider "aws" {
      region = var.region
    }
    Terraform relies on plugins called "providers" to interact with cloud providers, SaaS providers, and other APIs. var.region is read from a variable, which I will show below in variables.tf.

  • main.tf
    resource "aws_s3_bucket" "prasanna_test" {
      bucket = var.s3_bucket_name
      acl    = var.acl

      tags = merge(
        {
          Name        = var.s3_bucket_name
          Environment = var.Environment
        }
      )
    }
    <resource> declares the actual infrastructure object we are creating. Here the resource type is aws_s3_bucket, which creates the S3 bucket with the provided attributes, and prasanna_test is the local name used to refer to it elsewhere in the configuration.

  • variables.tf
    # common variables
    variable "s3_bucket_name" {
      description = "tags"
      type        = string
      default     = "my-tf-s3-bucket-prasannakn"
    }
    variable "acl" {
      description = "access control list"
      type        = string
      default     = "private"
    }

    variable "Environment" {
      description = "access control list"
      type        = string
      default     = "Dev"
    }

    variable "region" {
      description = "region"
      type        = string
      default     = "ap-south-1"
    }
    These are the variables required by our .tf files.
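Each default above can be overridden at plan or apply time without editing the files; for example (the values here are just for illustration):

    terraform plan \
      -var="s3_bucket_name=my-other-tf-s3-bucket" \
      -var="Environment=Staging"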
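One note on the backend block in versions.tf: terraform init does not create the state bucket or the lock table; they must exist before the first run. Here is a minimal one-time bootstrap sketch using the AWS CLI, with names matching the backend block above (the S3 backend requires the lock table's partition key to be named LockID):

    # One-time bootstrap of the remote state backend (run before terraform init)
    aws s3api create-bucket \
      --bucket prasanna-aws-demo-terraform-tfstate \
      --region ap-south-1 \
      --create-bucket-configuration LocationConstraint=ap-south-1

    # Lock table: the S3 backend requires the partition key to be "LockID"
    aws dynamodb create-table \
      --table-name prasanna-aws-demo-terraform-tfstatelock \
      --attribute-definitions AttributeName=LockID,AttributeType=S \
      --key-schema AttributeName=LockID,KeyType=HASH \
      --billing-mode PAY_PER_REQUEST \
      --region ap-south-1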

Deploy Terraform code via GitHub Actions

Now that we have the Terraform code, let's deploy it to the AWS environment in an automated workflow using GitHub Actions.

As a prerequisite for deploying the resources in AWS, you will have to pass the user credentials to the pipelines through GitHub secrets, as shown below.
Create an IAM user with programmatic access and S3 permissions from the AWS console; please refer to the official AWS documentation for the steps to create an IAM user. Then add the access key ID and secret access key as GitHub secrets (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, which the workflows below reference).
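If you prefer the command line to the web UI, the secrets can also be set with the GitHub CLI; this is a minimal sketch, assuming gh is installed and authenticated against your repo, and the key values are placeholders:

    # Set the repository secrets referenced by the workflows (placeholder values)
    gh secret set AWS_ACCESS_KEY_ID --body "AKIA...your-access-key-id"
    gh secret set AWS_SECRET_ACCESS_KEY --body "your-secret-access-key"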

Let's look into the GitHub workflows. We have the following two pipelines:
  • TerraformTest.yml - Validates the Terraform code; triggers on every push, irrespective of the branch
  • TerraformApply.yml - Deploys the Terraform code; triggers when code is pushed to the main branch, i.e. when a pull request is merged
We use two pipelines because each developer works in their own branch. The first pipeline lets a developer validate the code before raising a pull request, without needing a local Terraform setup. The second pipeline applies the changes in AWS only after the code is merged to the main branch, so every infrastructure change is validated before it reaches main.

Let's look at both pipeline workflows, starting with TerraformTest.yml.

name: Terraform Test

# Trigger the workflow on push to Test the Terraform code
on: push

jobs:
  terraform_test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v1

    - name: Install Terraform
      env:
        TERRAFORM_VERSION: "1.0.11"
      run: |
        tf_version=$TERRAFORM_VERSION
        wget https://releases.hashicorp.com/terraform/"$tf_version"/terraform_"$tf_version"_linux_amd64.zip
        unzip terraform_"$tf_version"_linux_amd64.zip
        sudo mv terraform /usr/local/bin/

    - name: Verify Terraform version
      run: terraform --version

    - name: Terraform init
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      run: terraform init -input=false

    - name: Terraform validation
      run: terraform validate

    - name: Terraform plan
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      run: terraform plan

This pipeline checks out the code on the runner, fetches the credentials from GitHub secrets, initializes Terraform, and validates the code. At the end, terraform plan shows the difference between the current and desired state. This pipeline triggers on every push.
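If you ever want to run the same checks locally before pushing, this is a rough equivalent, assuming the IAM user's credentials are exported in your shell (placeholder values below):

    export AWS_ACCESS_KEY_ID="AKIA...your-access-key-id"
    export AWS_SECRET_ACCESS_KEY="your-secret-access-key"

    terraform init -input=false   # configures the S3 backend
    terraform validate            # checks syntax and internal consistency
    terraform plan                # diffs current state against desired state

The second pipeline, TerraformApply.yml, looks like this: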


name: Terraform Apply

on:
  # Trigger the workflow only on push to the main branch
  # (which happens when a pull request is merged)
  push:
    branches:
      - main

jobs:
  terraform_apply:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v1

    - name: Install Terraform
      env:
        TERRAFORM_VERSION: "1.0.11"
      run: |
        tf_version=$TERRAFORM_VERSION
        wget https://releases.hashicorp.com/terraform/"$tf_version"/terraform_"$tf_version"_linux_amd64.zip
        unzip terraform_"$tf_version"_linux_amd64.zip
        sudo mv terraform /usr/local/bin/

    - name: Verify Terraform version
      run: terraform --version
   
    - name: Terraform init
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      run: terraform init -input=false

    - name: Terraform apply
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      run: terraform apply -auto-approve -input=false

This pipeline checks out the code on the runner, fetches the credentials from GitHub secrets, and runs terraform init followed by terraform apply.


Looking at the tree, below is my folder structure: terraform_deploy is the main folder, and inside it we have all the required files.

terraform_deploy
    │   main.tf
    │   providers.tf
    │   README.md
    │   variables.tf
    │   versions.tf
    │
    └───.github
        └───workflows
                TerraformApply.yml
                TerraformTest.yml

Let's create a new branch and push the code to the GitHub repo. Here is my GitHub repo for your reference. Once you push the code to a new branch of the repo, you can observe the Terraform Test pipeline start running.
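For reference, a typical sequence looks like this (the branch name is just an example):

    git checkout -b feature/create-s3-bucket
    git add .
    git commit -m "Add Terraform config to create an S3 bucket"
    git push -u origin feature/create-s3-bucket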
Terraform Test is green, and you can review the output of each Terraform step in the job.
Let's merge the code to the main branch using a pull request. After the code is merged to main, the Terraform Apply pipeline runs and creates the S3 bucket in AWS.
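Once the apply pipeline is green, you can confirm the bucket exists from the AWS CLI; the bucket name below is the default from variables.tf:

    aws s3 ls | grep my-tf-s3-bucket-prasannakn
    # or, more directly (exits non-zero if the bucket does not exist):
    aws s3api head-bucket --bucket my-tf-s3-bucket-prasannakn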

Conclusion

In this blog post, I have shown you the basics of the Terraform files, and how to deploy Terraform code using GitHub Actions with two pipelines as a best practice. I hope this blog helps you with a similar use case.
Thank you for reading!
