In this article we will rely on basic knowledge of Terraform and a few basic Cloudify features:
- Inputs — for parameterization
- Intrinsic functions — to work with credentials and pass inputs to the Terraform template
- Capabilities — for exposing Terraform outputs
If you are not familiar with the items above, I recommend going over the following article first:
- EC2 instance with Cloudify
What you’ll learn
In this article we will go through a few examples of how to run a Terraform template with Cloudify:
- Package the Terraform template as part of the blueprint package
- Run the Terraform template by referencing a URL or a public Git repo
- Run the Terraform template by referencing a private Git repo
You’ll need to have the following set up on your development machine:
- Git — to work with our tutorial examples
- Python 3.7 — to run the Cloudify CLI (cfy)
- Docker — to run the Cloudify Manager container
- An AWS access key ID and secret access key with permissions to create and delete EC2 instances. You can follow the AWS documentation on how to obtain them.
What is Cloudify?
Cloudify is an open-source orchestration platform that automates the deployment and lifecycle management of applications and cloud infrastructure.
In our current example I’ll walk you step by step through the best practices for spinning up an EC2 instance.
We will create a simple blueprint file that describes how the EC2 instance should be created and upload it to Cloudify Manager.
To spin up the EC2 instance we will create and install a deployment. As part of the install, Cloudify Manager will connect to AWS and create an EC2 instance as described in the blueprint.
Cloudify Installation: Running Cloudify Manager locally is very easy. You’ll need to install Docker on your machine and run the Cloudify Manager Community edition container on it.
To run the container, simply run the following command. Note that it may take some time until the application is fully up and running:
sudo docker run --name cfy_manager_local -d --restart unless-stopped -v /sys/fs/cgroup:/sys/fs/cgroup:ro --tmpfs /run --tmpfs /run/lock --security-opt seccomp:unconfined --cap-add SYS_ADMIN -p 80:80 -p 8000:8000 cloudifyplatform/community-cloudify-manager-aio:latest
Congratulations! You can now browse to http://localhost and log in with the default username and password.
Cloudify CLI installation
You have two ways to work with Cloudify Manager: the UI or the CLI. In this article we will work with the CLI.
To install Cloudify CLI run the following command:
pip install cloudify==6.1.0
Before we start we need to install the AWS and Utilities plugins.
To keep it simple, let’s upload all supported plugins:
cfy plugins bundle-upload
The Terraform template we will be working with is very simple. It has two files.
The first file declares 4 variables:
- Variables for AWS authentication
- aws_amis — to programmatically select the right AMI image for the selected region
main.tf is the Terraform template file. It uses the variables above to provision an EC2 instance of size t2.micro and tag it.
Package the Terraform template as part of the blueprint package
In this example we will package the Terraform template as part of the blueprint package. The Terraform template spins up an EC2 instance on AWS.
First let’s checkout the project:
git clone git@github.com:cloudify-community/cloudify-tutorial.git
There you’ll find two files and one directory:
- blueprint.yaml — the blueprint we will use to run the Terraform template
- terraform.zip — the Terraform template we will use
- template — the directory that contains the Terraform template source code
Let’s walk through the blueprint.
On the inputs side, the only parameter we expect to get is the region, which has us-east-1 as a default value.
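As a minimal sketch, the inputs section might look like the following (the input name aws_region_name is an assumption; check blueprint.yaml in the repo for the actual name):

```yaml
inputs:
  # Hypothetical input name; the actual blueprint may use a different one
  aws_region_name:
    type: string
    description: AWS region in which to create the EC2 instance
    default: us-east-1
```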
To run the Terraform template we need 2 node types.
The first node type describes the Terraform binary we are going to use. When we run the example, the binary file will be downloaded, so you must have access to the URL you are providing.
For the full list of properties, see the Terraform plugin documentation.
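A rough sketch of the Terraform binary node, assuming a recent cloudify-terraform-plugin (the download URL and Terraform version here are placeholders, not the values from the tutorial repo):

```yaml
node_templates:
  terraform:
    # Node type provided by the Cloudify Terraform plugin
    type: cloudify.nodes.terraform
    properties:
      resource_config:
        # The binary is downloaded from this URL at install time,
        # so it must be reachable from the manager (placeholder version)
        installation_source: https://releases.hashicorp.com/terraform/1.1.4/terraform_1.1.4_linux_amd64.zip
```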
The second node type describes the module that should be executed, with all its parameters.
The main properties you should be aware of are:
1. Source — the location, pointing to a relative path inside the blueprint package.
2. Variables — the values to pass to the Terraform template. In our case we pass the AWS credentials from the secret store and the AWS region from the input.
3. The run_on_host relationship that connects the module node to the terraform binary node.
For the full list of properties, see the Terraform plugin documentation.
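Putting the three items above together, a hedged sketch of the module node could look like this (the node name, secret names, and input name are assumptions; the node and relationship types are taken from the Terraform plugin):

```yaml
  cloud_resources:
    type: cloudify.nodes.terraform.Module
    properties:
      resource_config:
        # 1. Source: relative path inside the blueprint package (assumed file name)
        source:
          location: terraform.zip
        # 2. Variables passed to the Terraform template (assumed secret/input names)
        variables:
          access_key: { get_secret: aws_access_key_id }
          secret_key: { get_secret: aws_secret_access_key }
          aws_region: { get_input: aws_region_name }
    relationships:
      # 3. Run the module on the node that provides the Terraform binary
      - target: terraform
        type: cloudify.terraform.relationships.run_on_host
```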
Let’s run the example.
Run the Terraform template by referencing a URL or a public Git repo
An alternative option is to store the Terraform template remotely, either as a URL to a ZIP file or as a public Git repo.
The only change from the previous example is the source location: here it points to a ZIP file, but it can be a Git repo as well. In the case of a Git repo, the URL should end with .git.
If the template is not in the root, we also have to specify the source_path where it’s located.
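For illustration only (the URLs below are placeholders, not the tutorial’s actual repo), the source block for a remote template might look like this:

```yaml
      resource_config:
        source:
          # Either a ZIP archive URL ...
          location: https://github.com/<org>/<repo>/archive/refs/heads/main.zip
          # ... or a public Git repo URL ending with .git:
          # location: https://github.com/<org>/<repo>.git
        # Needed only when the template is not at the archive root
        source_path: template
```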
To run the example, run the install command from within the project directory. A separate blueprint is provided for the public Git example.
Run the Terraform template by referencing a private Git repo
In case you are using a private Git repo, or a URL that requires basic authentication, you can provide a username and password. In our case we have pre-created them in the secret store. See the example below.
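A hedged sketch, assuming the credentials were stored beforehand as secrets (the secret names git_username and git_password, and the repo URL, are placeholders):

```yaml
      resource_config:
        source:
          location: https://github.com/<org>/<private-repo>.git
          # Basic-auth credentials pulled from the Cloudify secret store
          username: { get_secret: git_username }
          password: { get_secret: git_password }
```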
To run the example, run the install command from within the project directory.
The benefits of using Git over a template inside the blueprint package
Following GitOps best practices, you will want a Git repo for your Terraform template: all changes can go through code review, and you can always track them.
In case you’ve applied changes to your Git repo after you created a deployment, you can always run the refresh_terraform_resources workflow to pull the newest changes.
To reload the changes, run the following command, replacing DEPLOYMENT_ID with the deployment ID from one of the examples above:
cfy executions start -d DEPLOYMENT_ID refresh_terraform_resources