Infrastructure as Code is getting all the attention it deserves, and everyone is trying to find their way to `Completely automated Infrastructure Provisioning & Management`.

There are a lot of tools available in the market now, such as Terraform, AWS CloudFormation, Chef, Puppet, and SaltStack.

There are some differences between these tools. Some of them are configuration management tools (Ansible, Chef, SaltStack) and some of them are purely provisioning tools (Terraform, CloudFormation). You have to make your own judgement based on various factors, such as how easy each tool is to learn and adopt.

Though Terraform is a wonderful cross-platform provisioning tool, I feel it has a steep learning curve, while Ansible is not that complex to start with and is easy to learn and adopt.

So this post is for those (including me) who love Ansible's simplicity and structure and want to see how it works with AWS.

This is going to be a quick introduction to the Ansible AWS EC2 module.
Table of Contents
1. How Ansible works with AWS EC2 – Setup Boto for Ansible
2. Environment Setup for Ansible to work with AWS EC2 module
3. Ansible Playbook to create a new EC2 instance
3.1. Setup AWS Authentication Before Running the Playbook with Ansible EC2
3.2. Getting ready to execute the playbook – Ansible AWS EC2
3.3. Execution Part – Run the playbook with Ansible EC2
4. A Playbook with Ansible EC2 & Ansible Vault – Secure Approach
4.1. Saving the AWS Secrets
5. Execution of Ansible AWS Playbook example
How Ansible works with AWS EC2 – Setup Boto for Ansible

The Ansible and AWS combo is more like Hobbs and Shaw (sorry, I am a Fast & Furious fan): while both of them have their own individual stardom, they join together to create magic.
As we all know, Ansible is pythonic and its modules are written in Python as well. So for the AWS modules to work, you need to have certain prerequisites installed on your Ansible control machine (where you have installed Ansible):

boto
boto3
botocore
Python version >= 2.6
You can install Python easily, but what is this boto? boto is one of the Amazon-supported SDKs for Python, and boto3 is the latest version of boto.

To make sure you have boto installed in your Python, just launch your Python interactive terminal and type import boto and import boto3. If it works fine (shows no error), you are good.

To install boto and boto3 you must have pip as well. If you are running Python 2.7, there is a chance it comes bundled in.
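Assuming pip is available for the interpreter you plan to use with Ansible, the installation itself is a one-liner (use the pip that belongs to that interpreter):

```shell
# Install the AWS SDK libraries for Python.
# Use pip / pip3 matching the interpreter you will point Ansible at.
pip3 install boto boto3 botocore
```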
Sometimes, when you have two versions of Python installed on your system, you have to try them both and find out which one has boto installed by performing the `import boto` command, so that you can avoid import exceptions.
If you take my Mac, for example, I have two Python versions installed: one at /usr/local/bin/python and another at /usr/bin/python. One comes built in with the OS and the other one is installed by `homebrew`. I had to launch them both and check if the boto package is present, as shown below.

You can see that my /usr/local/bin/python3 has all the necessary modules installed.
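Instead of opening an interactive terminal for each interpreter, you can also run a small check non-interactively. This is just a sketch; the only assumption is the list of module names to probe:

```python
import importlib.util

def has_module(name):
    # find_spec returns None when the module cannot be imported,
    # so this reports availability without raising ImportError.
    return importlib.util.find_spec(name) is not None

for mod in ("boto", "boto3", "botocore"):
    print(mod, "installed" if has_module(mod) else "MISSING")
```

Run it with each interpreter (e.g. `/usr/bin/python check_boto.py` and `/usr/local/bin/python3 check_boto.py`) to see which one has the boto libraries.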
Note:
It is known that Python 3 may not work properly with older Ansible versions. In that case, you can choose to install your boto libraries in a Python 2.7.* version and use that as your primary version in ansible_python_interpreter.
This is just a method to find the right Python package to use with Ansible.
The paths could be different for you. Before executing these commands, update the paths based on your local installation.
Now I presume that you have a Python with these necessary modules installed.

Here is the playbook to create EC2 instances and also to get the list of instances in your AWS cloud account.

For security reasons, we made the second block run only when it is explicitly called with --tags. This has been done by using the tag never in the block.
---
- name: Create Ec2 instances
  hosts: localhost
  gather_facts: false
  tasks:
    - name: Get the info of existing EC2 instances
      ec2_instance_info:
      register: ec2info
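The creation block itself is not reproduced above. A sketch of what that second block could look like, appended under the same tasks: list; every ec2 parameter value here (key pair, instance type, AMI, region) is an assumed placeholder, not a value from this article:

```yaml
    - name: Create a new EC2 instance
      ec2:
        key_name: my-keypair            # assumed key pair name
        instance_type: t2.micro         # assumed instance type
        image: ami-0123456789abcdef0    # assumed AMI id
        region: us-east-1               # assumed region
        wait: yes
      tags: ['never', 'ec2-create']
```

The `never` tag keeps the task out of a normal run; it executes only when `--tags ec2-create` (or `--tags never`) is passed explicitly.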
You might wonder: where are my AWS KEY and SECRET mentioned? How would I be able to log in to my AWS account? How will the authentication be done?

I hear ya.
Setup AWS Authentication Before Running the Playbook with Ansible EC2

To keep this article precise, I just assume that you know how to create programmatic access for your AWS account and get your AWS_ACCESS_KEY and AWS_SECRET.

Once you have the keys, the easiest (but unsecured) way is to save them as environment variables like this, and we are all set.
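For example (the key values below are obvious placeholders, not real credentials):

```shell
# Unsecured approach: export the programmatic access keys as
# environment variables; the Ansible AWS modules pick them up automatically.
export AWS_ACCESS_KEY_ID='AKIAEXAMPLEKEYID'
export AWS_SECRET_ACCESS_KEY='exampleSecretAccessKey'
```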
Before executing the playbook, you must be sure which Python interpreter you are going to use, the one which has the `boto` libraries installed.

When it comes to using EC2 modules, it is always better to tell Ansible explicitly which Python interpreter it has to use.

In my case, as I have said before, my /usr/local/bin/python3 has the necessary boto libraries and I have to tell Ansible to use that Python.

To do that, I can use the ansible.cfg file or the Ansible inventory file, but I prefer to do it on the command line as a runtime variable.
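For reference, the ansible.cfg alternative looks like this (`interpreter_python` is available in Ansible 2.8 and later; the path is the one from my machine, so adjust it to yours):

```ini
[defaults]
interpreter_python = /usr/local/bin/python3
```

In an inventory file, the equivalent is a host variable, e.g. `localhost ansible_python_interpreter=/usr/local/bin/python3`.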
ansible-playbook ec2-creation.yml \
--connection=local \
-e "ansible_python_interpreter=/usr/local/bin/python3"
Here,

--connection tells Ansible to run this task locally and not look for any remote server or hosts file.

-e passes extra args or variables; here we set the Python interpreter by assigning its full path to the ansible_python_interpreter variable.
When I run this, I would get only the info of the existing instances; it would not create the instances yet, as the info task is the default block.

Now, to create the EC2 instance, i.e. to run that block, I need to call the tag of that block using --tags as shown below.
ansible-playbook ec2-creation.yml \
--connection=local \
--tags=ec2-create \
-e "ansible_python_interpreter=/usr/local/bin/python3"
To use Ansible Vault to securely store your AWS keys, you need one more file in the same directory where you can save your credentials as variables and encrypt them with the vault.
---
- name: Create Ec2 instances
  hosts: localhost
  # import the secret file
  vars_files:
    - secrets.yml
  gather_facts: false
  tasks:
    - name: Get the info of existing EC2 instances
      ec2_instance_info:
      register: ec2info
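Including secrets.yml only loads the variables; the task still has to consume them. One way to wire them in, sketched here with the aws_access_key/aws_secret_key parameters that the AWS modules accept, and variable names matching secrets.yml:

```yaml
    - name: Get the instance info using the vaulted credentials
      ec2_instance_info:
        aws_access_key: "{{ AWS_ACCESS_KEY_ID }}"
        aws_secret_key: "{{ AWS_SECRET_ACCESS_KEY }}"
      register: ec2info
```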
As we have included the secrets.yml file inside our playbook, now we need to save our AWS KEY and SECRET in it:
➜ cat secrets.yml
AWS_ACCESS_KEY_ID: AKIATQ7Q7SGKYB4TT3DX
AWS_SECRET_ACCESS_KEY: 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
It is saved as clear-text info within the file, so whoever opens it can see what is inside.

So encrypt it.
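The encryption itself is a single ansible-vault command; it prompts for a new vault password and rewrites the file encrypted in place:

```shell
ansible-vault encrypt secrets.yml
```

After this, viewing secrets.yml shows only the $ANSIBLE_VAULT header and ciphertext.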
For Ansible to ask you for the vault password, you should use a startup argument named --ask-vault-pass:
ansible-playbook ec2-creation.yml \
--connection=local \
--tags=ec2-create \
-e "ansible_python_interpreter=/usr/local/bin/python3" \
--ask-vault-pass
I know this is just the first step. Now you have better things to do with that newly created server, like installing and configuring software on it. You might have further questions on how to use these servers and configure them properly in the same playbook.

Please stay connected and look out for my next article. I am already drafting it.
Cheers
Sarav AK
What do you think about this article?
I set up a clean environment (Ubuntu WSL) and created a venv with python3 and boto3.
This is the playbook, in which I changed just the security group, key name, region, instance type and AMI name:
---
- name: Create Ec2 instances
hosts: localhost
# import the secret file
vars_files:
- secrets.yml
gather_facts: false
tasks:
If you do not want the `ec2_instance_info` task to come into the picture, you can just execute the right tag alone.
Hi Marco,
The issue is that your default profile on the local system is being chosen by the first `ec2_instance_info` block.
Please add `region` and `profile` options like this into the `ec2_instance_info` task and retry.
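The example attached to this reply is not reproduced here; a sketch of the suggested change, with example values (`personal` being the profile name discussed below):

```yaml
- name: Get the info of existing EC2 instances
  ec2_instance_info:
    region: eu-west-1        # example region
    profile: personal        # only if you use a named AWS CLI profile
  register: ec2info
```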
In this case, personal means a profile name of the AWS CLI; it is also called a Boto profile. When you manage multiple AWS accounts from your AWS CLI, you would have multiple profiles configured.

Only if you have multiple profiles, use the profile option; otherwise remove it and just use the region.

Next time you want to post code to me, please use a code block or https://gist.github.com/ so that I can check your YAML file indentation as well.

If you still need some help, please let me know; we can connect for a 30 minute Zoom call.

Happy to help
Marco Reale > Sarav AK • 10 months ago
Hi Sarav,
first of all, I really thank you for your help :) Anyway, after some more tests I have finally been able to make it work. Actually it was just my stupid mistake and your playbook worked perfectly from the beginning without any change. Sorry for wasting your time.
I have just one last question:
in my test environment I created a venv that works perfectly (source aws-venv/bin/activate) and now I have the following folder structure with your playbook: