Terraform CLI Cheat Sheet

About Terraform CLI

Terraform, a tool created by HashiCorp in 2014 and written in Go, aims to build, change and version control your infrastructure. This tool has a powerful and very intuitive Command Line Interface.

Installation

Install through curl:

$ curl -O https://releases.hashicorp.com/terraform/0.15.1/terraform_0.15.1_darwin_amd64.zip
$ sudo unzip terraform_0.15.1_darwin_amd64.zip -d /usr/local/bin/
$ rm terraform_0.15.1_darwin_amd64.zip

OR install through tfenv, a Terraform version manager.

First of all, download the tfenv binary and put it in your PATH:

$ git clone https://github.com/Zordrak/tfenv.git ~/.tfenv
$ echo 'export PATH="$HOME/.tfenv/bin:$PATH"' >> $HOME/.bashrc

Then, you can install the desired version of Terraform:

$ tfenv install 0.15.1

Usage

Show version

$ terraform version
Terraform v0.15.1

Init Terraform

$ terraform init

It's the first command you need to execute; without it, terraform plan, apply, destroy and import will not work. The terraform init command will install:
- terraform modules
- eventually a backend
- and provider(s) plugins

Plan

The plan step checks the configuration to execute and writes a plan to apply to the target infrastructure provider.

$ terraform plan -out plan.out

It's an important feature of Terraform that allows a user to see which actions Terraform will perform prior to making any changes, increasing confidence that a change will have the desired effect once applied. When you execute the terraform plan command, Terraform scans all *.tf files in your directory and creates the plan.

Apply

Now you have the desired state, so you can execute the plan:

$ terraform apply plan.out

Good to know: since Terraform v0.11+, in an interactive mode (non CI/CD/autonomous pipeline), you can just execute the terraform apply command, which will print out which actions Terraform will perform. By generating the plan and applying it in the same command, Terraform can guarantee that the execution plan won't change, without needing to write it to disk. This reduces the risk of potentially sensitive data being left behind, or accidentally checked into version control.

$ terraform apply
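In a CI/CD pipeline, the plan-then-apply flow is usually scripted. A minimal sketch: the -detailed-exitcode flag makes terraform plan exit with 2 when changes are pending; the TF variable and the function name are illustrative assumptions, not part of the sheet:

```shell
# Hedged sketch: apply only when the plan reports pending changes.
# TF lets you point at a specific terraform binary (assumption).
TF=${TF:-terraform}

plan_and_apply() {
  status=0
  "$TF" plan -detailed-exitcode -out plan.out || status=$?
  case $status in
    0) echo "no changes" ;;     # exit 0: infrastructure already up to date
    2) "$TF" apply plan.out ;;  # exit 2: changes pending, apply the saved plan
    *) echo "plan failed" >&2; return 1 ;;  # exit 1: plan error
  esac
}
```

Because apply consumes the saved plan.out, the actions executed are exactly the ones the plan reported.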
Init Terraform and don't ask any input

$ terraform init -input=false

Init Terraform with a backend configuration

$ terraform init -backend-config=cfg/s3.dev.tf -reconfigure

-reconfigure is used to tell Terraform not to copy the existing state to the new remote state location.

Apply and auto approve

$ terraform apply -auto-approve
$ terraform apply -auto-approve -var tags-repository_url=${GIT_URL}

Apply only one module

$ terraform apply -target=module.s3

This -target option works with terraform plan too. This command is useful when you have defined some modules.

Destroy

$ terraform destroy

Delete all the resources! A deletion plan can be created beforehand:

$ terraform plan -destroy

The -target option allows you to destroy only one resource, for example an S3 bucket:

$ terraform destroy -target aws_s3_bucket.my_bucket

Get

Modules are vendored, so when you edit them, you need to fetch the modules' content again:

$ terraform get -update=true

When you use modules, the first thing you'll have to do is a terraform get. This pulls modules into the .terraform directory. Once you do that, unless you do another terraform get -update=true, you've essentially vendored those modules.

Debug

The terraform console command is useful for testing interpolations before using them in configurations. Terraform console will read the configured state even if it is remote.

$ echo "aws_iam_user.notif.arn" | terraform console
arn:aws:iam::123456789:user/notif

Graph

$ terraform graph | dot -Tpng > graph.png

Visual dependency graph of Terraform resources.

Validate

The validate command is used to validate/check the syntax of the Terraform files. A syntax check is done on all the Terraform files in the directory, and an error is displayed if any of the files doesn't validate. The syntax check does not cover every common syntax issue.

$ terraform validate

Providers

You can use a lot of providers/plugins in your Terraform definition and resources, so it can be useful to have a tree of the providers used by the modules in your project:

$ terraform providers
├── provider.aws ~> 1.24.0
├── module.my_module
│   ├── provider.aws (inherited)
│   ├── provider.null
│   └── provider.template
└── module.elastic
    └── provider.aws (inherited)
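When a project carries several vendored modules, terraform validate can be looped over each of them. A minimal sketch, assuming a ./modules/*/ layout; the directory convention, the TF variable and the function name are assumptions, not something the sheet prescribes:

```shell
# Hedged sketch: run `terraform validate` in the root module and in each
# subdirectory under ./modules/ (layout and names are assumptions).
TF=${TF:-terraform}

validate_all() {
  for dir in . ./modules/*/; do
    # subshell so the cd does not leak into the caller
    ( cd "$dir" && "$TF" validate ) || return 1
  done
}
```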
State

Pull the remote state into a local copy:

$ terraform state pull > terraform.tfstate

Push the state to the remote backend storage:

$ terraform state push

This command is useful if, for example, you originally used a local tf state and then defined a backend storage, in S3 or Consul…

How to tell Terraform you moved a resource into a module?

If you moved an existing resource into a module, you need to update the state:

$ terraform state mv aws_iam_role.role1 module.mymodule

How to import an existing resource into Terraform?

If you have an existing resource in your infrastructure provider, you can import it into your Terraform state:

$ terraform import aws_iam_policy.elastic_post arn:aws:iam::123456789:policy/elastic_post
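Moving several resources into the same module means one terraform state mv per resource, so a small loop helps. A sketch where the TF variable, the function name and the example addresses are illustrative assumptions:

```shell
# Hedged sketch: move each given resource address under a target module,
# e.g. aws_iam_role.role1 -> module.mymodule.aws_iam_role.role1.
TF=${TF:-terraform}

move_into_module() {
  mod="$1"; shift
  for addr in "$@"; do
    "$TF" state mv "$addr" "$mod.$addr" || return 1
  done
}

# Example (illustrative names):
# move_into_module module.mymodule aws_iam_role.role1 aws_iam_policy.policy1
```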
Workspaces

To manage multiple distinct sets of infrastructure resources/environments. Instead of creating a directory for each environment to manage, we just need to create the needed workspaces and use them.

Create workspace

This command creates a new workspace and then selects it:

$ terraform workspace new dev

Select a workspace

$ terraform workspace select dev

Show current workspace

$ terraform workspace show
dev

List workspaces

$ terraform workspace list
  default
* dev
  prod

Tools

jq

jq is a lightweight command-line JSON processor. Combined with terraform output it can be powerful.

Installation

For Linux:

$ sudo apt-get install jq
or
$ yum install jq

For OS X:

$ brew install jq

Usage

For example, we defined outputs in a module, and when we execute terraform apply the outputs are displayed:

$ terraform apply
...
Apply complete! Resources: 0 added, 0 changed, 0 destroyed.

Outputs:

elastic_endpoint = vpctoto12fgfd4d5f4ds5fngetwe4.eu-central-1.es.amazonaws.com

We can extract the value that we want in order to use it in a script, for example. With jq it's easy:

$ terraform output -json
{
  "elastic_endpoint": {
    "sensitive": false,
    "type": "string",
    "value": "vpctoto12fgfd4d5f4ds5fngetwe4.eu-central-1.es.amazonaws.com"
  }
}

$ terraform output -json | jq '.elastic_endpoint.value'
"vpctoto12fgfd4d5f4ds5fngetwe4.eu-central-1.es.amazonaws.com"

Terraforming

If you have an existing AWS account, for example with existing components like S3 buckets, SNS, VPC…, you can use the terraforming tool, a tool written in Ruby, which extracts existing AWS resources and converts them to Terraform files!

Installation

$ sudo apt install ruby
or
$ sudo yum install ruby

and

$ gem install terraforming

Usage

Pre-requisites: like for Terraform, you need to set AWS credentials:

$ export AWS_ACCESS_KEY_ID="an_aws_access_key"
$ export AWS_SECRET_ACCESS_KEY="a_aws_secret_key"
$ export AWS_DEFAULT_REGION="eu-central-1"

You can also specify a credential profile in ~/.aws/credentials and use it with the --profile option:

$ cat ~/.aws/credentials
[aurelie]
aws_access_key_id = xxx
aws_secret_access_key = xxx
aws_default_region = eu-central-1

$ terraforming s3 --profile aurelie

$ terraforming --help
Commands:
  terraforming alb  # ALB
  ...
  terraforming vgw  # VPN Gateway
  terraforming vpc  # VPC

Example:

$ terraforming s3 > aws_s3.tf

Remarks: as you can see, terraforming can't extract API Gateway resources for the moment, so you need to write them manually.

gcloud bulk-export in terraform format

Export native Google Cloud resources in Terraform format:

$ gcloud beta resource-config bulk-export --path . --resource-format=terraform

Resource types supported:

$ gcloud beta resource-config list-resource-types

Authors:

@aurelievache
Cloud Engineer & DevRel at Stack Labs

v1.0.3