Terraformer tool for reverse-engineering existing cloud infrastructure into Terraform code. Import resources from AWS, Azure, GCP, Kubernetes, and other providers. Generate Terraform configurations from running infrastructure for migration, disaster recovery, and infrastructure documentation.
/plugin marketplace add Lobbi-Docs/claude
/plugin install iac-golden-architect@claude-orchestration
Bundled resources: examples/aws-import.sh, examples/azure-import.sh, examples/gcp-import.sh, references/import-workflow.md, references/providers.md

Comprehensive Terraformer tool expertise for reverse-engineering existing cloud infrastructure into Terraform code. Transform brownfield infrastructure into infrastructure-as-code with automated resource discovery and code generation.
Activate this skill when:
- Importing existing cloud infrastructure into Terraform
- Migrating brownfield environments to infrastructure-as-code
- Building disaster recovery copies of running infrastructure
- Generating Terraform-based documentation of current resources
Terraformer is a CLI tool that generates Terraform configuration files from existing infrastructure. It uses cloud provider APIs to discover resources and automatically creates:
- Terraform resource definitions (.tf files)
- A terraform.tfstate file matching the live resources
- variables.tf and outputs.tf scaffolding
Best for: Brownfield infrastructure, migration projects, infrastructure discovery, documentation generation
Terraformer supports 40+ providers, including AWS, Azure, GCP, Kubernetes, and many other cloud and SaaS platforms.
# macOS
brew install terraformer
# Linux (set PROVIDER to "all" or a specific provider such as aws, google, azure)
export PROVIDER=all
curl -LO "https://github.com/GoogleCloudPlatform/terraformer/releases/download/$(curl -s https://api.github.com/repos/GoogleCloudPlatform/terraformer/releases/latest | grep tag_name | cut -d '"' -f 4)/terraformer-${PROVIDER}-linux-amd64"
chmod +x terraformer-${PROVIDER}-linux-amd64
sudo mv terraformer-${PROVIDER}-linux-amd64 /usr/local/bin/terraformer
# Windows
choco install terraformer
# Verify installation
terraformer version
# Basic import command structure
terraformer import <provider> \
--resources=<resource_types> \
--regions=<regions> \
--filter=<filters> \
--path-pattern=<output_path> \
--compact
# Example: Import AWS VPC resources in us-east-1
terraformer import aws \
--resources=vpc,subnet,security_group \
--regions=us-east-1 \
--compact
# Preview an import without writing any files
terraformer import aws --resources=* --regions=us-east-1 --dry-run
# AWS: List all resource types
terraformer import aws list
# Azure: List resource types
terraformer import azure list
# GCP: List resource types
terraformer import google list
# Import specific resource types
terraformer import aws \
--resources=vpc,subnet,route_table,internet_gateway \
--regions=us-east-1,us-west-2 \
--compact
# Import with tag filtering
terraformer import aws \
--resources=ec2_instance \
--regions=us-east-1 \
--filter="Name=tag:Environment;Value=production" \
--compact
# Import by resource ID
terraformer import aws \
--resources=s3 \
--filter="Name=id;Value=my-bucket-name" \
--compact
# Import all resources (careful - can be large!)
terraformer import aws \
--resources=* \
--regions=us-east-1 \
--compact \
--path-pattern={output}/aws/{region}/{service}
# Azure resource group import
terraformer import azure \
--resources=* \
--resource-group=my-resource-group
# GCP project import
terraformer import google \
--resources=* \
--projects=my-project-id \
--regions=us-central1
# Import from multiple regions
terraformer import aws \
--resources=vpc,ec2_instance,rds \
--regions=us-east-1,us-west-2,eu-west-1 \
--compact \
--path-pattern={output}/aws/{region}
# Import Kubernetes resources
terraformer import kubernetes \
--resources=deployments,services,configmaps,secrets \
--namespace=production
# Import all namespaced resources
terraformer import kubernetes \
--resources=* \
--namespace=default
After running Terraformer, the output directory contains:
generated/
└── aws/
    └── us-east-1/
        ├── vpc/
        │   ├── vpc.tf               # Resource definitions
        │   ├── terraform.tfstate    # Generated state
        │   ├── variables.tf         # Variable definitions
        │   └── outputs.tf           # Output definitions
        ├── ec2_instance/
        │   ├── ec2_instance.tf
        │   ├── terraform.tfstate
        │   └── variables.tf
        └── security_group/
            ├── security_group.tf
            └── terraform.tfstate
# Navigate to output directory
cd generated/aws/us-east-1/vpc
# Review generated Terraform
cat vpc.tf
# Check state file
terraform state list
# Common cleanup tasks:
# - Remove unnecessary tags
# - Extract hardcoded values to variables
# - Consolidate repeated patterns into modules
# - Remove default values
# - Organize files by logical grouping
# - Add meaningful resource names
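For example, extracting a hardcoded value into a variable might look like the following sketch (the variable name and CIDR below are illustrative, not taken from a real import):
# Declare a variable for a hardcoded CIDR found in the generated vpc.tf
cat >> variables.tf <<EOF
variable "vpc_cidr" {
  description = "CIDR block of the imported VPC"
  type        = string
  default     = "10.0.0.0/16"
}
EOF
# Then edit vpc.tf by hand: cidr_block = "10.0.0.0/16"  ->  cidr_block = var.vpc_cidr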
# Initialize Terraform
terraform init
# Validate configuration
terraform validate
# Plan to verify no changes
terraform plan
# Expected: No changes. Infrastructure is up-to-date.
# Option 1: Merge state files
terraform state pull > original.tfstate
# Manually merge or use terraform state mv
# Option 2: Use terraform_remote_state data source
# Reference imported resources from other projects
# Option 3: Import into existing state
terraform import aws_vpc.main vpc-12345678
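A minimal sketch of Option 2, assuming the imported state stays where Terraformer wrote it (the file name, path, and output name below are illustrative):
cat > imported_vpc.tf <<EOF
data "terraform_remote_state" "imported_vpc" {
  backend = "local"
  config = {
    path = "../generated/aws/us-east-1/vpc/terraform.tfstate"
  }
}
EOF
# Reference values via data.terraform_remote_state.imported_vpc.outputs.<output_name>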
# By tag
terraformer import aws \
--resources=ec2_instance \
--filter="Name=tag:Team;Value=platform"
# By name pattern
terraformer import aws \
--resources=s3 \
--filter="Name=id;Value=prod-*"
# Multiple filters
terraformer import aws \
--resources=rds \
--filter="Name=tag:Environment;Value=production" \
--filter="Name=engine;Value=postgres"
# Exclude pattern
terraformer import aws \
--resources=vpc \
--excludes="default-vpc-*"
# Organize by environment and region
terraformer import aws \
--resources=* \
--regions=us-east-1 \
--path-pattern=generated/{provider}/{region}/{environment}
# Organize by service
terraformer import aws \
--resources=vpc,subnet,route_table \
--path-pattern=generated/networking/{service}
# Compact mode merges each service's resources into a single file
terraformer import aws \
--resources=vpc \
--compact # One consolidated .tf file instead of one file per resource
# Generate plan file for review
terraformer plan aws \
--resources=vpc \
--regions=us-east-1
# Review the generated plan.json, then apply it
terraformer import plan <path-to-plan.json>
# 1. Import infrastructure locally
terraformer import aws --resources=vpc --regions=us-east-1
# 2. Configure remote backend
cat > backend.tf <<EOF
terraform {
backend "s3" {
bucket = "my-terraform-state"
key = "imported/vpc/terraform.tfstate"
region = "us-east-1"
encrypt = true
dynamodb_table = "terraform-locks"
}
}
EOF
# 3. Initialize with backend
terraform init
# 4. Push state to remote
terraform state push terraform.tfstate
# 1. Import resources
terraformer import aws --resources=vpc,subnet --compact
# 2. Refactor into module structure
mkdir -p modules/vpc
mv generated/aws/us-east-1/vpc/*.tf modules/vpc/
# 3. Create module interface
# Edit modules/vpc/variables.tf, outputs.tf, main.tf
# 4. Use module in root config
cat > main.tf <<EOF
module "vpc" {
source = "./modules/vpc"
cidr_block = "10.0.0.0/16"
name = "production-vpc"
}
EOF
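The module call above assumes modules/vpc exposes cidr_block and name variables; a minimal sketch of that interface (adjust to whatever the generated code actually uses):
cat > modules/vpc/variables.tf <<EOF
variable "cidr_block" {
  description = "CIDR block for the VPC"
  type        = string
}

variable "name" {
  description = "Name tag applied to the VPC"
  type        = string
}
EOF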
# Import production infrastructure
terraformer import aws \
--resources=vpc,subnet,ec2_instance,rds,s3 \
--regions=us-east-1 \
--filter="Name=tag:Environment;Value=production"
# Modify for DR region
# - Change region variables
# - Update CIDR blocks if needed
# - Adjust instance sizes for cost
# Deploy to DR region
terraform apply -var="region=us-west-2"
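One way to keep the DR overrides together is a dedicated tfvars file; a hedged sketch with hypothetical variable names:
cat > dr.tfvars <<EOF
region        = "us-west-2"
vpc_cidr      = "10.1.0.0/16"
instance_type = "t3.small"
EOF
terraform apply -var-file=dr.tfvars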
# Import from Account A
export AWS_PROFILE=account-a
terraformer import aws \
--resources=* \
--regions=us-east-1 \
--path-pattern=generated/account-a/{service}
# Import from Account B
export AWS_PROFILE=account-b
terraformer import aws \
--resources=* \
--regions=us-east-1 \
--path-pattern=generated/account-b/{service}
# Consolidate and standardize
# Create unified modules from both accounts
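A common consolidation pattern is one root configuration that instantiates a shared module per account through provider aliases; a minimal sketch (profile names and module path are assumptions):
cat > accounts.tf <<EOF
provider "aws" {
  alias   = "account_a"
  profile = "account-a"
  region  = "us-east-1"
}

provider "aws" {
  alias   = "account_b"
  profile = "account-b"
  region  = "us-east-1"
}

module "vpc_account_a" {
  source    = "./modules/vpc"
  providers = { aws = aws.account_a }
}

module "vpc_account_b" {
  source    = "./modules/vpc"
  providers = { aws = aws.account_b }
}
EOF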
# Import all infrastructure
terraformer import aws --resources=* --regions=us-east-1
# Generate documentation
terraform-docs markdown . > INFRASTRUCTURE.md
# Create architecture diagrams
terraform graph | dot -Tpng > architecture.png
# Phase 1: Import existing resources
terraformer import aws --resources=vpc,subnet,route_table
# Phase 2: Validate no drift
terraform plan # Should show no changes
# Phase 3: Make infrastructure changes via Terraform
# Edit .tf files, run terraform apply
# Phase 4: Decommission manual processes
# Update runbooks, disable console access
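To keep Phase 2 honest after cutover, a drift check can run on a schedule; a minimal sketch using terraform plan's -detailed-exitcode flag (exit code 2 means pending changes):
terraform plan -detailed-exitcode -input=false > /dev/null
status=$?
if [ "$status" -eq 2 ]; then
  echo "Drift detected: live infrastructure differs from Terraform code"
  exit 1
elif [ "$status" -ne 0 ]; then
  echo "terraform plan failed"
  exit "$status"
fi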
Issue: Authentication or permission errors during import. Solution: Verify cloud provider credentials.
# AWS
aws sts get-caller-identity
export AWS_PROFILE=my-profile
# Azure
az account show
az login
# GCP
gcloud auth list
gcloud config set project my-project
Issue: The import scope is too broad and takes too long. Solution: Use filters to limit scope.
# Instead of importing all resources
terraformer import aws --resources=* # DON'T DO THIS
# Import specific services
terraformer import aws --resources=vpc,ec2_instance,rds
Issue: terraform plan shows unexpected changes immediately after import. Solution: Review the generated code for default values and formatting differences.
# Common causes:
# - Default tags added by provider
# - Computed values not captured
# - Different attribute formatting
# Fix by:
# 1. Adding lifecycle ignore_changes
# 2. Removing default values
# 3. Adjusting attribute formatting
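For the first fix, one low-touch option is a Terraform override file, which merges into the matching resource without editing the generated vpc.tf; a hedged sketch (the resource address and ignored attribute are illustrative, so match them to your plan output):
cat > vpc_override.tf <<EOF
resource "aws_vpc" "tfer--vpc-12345678" {
  lifecycle {
    ignore_changes = [tags]
  }
}
EOF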
Issue: The state file becomes too large to manage. Solution: Split into smaller state files.
# Import with path pattern
terraformer import aws \
--resources=* \
--path-pattern=generated/{service}
# Each service gets its own state file
Issue: Imported resources reference dependencies that were not imported. Solution: Import dependent resources together.
# Import VPC and all related resources
terraformer import aws \
--resources=vpc,subnet,route_table,internet_gateway,nat_gateway,security_group \
--regions=us-east-1
Issue: Generated resource names are machine-generated and unreadable. Solution: Refactor names after import.
# Before: aws_instance.tfer--i-0123456789abcdef0
# After: aws_instance.web_server_1
# Use terraform state mv to rename
terraform state mv \
'aws_instance.tfer--i-0123456789abcdef0' \
'aws_instance.web_server_1'
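To rename many imported resources at once, a bash sketch that strips the tfer-- prefix (review the new names first, and update the matching resource blocks in the .tf files so code and state stay aligned):
terraform state list | grep 'tfer--' | while read -r addr; do
  new_addr="${addr//tfer--/}"
  echo "Renaming $addr -> $new_addr"
  terraform state mv "$addr" "$new_addr"
done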
Provider-specific flags:
- Azure: --resource-group flag for scoped imports
- GCP: --projects flag for project selection
- Kubernetes: --namespace flag for namespace-scoped imports

# Use parallelism (experimental)
terraformer import aws \
--resources=ec2_instance \
--regions=us-east-1,us-west-2 \
--parallel=4
# Import only recent resources
terraformer import aws \
--resources=ec2_instance \
--filter="Name=launch-time;Value=2024-01-01"
# Use compact mode (fewer, consolidated output files)
terraformer import aws \
--resources=* \
--compact
# Use excludes to skip unwanted resources
terraformer import aws \
--resources=* \
--excludes="default-*,terraform-*"
# Split by service
terraformer import aws \
--resources=vpc \
--path-pattern=generated/{service}
Bundled references:
- references/providers.md - Provider-specific import patterns and examples
- references/import-workflow.md - Step-by-step import process and best practices
- references/filters.md - Advanced filtering techniques and patterns
- references/post-import.md - Cleanup, refactoring, and optimization guide

Example scripts:
- examples/aws-import.sh - AWS infrastructure import script
- examples/azure-import.sh - Azure resource import script
- examples/gcp-import.sh - GCP project import script
- examples/kubernetes-import.sh - Kubernetes cluster import script
- examples/multi-cloud-import.sh - Multi-cloud consolidated import

Related commands:
- /iac:import: Guided terraformer import workflow
- /iac:validate: Validate imported Terraform code
- /iac:refactor: Refactor imported code into modules

# List available providers
terraformer --help
# List resources for a provider
terraformer import aws list
# Dry run (preview)
terraformer import aws --resources=vpc --dry-run
# Basic import
terraformer import aws --resources=vpc --regions=us-east-1
# Filtered import
terraformer import aws --resources=vpc --filter="Name=tag:Env;Value=prod"
# Multi-region import
terraformer import aws --resources=vpc --regions=us-east-1,us-west-2
# Import all resources
terraformer import aws --resources=* --regions=us-east-1 --compact
# By tag
--filter="Name=tag:Environment;Value=production"
# By resource ID
--filter="Name=id;Value=vpc-12345678"
# By name pattern
--filter="Name=name;Value=prod-*"
# Multiple filters (AND logic)
--filter="Name=tag:Team;Value=platform" --filter="Name=tag:Env;Value=prod"
# By service
--path-pattern=generated/{service}
# By region
--path-pattern=generated/{region}/{service}
# By environment
--path-pattern=generated/{environment}/{service}
# Custom hierarchy
--path-pattern=infrastructure/{provider}/{region}/{environment}/{service}