Generate Terraform infrastructure as code: reads design documents and implements them as Terraform configurations, including resources, variables, outputs, and provider configurations. Creates modular, maintainable Terraform code that follows best practices for resource naming, tagging, and organization.
Translates infrastructure design documents into production-ready Terraform code with variables, outputs, and provider configurations. Automatically triggered when you reference design files, FABER specs, or provide direct infrastructure instructions.
/plugin marketplace add fractary/claude-plugins
/plugin install fractary-faber-cloud@fractary

This skill inherits all available tools. When active, it can use any tool Claude has access to.
scripts/load-context.sh
scripts/parse-input.sh
scripts/test/README.md
scripts/test/test-load-context.sh
scripts/test/test-parse-input.sh
scripts/test/test-validate-terraform.sh
scripts/validate-terraform.sh
workflow/generate-terraform.md
workflow/load-context.md
workflow/parse-input.md
workflow/validate-code.md

<CRITICAL_RULES>
IMPORTANT: Terraform Best Practices
IMPORTANT: Code Quality
</CRITICAL_RULES>
The skill parses the input to determine its type (design file, FABER spec, direct instructions, or latest design; see <INPUT_PARSING_LOGIC> below), along with any additional parameters the invocation supplies.
<WORKFLOW>
EXECUTE STEPS:
This workflow uses the 3-layer architecture with deterministic operations in shell scripts.
Workflow documentation files (in workflow/ directory) provide detailed implementation guidance:
- workflow/parse-input.md - Documents input parsing patterns and security
- workflow/load-context.md - Documents context loading and requirements extraction
- workflow/generate-terraform.md - Documents Terraform generation patterns and templates
- workflow/validate-code.md - Documents validation procedures

Actual execution uses shell scripts via the Bash tool:
1. Parse Input (via parse-input.sh script)

   ./scripts/parse-input.sh "$INSTRUCTIONS"

2. Load Context (via load-context.sh script)

   ./scripts/load-context.sh "$PARSE_RESULT"

3. Generate Terraform Code (LLM-based - stays in context)

   See workflow/generate-terraform.md for detailed patterns.

4. Validate Implementation (via validate-terraform.sh script - ALWAYS)

   ./scripts/validate-terraform.sh "./infrastructure/terraform"

OUTPUT COMPLETION MESSAGE:
✅ COMPLETED: Infrastructure Engineer
Source: {source description}
Terraform Files Created:
- {terraform_directory}/main.tf
- {terraform_directory}/variables.tf
- {terraform_directory}/outputs.tf
Resources Implemented: {count}
Validation: ✅ Passed
Next Steps:
- Test: /fractary-faber-cloud:test
- Preview: /fractary-faber-cloud:deploy-plan
───────────────────────────────────────
IF FAILURE:
❌ FAILED: Infrastructure Engineer
Step: {failed step}
Error: {error message}
Resolution: {how to fix}
───────────────────────────────────────
ARCHITECTURE NOTE: This workflow follows the 3-layer architecture, which reduces context usage by ~55-60% by keeping deterministic operations in shell scripts.
</WORKFLOW>
<COMPLETION_CRITERIA> This skill is complete and successful only when ALL of the following are verified:
✅ 1. Input Parsing
✅ 2. Context Loading
✅ 3. Code Generation
✅ 4. Code Quality
✅ 5. File Organization
FAILURE CONDITIONS - Stop and report if:
❌ Input Parsing Failures:
❌ Context Loading Failures:
❌ Code Generation Failures:
❌ Validation Failures:
PARTIAL COMPLETION - Not acceptable:
⚠️ Code generated but not validated → Validate before returning (MANDATORY)
⚠️ Files created but not formatted → Run terraform fmt before returning (MANDATORY)
⚠️ Security issues found but ignored → Must address or fail
⚠️ Empty files created → Must contain valid content
Error: "Path outside allowed directory"
Action: Reject immediately, log security event, return error
User Action: Use valid path within allowed directories
Error: "Design file not found: /path/to/file.md"
Action: Return error with correct path format
User Action: Check filename spelling and location
Error: "Source file is empty: /path/to/file.md"
Action: Return error, suggest checking file content
User Action: Ensure file has content
Error: "Cannot extract requirements from source"
Action: Return error with file details
User Action: Check file format and content validity
Error: "Multiple files match pattern: file1.md, file2.md"
Action: Return error listing matches
User Action: Specify exact filename or path
Error: "Terraform validation failed: [specific error]"
Action: Show exact terraform error, return failure
User Action: Review error, fix requirements, retry
</COMPLETION_CRITERIA>
<OUTPUTS>
1. Terraform Files
Return to agent:
{
  "status": "success",
  "terraform_directory": "./infrastructure/terraform",
  "files_created": [
    "main.tf",
    "variables.tf",
    "outputs.tf"
  ],
  "resource_count": 5,
  "resources": [
    {"type": "aws_s3_bucket", "name": "uploads"},
    {"type": "aws_lambda_function", "name": "processor"}
  ]
}
</OUTPUTS>
<TERRAFORM_PATTERNS>
Resource Naming:
# Use variables for dynamic names
resource "aws_s3_bucket" "uploads" {
  bucket = "${var.project_name}-${var.subsystem}-${var.environment}-uploads"
  tags   = local.common_tags
}
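Since the project-subsystem-environment prefix recurs in every resource name, it can be computed once in locals. A minimal sketch (this name_prefix local is a suggested convenience, not part of the documented patterns):

# Hypothetical convenience local to avoid repeating the interpolation
locals {
  name_prefix = "${var.project_name}-${var.subsystem}-${var.environment}"
}

resource "aws_s3_bucket" "uploads" {
  bucket = "${local.name_prefix}-uploads" # equivalent to the expanded form above
  tags   = local.common_tags
}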
Standard Variables:
variable "project_name" {
description = "Project name"
type = string
}
variable "subsystem" {
description = "Subsystem name"
type = string
}
variable "environment" {
description = "Environment (test/prod)"
type = string
}
variable "aws_region" {
description = "AWS region"
type = string
default = "us-east-1"
}
Standard Tags:
locals {
  common_tags = {
    Project     = var.project_name
    Subsystem   = var.subsystem
    Environment = var.environment
    ManagedBy   = "terraform"
    CreatedBy   = "fractary-faber-cloud"
  }
}
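When a resource needs extra tags beyond the shared set, merge() keeps the common tags intact while layering on resource-specific values. A small sketch (the Name tag here is illustrative):

# Resource-specific tags layered over the shared set
tags = merge(local.common_tags, {
  Name = "${var.project_name}-${var.subsystem}-${var.environment}-uploads"
})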
Outputs:
output "bucket_name" {
description = "Name of the S3 bucket"
value = aws_s3_bucket.uploads.id
}
output "bucket_arn" {
description = "ARN of the S3 bucket"
value = aws_s3_bucket.uploads.arn
}
</TERRAFORM_PATTERNS>
<RESOURCE_TEMPLATES>
S3 Bucket with Versioning:
resource "aws_s3_bucket" "this" {
bucket = "${var.project_name}-${var.subsystem}-${var.environment}-${var.bucket_suffix}"
tags = local.common_tags
}
resource "aws_s3_bucket_versioning" "this" {
bucket = aws_s3_bucket.this.id
versioning_configuration {
status = "Enabled"
}
}
resource "aws_s3_bucket_server_side_encryption_configuration" "this" {
bucket = aws_s3_bucket.this.id
rule {
apply_server_side_encryption_by_default {
sse_algorithm = "AES256"
}
}
}
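A public access block is commonly added alongside these bucket resources to enforce a no-public-access posture. A minimal sketch, not part of the template above:

# Deny all forms of public access to the bucket
resource "aws_s3_bucket_public_access_block" "this" {
  bucket                  = aws_s3_bucket.this.id
  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true
}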
Lambda Function:
resource "aws_lambda_function" "this" {
function_name = "${var.project_name}-${var.subsystem}-${var.environment}-${var.function_name}"
role = aws_iam_role.lambda.arn
runtime = var.runtime
handler = var.handler
filename = var.deployment_package
source_code_hash = filebase64sha256(var.deployment_package)
environment {
variables = var.environment_variables
}
tags = local.common_tags
}
resource "aws_iam_role" "lambda" {
name = "${var.project_name}-${var.subsystem}-${var.environment}-lambda-role"
assume_role_policy = jsonencode({
Version = "2012-10-17"
Statement = [{
Action = "sts:AssumeRole"
Effect = "Allow"
Principal = {
Service = "lambda.amazonaws.com"
}
}]
})
tags = local.common_tags
}
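The role above grants no permissions by itself; for CloudWatch Logs access, the AWS-managed basic execution policy is typically attached. A sketch, assuming no custom inline policy is required:

# AWS-managed policy granting CloudWatch Logs write access
resource "aws_iam_role_policy_attachment" "lambda_logs" {
  role       = aws_iam_role.lambda.name
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}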
DynamoDB Table:
resource "aws_dynamodb_table" "this" {
name = "${var.project_name}-${var.subsystem}-${var.environment}-${var.table_name}"
billing_mode = var.billing_mode
hash_key = var.hash_key
range_key = var.range_key
attribute {
name = var.hash_key
type = "S"
}
dynamic "attribute" {
for_each = var.range_key != null ? [1] : []
content {
name = var.range_key
type = "S"
}
}
server_side_encryption {
enabled = true
}
point_in_time_recovery {
enabled = true
}
tags = local.common_tags
}
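If a design calls for additional access patterns, a global secondary index can be added using the same dynamic-block style as range_key above. A hypothetical sketch (var.gsi_hash_key is an assumed variable, not part of the template):

# Optional GSI, created only when an index key is supplied
dynamic "global_secondary_index" {
  for_each = var.gsi_hash_key != null ? [1] : []
  content {
    name            = "${var.gsi_hash_key}-index"
    hash_key        = var.gsi_hash_key
    projection_type = "ALL"
  }
}

Note that the GSI key also needs its own attribute block, and with PROVISIONED billing the index requires read and write capacity settings.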
API Gateway REST API:
resource "aws_api_gateway_rest_api" "this" {
name = "${var.project_name}-${var.subsystem}-${var.environment}-api"
description = var.api_description
endpoint_configuration {
types = ["REGIONAL"]
}
tags = local.common_tags
}
resource "aws_api_gateway_deployment" "this" {
rest_api_id = aws_api_gateway_rest_api.this.id
stage_name = var.environment
depends_on = [
aws_api_gateway_integration.this
]
}
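The depends_on above references an aws_api_gateway_integration that the template does not show. A minimal sketch of that missing wiring, assuming a Lambda proxy integration (the path_part and Lambda reference are illustrative):

# API resource under the root path
resource "aws_api_gateway_resource" "this" {
  rest_api_id = aws_api_gateway_rest_api.this.id
  parent_id   = aws_api_gateway_rest_api.this.root_resource_id
  path_part   = "items"
}

# Method accepting any HTTP verb
resource "aws_api_gateway_method" "this" {
  rest_api_id   = aws_api_gateway_rest_api.this.id
  resource_id   = aws_api_gateway_resource.this.id
  http_method   = "ANY"
  authorization = "NONE"
}

# Lambda proxy integration (Lambda invocations are always POST)
resource "aws_api_gateway_integration" "this" {
  rest_api_id             = aws_api_gateway_rest_api.this.id
  resource_id             = aws_api_gateway_resource.this.id
  http_method             = aws_api_gateway_method.this.http_method
  integration_http_method = "POST"
  type                    = "AWS_PROXY"
  uri                     = aws_lambda_function.this.invoke_arn
}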
</RESOURCE_TEMPLATES>
<FILE_STRUCTURE>
main.tf:
# Provider configuration
terraform {
  required_version = ">= 1.5.0"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }

  backend "s3" {
    # Backend config provided via init
  }
}

provider "aws" {
  region = var.aws_region

  default_tags {
    tags = local.common_tags
  }
}

# Local values
locals {
  common_tags = {
    Project     = var.project_name
    Subsystem   = var.subsystem
    Environment = var.environment
    ManagedBy   = "terraform"
    CreatedBy   = "fractary-faber-cloud"
  }
}

# Resources
resource "aws_s3_bucket" "uploads" {
  # ... resource configuration
}

# ... more resources
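Because the backend "s3" block is left empty, the concrete settings arrive at init time (for example, terraform init -backend-config=backend.hcl). A hypothetical backend.hcl; all values are placeholders:

# backend.hcl - hypothetical per-environment values supplied at init
bucket = "myproject-terraform-state"
key    = "core/test/terraform.tfstate"
region = "us-east-1"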
variables.tf:
# Core variables
variable "project_name" {
  description = "Project name"
  type        = string
}

variable "subsystem" {
  description = "Subsystem name"
  type        = string
}

variable "environment" {
  description = "Environment (test/prod)"
  type        = string

  validation {
    condition     = contains(["test", "prod"], var.environment)
    error_message = "Environment must be test or prod."
  }
}

variable "aws_region" {
  description = "AWS region"
  type        = string
  default     = "us-east-1"
}

# Resource-specific variables
# ... add as needed
outputs.tf:
output "bucket_name" {
description = "Name of the S3 bucket"
value = aws_s3_bucket.uploads.id
}
output "bucket_arn" {
description = "ARN of the S3 bucket"
value = aws_s3_bucket.uploads.arn
}
# ... more outputs
test.tfvars:
project_name = "myproject"
subsystem = "core"
environment = "test"
aws_region = "us-east-1"
# Resource-specific values
# ...
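A prod counterpart would follow the same shape, differing only in the environment value accepted by the variables.tf validation. A sketch with placeholder values:

# prod.tfvars (hypothetical)
project_name = "myproject"
subsystem    = "core"
environment  = "prod"
aws_region   = "us-east-1"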
</FILE_STRUCTURE>
<INPUT_PARSING_LOGIC>
Determining Input Type:
Check for file paths - Contains .md extension or starts with path separators:
"user-uploads.md" ā design file
".fractary/plugins/faber-cloud/designs/api-backend.md" ā design file
".faber/specs/123-add-feature.md" ā FABER spec
Check for design directory reference - Mentions design directory:
"Implement design from user-uploads.md" ā extract: user-uploads.md
"Use the design in api-backend.md" ā extract: api-backend.md
Check for spec directory reference - Mentions .faber/specs:
"Implement infrastructure for .faber/specs/123-add-api.md" ā extract spec path
Direct instructions - Doesn't match above patterns:
"Fix IAM permissions - Lambda needs s3:PutObject"
"Add CloudWatch alarms for all Lambda functions"
No input - Empty or null:
"" ā Find latest design in .fractary/plugins/faber-cloud/designs/
File Path Resolution:
- Design files resolve to: .fractary/plugins/faber-cloud/designs/{filename}
- FABER specs resolve under: .faber/
</INPUT_PARSING_LOGIC>
<EXAMPLES>
<example>
Input: "user-uploads.md"
Parse Result:
- Type: design_file
- Path: .fractary/plugins/faber-cloud/designs/user-uploads.md
Process:
1. Read design document
2. Extract S3 bucket and Lambda processor requirements
3. Generate main.tf with:
   - S3 bucket resource with versioning and encryption
   - Lambda function
   - IAM role for Lambda
   - S3 event notification to trigger Lambda (see the notification sketch after these examples)
4. Generate variables.tf with standard variables
5. Generate outputs.tf with bucket name, ARN, Lambda ARN
6. Run terraform fmt
7. Run terraform validate
Output: Complete validated Terraform configuration in ./infrastructure/terraform/
</example>
<example>
Input: ".faber/specs/123-add-api-backend.md"
Parse Result:
- Type: faber_spec
- Path: .faber/specs/123-add-api-backend.md
Process:
1. Read FABER spec
2. Extract infrastructure requirements from software spec
3. Determine needed AWS resources (API Gateway, Lambda, DynamoDB)
4. Generate main.tf with:
   - API Gateway REST API
   - Lambda functions for endpoints
   - DynamoDB table for data storage
   - IAM roles and policies
5. Generate variables.tf and outputs.tf
6. Run terraform fmt and validate
Output: Complete validated Terraform configuration
</example>
<example>
Input: "Fix IAM permissions - Lambda needs s3:PutObject on uploads bucket"
Parse Result:
- Type: direct_instructions
- Instructions: "Fix IAM permissions - Lambda needs s3:PutObject on uploads bucket"
Process:
1. Read existing Terraform code
2. Locate Lambda IAM role
3. Add s3:PutObject permission for uploads bucket
4. Update IAM policy document
5. Run terraform fmt and validate
Output: Updated Terraform with corrected IAM permissions
</example>
<example>
Input: "" (no input)
Parse Result:
- Type: latest_design
- Path: (auto-detected from designs directory)
Process:
1. List files in .fractary/plugins/faber-cloud/designs/
2. Find most recently modified .md file
3. Read and implement that design
4. Generate Terraform code
5. Validate
Output: Complete validated Terraform configuration
</example>
</EXAMPLES>
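The first example mentions an S3 event notification that triggers the Lambda function. A minimal sketch of that wiring, assuming the bucket and function resources from the templates above (resource names are illustrative):

# Allow S3 to invoke the function
resource "aws_lambda_permission" "allow_s3" {
  statement_id  = "AllowExecutionFromS3"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.this.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.this.arn
}

# Fire the function on object creation
resource "aws_s3_bucket_notification" "this" {
  bucket = aws_s3_bucket.this.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.this.arn
    events              = ["s3:ObjectCreated:*"]
  }

  # The permission must exist before S3 can register the notification
  depends_on = [aws_lambda_permission.allow_s3]
}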