AWS hosting handler - centralizes AWS operations including authentication, resource deployment, verification, and querying. Provides a standard interface for AWS-specific logic used by all infrastructure skills whenever they need to interact with AWS resources: AWS CLI authentication, profile management, resource deployment validation, and AWS Console URL generation.
Install:
/plugin marketplace add fractary/claude-plugins
/plugin install fractary-faber-cloud@fractary

This skill inherits all available tools. When active, it can use any tool Claude has access to.
Workflow references:
workflow/authenticate.md
workflow/cloudwatch-operations.md

<CRITICAL_RULES>
IMPORTANT: AWS Profile Separation
Mutating operations must run under the target environment's deploy profile (e.g. test-deploy), never under a discovery profile, and test and prod profiles must never be mixed.
IMPORTANT: Environment Validation
Validate the target environment (test or prod) before executing any operation; reject unknown environments with an error.
LOAD CONFIGURATION:
# Source configuration loader
source "$(dirname "${BASH_SOURCE[0]}")/../devops-common/scripts/config-loader.sh"
# Load configuration for environment
load_config "${environment}"
# Validate profile separation
validate_profile_separation "${operation_type}" "${environment}"
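The profile-separation check above can be sketched as follows. This is a hypothetical illustration of what `validate_profile_separation` might do; the actual contents of config-loader.sh and the profile naming scheme (`{project}-{environment}-deploy`) are assumptions, not the real implementation.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of validate_profile_separation (names and the
# "{project}-{environment}-deploy" profile convention are assumptions).
validate_profile_separation() {
  local operation_type="$1" environment="$2"
  # Mutating operations must run under the environment's deploy profile.
  local expected="myproject-core-${environment}-deploy"
  if [ "${AWS_PROFILE:-}" != "${expected}" ]; then
    echo "ERROR: ${operation_type} in ${environment} requires profile ${expected}, got ${AWS_PROFILE:-unset}" >&2
    return 1
  fi
  return 0
}

AWS_PROFILE="myproject-core-test-deploy"
validate_profile_separation "deploy" "test" && echo "profile ok"
```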
EXECUTE OPERATION: Route to appropriate operation handler:
OUTPUT COMPLETION MESSAGE:
✅ AWS HANDLER COMPLETE: {operation}
{Summary of results}
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
IF FAILURE:
❌ AWS HANDLER FAILED: {operation}
Error: {error message}
AWS Profile: {AWS_PROFILE}
Resolution: {suggested fix}
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
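The completion and failure banners described in the workflow could be rendered by a small helper along these lines (the helper name is invented; the message layout follows this document):

```shell
#!/usr/bin/env bash
# Hypothetical helper that prints the handler's completion/failure banner.
print_result() {
  local status="$1" operation="$2" detail="$3"
  local rule="━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
  if [ "$status" = "success" ]; then
    printf '✅ AWS HANDLER COMPLETE: %s\n%s\n%s\n' "$operation" "$detail" "$rule"
  else
    printf '❌ AWS HANDLER FAILED: %s\nError: %s\nAWS Profile: %s\n%s\n' \
      "$operation" "$detail" "${AWS_PROFILE:-unset}" "$rule"
  fi
}

print_result success authenticate "Credentials valid"
```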
</WORKFLOW>
<OPERATIONS>
<AUTHENTICATE>
**Purpose:** Verify AWS credentials and validate profile configuration
Workflow:
Usage:
operation="authenticate"
environment="test"
Output:
</AUTHENTICATE>

<DEPLOY>
**Purpose:** Deploy an AWS resource from a JSON configuration
Workflow:
Usage:
operation="deploy"
environment="test"
resource_type="s3"
resource_config='{"bucket_name": "my-bucket", "versioning": true}'
Output:
</DEPLOY>

<VERIFY>
**Purpose:** Verify that a deployed resource exists and is available
Workflow:
Usage:
operation="verify"
environment="test"
resource_type="s3"
resource_identifier="arn:aws:s3:::my-bucket"
Output:
</VERIFY>

<QUERY>
**Purpose:** Query details of an existing resource
Workflow:
Usage:
operation="query"
environment="test"
resource_type="s3"
resource_identifier="my-bucket"
Output:
</QUERY>

<DELETE>
**Purpose:** Delete an AWS resource
Workflow:
Usage:
operation="delete"
environment="test"
resource_type="s3"
resource_identifier="my-bucket"
Output:
</DELETE>
</OPERATIONS>

<COMPLETION_CRITERIA>
This skill is complete and successful when ALL verified:
✅ 1. Profile Validation
✅ 2. Operation Execution
✅ 3. Response Format
FAILURE CONDITIONS - Stop and report if:
❌ Invalid environment (action: return error)
❌ Wrong AWS profile for operation (action: return error with correct profile)
❌ AWS CLI error (action: return error with AWS error message)
❌ Resource not found (verify operation) (action: return not-found status)

PARTIAL COMPLETION - Not acceptable:
⚠️ Operation started but not verified → Verify completion before returning
⚠️ Resource created but URL not generated → Generate URL before returning
</COMPLETION_CRITERIA>
<OUTPUTS>
After successful completion, return to calling skill.

Standard Response Format:
{
"status": "success|failure",
"operation": "authenticate|deploy|verify|query|delete",
"environment": "test|prod",
"resource": {
"type": "s3|lambda|etc",
"arn": "arn:aws:...",
"id": "resource-id",
"console_url": "https://console.aws.amazon.com/..."
},
"message": "Operation description",
"error": "Error message if failed"
}
Return to caller: JSON response string
</OUTPUTS>
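As a rough illustration, the standard response string could be assembled with printf. The field names follow the format above; the helper name is invented and a real handler would also populate the resource block.

```shell
#!/usr/bin/env bash
# Hypothetical helper assembling the standard response JSON
# (top-level fields only; "resource" is omitted for brevity).
make_response() {
  printf '{"status":"%s","operation":"%s","environment":"%s","message":"%s"}\n' \
    "$1" "$2" "$3" "$4"
}

make_response success verify test "Bucket exists"
```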
<CONSOLE_URL_GENERATION> Generate AWS Console URLs for resources:
S3 Bucket:
https://s3.console.aws.amazon.com/s3/buckets/{bucket_name}?region={region}
Lambda Function:
https://console.aws.amazon.com/lambda/home?region={region}#/functions/{function_name}
DynamoDB Table:
https://console.aws.amazon.com/dynamodb/home?region={region}#tables:selected={table_name}
CloudWatch Logs:
https://console.aws.amazon.com/cloudwatch/home?region={region}#logStream:group={log_group}
IAM Role:
https://console.aws.amazon.com/iam/home#/roles/{role_name}
</CONSOLE_URL_GENERATION>
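Assuming the URL templates above, a lookup helper might look like this (the function name is hypothetical; the templates are copied verbatim from this section):

```shell
#!/usr/bin/env bash
# Map a resource type and identifier to its AWS Console URL.
console_url() {
  local type="$1" id="$2" region="$3"
  case "$type" in
    s3)         echo "https://s3.console.aws.amazon.com/s3/buckets/${id}?region=${region}" ;;
    lambda)     echo "https://console.aws.amazon.com/lambda/home?region=${region}#/functions/${id}" ;;
    dynamodb)   echo "https://console.aws.amazon.com/dynamodb/home?region=${region}#tables:selected=${id}" ;;
    cloudwatch) echo "https://console.aws.amazon.com/cloudwatch/home?region=${region}#logStream:group=${id}" ;;
    iam)        echo "https://console.aws.amazon.com/iam/home#/roles/${id}" ;;
    *)          return 1 ;;
  esac
}

console_url s3 my-bucket us-east-1
```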
<ERROR_HANDLING>
<AUTHENTICATION_FAILURE>
Pattern: AWS CLI returns "Unable to locate credentials"
Action: Verify the profile exists before retrying:
aws configure list-profiles | grep {profile}
</AUTHENTICATION_FAILURE>
<PERMISSION_DENIED>
Pattern: AWS returns "AccessDenied" or "UnauthorizedOperation"
Action: Return error with the AWS error message and the profile in use; do not retry under a different profile.
</PERMISSION_DENIED>
<RESOURCE_NOT_FOUND>
Pattern: AWS returns "ResourceNotFoundException" or "NoSuchBucket"
Action: For verify operations, return not-found status; otherwise return error with the AWS error message.
</RESOURCE_NOT_FOUND>
<RESOURCE_ALREADY_EXISTS>
Pattern: AWS returns "ResourceAlreadyExists" or "BucketAlreadyExists"
Action: Report that the resource already exists and return its identifier to the caller.
</RESOURCE_ALREADY_EXISTS>
</ERROR_HANDLING>
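The error patterns above lend themselves to a simple classifier; the status labels below are illustrative only and are not part of the documented response format.

```shell
#!/usr/bin/env bash
# Classify an AWS CLI error message by the patterns this section lists.
classify_error() {
  case "$1" in
    *"Unable to locate credentials"*)        echo auth_failure ;;
    *AccessDenied*|*UnauthorizedOperation*)  echo permission_denied ;;
    *ResourceNotFoundException*|*NoSuchBucket*) echo not_found ;;
    *AlreadyExists*)                         echo already_exists ;;
    *)                                       echo unknown ;;
  esac
}

classify_error "An error occurred (NoSuchBucket) when calling HeadBucket"
```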
<EXAMPLES>
<example>
Operation: authenticate
Input: environment="test"
Process:
1. Load config for test environment
2. Validate AWS_PROFILE is test-deploy profile
3. Run: aws sts get-caller-identity --profile {AWS_PROFILE}
4. Extract account ID and region
5. Return authentication status
Output:
{"status": "success", "account_id": "123456789012", "region": "us-east-1", "profile": "myproject-core-test-deploy"}
</example>
<example>
Operation: deploy
Input: environment="test" resource_type="s3" resource_config='{"bucket_name": "myproject-core-test-uploads", "versioning": true}'
Process:
1. Load config for test environment
2. Validate profile is test-deploy (not discover-deploy)
3. Run: aws s3 mb s3://myproject-core-test-uploads --profile {AWS_PROFILE}
4. Enable versioning if requested
5. Generate console URL
6. Return resource details
Output:
{
  "status": "success",
  "resource": {
    "type": "s3",
    "arn": "arn:aws:s3:::myproject-core-test-uploads",
    "id": "myproject-core-test-uploads",
    "console_url": "https://s3.console.aws.amazon.com/s3/buckets/myproject-core-test-uploads?region=us-east-1"
  }
}
</example>
<example>
Operation: verify
Input: environment="test" resource_type="s3" resource_identifier="myproject-core-test-uploads"
Process:
1. Load config for test environment
2. Run: aws s3api head-bucket --bucket {bucket} --profile {AWS_PROFILE}
3. Check return code
4. Return verification status
Output:
{"status": "success", "exists": true, "resource_status": "available"}
</example>
</EXAMPLES>

<AWS_CLI_PATTERNS>
Common AWS CLI commands used:
# Authentication
aws sts get-caller-identity --profile {profile}
# S3
aws s3 mb s3://{bucket} --profile {profile}
aws s3api head-bucket --bucket {bucket} --profile {profile}
aws s3api put-bucket-versioning --bucket {bucket} --versioning-configuration Status=Enabled --profile {profile}
# Lambda
aws lambda get-function --function-name {name} --profile {profile}
aws lambda list-functions --profile {profile}
# DynamoDB
aws dynamodb describe-table --table-name {name} --profile {profile}
aws dynamodb list-tables --profile {profile}
# CloudWatch
aws logs describe-log-groups --log-group-name-prefix {prefix} --profile {profile}
# IAM
aws iam get-role --role-name {name} --profile {profile}
aws iam list-roles --profile {profile}
</AWS_CLI_PATTERNS>
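The EXECUTE OPERATION routing step in the workflow could be sketched as a simple dispatcher over these commands. The function name and the dispatch messages are hypothetical; only the authenticate command is taken directly from the patterns above, and it requires a configured AWS profile to actually run.

```shell
#!/usr/bin/env bash
# Hypothetical router for the workflow's EXECUTE OPERATION step.
route_operation() {
  case "$1" in
    authenticate) aws sts get-caller-identity --profile "$AWS_PROFILE" ;;
    deploy|verify|query|delete) echo "dispatch to $1 handler" ;;
    *) echo "unknown operation: $1" >&2; return 1 ;;
  esac
}

route_operation deploy
```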