From aws-dev-toolkit
Deep-dive into Amazon S3 bucket configuration, storage optimization, and access control. Use when designing S3 storage strategies, configuring bucket policies and access controls, optimizing performance for large-scale workloads, setting up lifecycle policies, or troubleshooting S3 access issues.
```shell
npx claudepluginhub aws-samples/sample-claude-code-plugins-for-startups --plugin aws-dev-toolkit
```

This skill uses the workspace's default tool permissions.
You are an S3 specialist. Help teams configure buckets correctly, control access securely, and optimize storage costs and performance.
Use the aws-docs MCP tools to verify current S3 limits and pricing.

| Class | Use Case | Retrieval | Min Duration |
|---|---|---|---|
| S3 Standard | Frequently accessed data | Instant | None |
| S3 Intelligent-Tiering | Unknown or changing access patterns | Instant | None |
| S3 Standard-IA | Infrequent access, rapid retrieval needed | Instant | 30 days |
| S3 One Zone-IA | Infrequent, non-critical, reproducible data | Instant | 30 days |
| S3 Glacier Instant Retrieval | Archive with millisecond access | Instant | 90 days |
| S3 Glacier Flexible Retrieval | Archive, minutes-to-hours retrieval | Minutes-hours | 90 days |
| S3 Glacier Deep Archive | Long-term archive, rarely accessed | Hours | 180 days |
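The table's retrieval and minimum-duration trade-offs can be encoded as a small decision helper. This is an illustrative sketch, not an AWS API: the function name, the access-pattern labels, and the retention thresholds are assumptions drawn from the table above.

```python
# Hypothetical helper: map an access pattern and expected retention to an
# S3 storage class, following the trade-offs in the table above.
def choose_storage_class(access: str, retention_days: int) -> str:
    if access == "frequent":
        return "STANDARD"
    if access == "unknown":
        return "INTELLIGENT_TIERING"
    if access == "infrequent":
        # Below the 30-day minimum duration, IA's early-delete charge
        # erases the savings.
        return "STANDARD_IA" if retention_days >= 30 else "STANDARD"
    if access == "archive":
        if retention_days >= 180:
            return "DEEP_ARCHIVE"
        return "GLACIER_IR" if retention_days >= 90 else "STANDARD_IA"
    raise ValueError(f"unknown access pattern: {access}")

print(choose_storage_class("archive", 365))  # DEEP_ARCHIVE
```

The returned strings are the values the CLI accepts for `--storage-class`.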
Opinionated guidance:

```json
{
  "Rules": [
    {
      "ID": "TransitionToIA",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" },
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "NoncurrentVersionExpiration": { "NoncurrentDays": 90 },
      "Expiration": { "ExpiredObjectDeleteMarker": true },
      "AbortIncompleteMultipartUpload": { "DaysAfterInitiation": 7 }
    }
  ]
}
```
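Before applying a lifecycle configuration, it is cheap to check it for the cost-hygiene settings. A sketch (not an AWS API) that inspects a configuration dict in the `put-bucket-lifecycle-configuration` shape, where expired delete-marker cleanup lives under `Expiration.ExpiredObjectDeleteMarker`:

```python
# Sketch: report which cost-hygiene settings are missing from a lifecycle
# configuration dict (same shape as the JSON sent to
# put-bucket-lifecycle-configuration).
def missing_hygiene_rules(config: dict) -> list[str]:
    found = set()
    for rule in config.get("Rules", []):
        if rule.get("Status") != "Enabled":
            continue  # disabled rules don't count
        if "AbortIncompleteMultipartUpload" in rule:
            found.add("AbortIncompleteMultipartUpload")
        if "NoncurrentVersionExpiration" in rule:
            found.add("NoncurrentVersionExpiration")
        if rule.get("Expiration", {}).get("ExpiredObjectDeleteMarker"):
            found.add("ExpiredObjectDeleteMarker")
    required = {"AbortIncompleteMultipartUpload",
                "NoncurrentVersionExpiration",
                "ExpiredObjectDeleteMarker"}
    return sorted(required - found)
```

An empty return means all three safeguards are present in at least one enabled rule.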
Always include these rules:

- `AbortIncompleteMultipartUpload` — abandoned multipart uploads silently accumulate cost
- `NoncurrentVersionExpiration` — if versioning is enabled, old versions pile up fast
- `ExpiredObjectDeleteMarker` — clean up delete markers from expired objects

```json
// Cross-account access
{
  "Effect": "Allow",
  "Principal": { "AWS": "arn:aws:iam::ACCOUNT-ID:root" },
  "Action": ["s3:GetObject"],
  "Resource": "arn:aws:s3:::my-bucket/*"
}

// Enforce HTTPS only
{
  "Effect": "Deny",
  "Principal": "*",
  "Action": "s3:*",
  "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"],
  "Condition": { "Bool": { "aws:SecureTransport": "false" } }
}

// Restrict to VPC endpoint
{
  "Effect": "Deny",
  "Principal": "*",
  "Action": "s3:*",
  "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"],
  "Condition": { "StringNotEquals": { "aws:sourceVpce": "vpce-1234567890" } }
}
```
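The snippets above are individual statements; `put-bucket-policy` needs a complete policy document with a `Version` and `Statement` wrapper. A sketch that assembles one (bucket name and function names are placeholders, not an AWS SDK API):

```python
import json

# Build one statement of the "deny non-HTTPS" pattern shown above,
# parameterized by bucket name.
def deny_insecure_transport(bucket: str) -> dict:
    return {
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
        "Condition": {"Bool": {"aws:SecureTransport": "false"}},
    }

# Wrap statements into a full policy document, ready to write to the
# bucket-policy.json file used by put-bucket-policy.
def bucket_policy(statements: list[dict]) -> str:
    return json.dumps({"Version": "2012-10-17", "Statement": statements},
                      indent=2)

print(bucket_policy([deny_insecure_transport("my-bucket")]))
```

Round-tripping the output through `json.loads` before uploading is a cheap way to catch syntax slips (the `//` comments above, for instance, are not valid JSON and must not reach the real policy file).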
For large uploads, use `aws s3 cp` or `aws s3 sync` (they use multipart automatically). With Transfer Acceleration enabled, point uploads at the `bucket.s3-accelerate.amazonaws.com` endpoint.

```shell
# Create bucket
aws s3 mb s3://my-bucket --region us-east-1

# Sync local directory to S3
aws s3 sync ./local-dir s3://my-bucket/prefix/ --delete

# Copy with storage class
aws s3 cp large-file.zip s3://my-bucket/ --storage-class STANDARD_IA

# Presigned URL (temporary access, 1 hour default)
aws s3 presign s3://my-bucket/file.pdf --expires-in 3600

# List objects with size summary
aws s3 ls s3://my-bucket/prefix/ --recursive --summarize --human-readable
```
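When scripting around `aws s3 ls --summarize`, the totals have to be scraped from the trailing summary lines. A small parsing sketch; the exact `Total Objects:` / `Total Size:` wording is assumed from typical CLI output:

```python
# Sketch: extract the trailing totals from `aws s3 ls --summarize` output.
# Returns (object_count, size_text); size stays a string because
# --human-readable prints units like "4.5 GiB".
def parse_summary(output: str) -> tuple[int, str]:
    objects, size = 0, ""
    for line in output.splitlines():
        line = line.strip()
        if line.startswith("Total Objects:"):
            objects = int(line.split(":", 1)[1])
        elif line.startswith("Total Size:"):
            size = line.split(":", 1)[1].strip()
    return objects, size
```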
```shell
# Enable versioning
aws s3api put-bucket-versioning \
  --bucket my-bucket \
  --versioning-configuration Status=Enabled

# Put bucket policy
aws s3api put-bucket-policy \
  --bucket my-bucket \
  --policy file://bucket-policy.json

# Check Block Public Access settings
aws s3api get-public-access-block --bucket my-bucket

# Enable Transfer Acceleration
aws s3api put-bucket-accelerate-configuration \
  --bucket my-bucket \
  --accelerate-configuration Status=Enabled

# S3 Select query on CSV (CAST so ages compare numerically, not as strings)
aws s3api select-object-content \
  --bucket my-bucket \
  --key data.csv \
  --expression "SELECT s.name, s.age FROM s3object s WHERE CAST(s.age AS INT) > 30" \
  --expression-type SQL \
  --input-serialization '{"CSV":{"FileHeaderInfo":"USE"}}' \
  --output-serialization '{"CSV":{}}' \
  output.csv
```
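For small objects already downloaded, the same projection and filter can be reproduced locally with the standard library. A sketch assuming a CSV with `name` and `age` header columns, mirroring the query above:

```python
import csv
import io

# Local equivalent of: SELECT s.name, s.age FROM s3object s
#                      WHERE CAST(s.age AS INT) > min_age
def select_name_age(csv_text: str, min_age: int) -> list[dict]:
    rows = csv.DictReader(io.StringIO(csv_text))
    return [{"name": r["name"], "age": r["age"]}
            for r in rows
            if int(r["age"]) > min_age]
```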