Deploys Django to Google App Engine Standard with Cloud SQL PostgreSQL. Configures Unix socket connections, Cloud SQL Auth Proxy for local dev, Gunicorn, app.yaml, and production settings.
Status: Production Ready
Last Updated: 2026-01-24
Dependencies: None
Latest Versions: Django@5.1, psycopg2-binary@2.9.9, gunicorn@23.0.0, cloud-sql-python-connector@1.12.0
pip install Django psycopg2-binary gunicorn
For Cloud SQL Python Connector (recommended for local dev):
pip install "cloud-sql-python-connector[pg8000]"
Why this matters:
- psycopg2-binary is the PostgreSQL adapter for Django
- gunicorn is required for App Engine Standard (Python 3.10+)

settings.py (production with Unix socket):
import os
# Detect App Engine environment
IS_APP_ENGINE = os.getenv('GAE_APPLICATION', None)
if IS_APP_ENGINE:
# Production: Connect via Unix socket
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': os.environ['DB_NAME'],
'USER': os.environ['DB_USER'],
'PASSWORD': os.environ['DB_PASSWORD'],
'HOST': f"/cloudsql/{os.environ['CLOUD_SQL_CONNECTION_NAME']}",
'PORT': '', # Empty for Unix socket
}
}
else:
# Local development: Connect via Cloud SQL Proxy
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': os.environ.get('DB_NAME', 'mydb'),
'USER': os.environ.get('DB_USER', 'postgres'),
'PASSWORD': os.environ.get('DB_PASSWORD', ''),
'HOST': '127.0.0.1',
'PORT': '5432',
}
}
CRITICAL:
- Production HOST must be the Unix socket path /cloudsql/PROJECT:REGION:INSTANCE
- Local development connects through the proxy at 127.0.0.1:5432

app.yaml:
runtime: python310
entrypoint: gunicorn -b :$PORT myproject.wsgi:application
env_variables:
DB_NAME: "mydb"
DB_USER: "postgres"
CLOUD_SQL_CONNECTION_NAME: "project-id:region:instance-name"
# Cloud SQL connection
beta_settings:
cloud_sql_instances: "project-id:region:instance-name"
handlers:
- url: /static
static_dir: static/
- url: /.*
script: auto
secure: always
CRITICAL:
- beta_settings.cloud_sql_instances enables the Unix socket at /cloudsql/...
- Provide DB_PASSWORD at deploy time (e.g. via Secret Manager), not in app.yaml

# Create PostgreSQL instance
gcloud sql instances create myinstance \
--database-version=POSTGRES_15 \
--tier=db-f1-micro \
--region=us-central1
# Create database
gcloud sql databases create mydb --instance=myinstance
# Create user
gcloud sql users create postgres \
--instance=myinstance \
--password=YOUR_SECURE_PASSWORD
Key Points:
- Use POSTGRES_15 or later for best compatibility
- db-f1-micro is cheapest for dev ($7-10/month); use db-g1-small or higher for production
- The connection name format is PROJECT_ID:REGION:INSTANCE_NAME

requirements.txt:
Django>=5.1,<6.0
psycopg2-binary>=2.9.9
gunicorn>=23.0.0
whitenoise>=6.7.0
settings.py additions:
import os
# Security settings for production
DEBUG = os.environ.get('DEBUG', 'False') == 'True'
ALLOWED_HOSTS = [
'.appspot.com',
'.run.app',
'localhost',
'127.0.0.1',
]
# Static files with WhiteNoise
STATIC_URL = '/static/'
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
MIDDLEWARE.insert(1, 'whitenoise.middleware.WhiteNoiseMiddleware')
STATICFILES_STORAGE = 'whitenoise.storage.CompressedManifestStaticFilesStorage'
# Database connection pooling
DATABASES['default']['CONN_MAX_AGE'] = 60 # Keep connections open for 60 seconds
Why these settings:
- CONN_MAX_AGE=60 reduces connection overhead (Cloud SQL has connection limits)
- ALLOWED_HOSTS must include .appspot.com for App Engine

Install Cloud SQL Auth Proxy:
# macOS
brew install cloud-sql-proxy
# Linux
curl -o cloud-sql-proxy https://storage.googleapis.com/cloud-sql-connectors/cloud-sql-proxy/v2.14.1/cloud-sql-proxy.linux.amd64
chmod +x cloud-sql-proxy
Run the proxy:
# Authenticate first
gcloud auth application-default login
# Start proxy (runs on 127.0.0.1:5432)
./cloud-sql-proxy PROJECT_ID:REGION:INSTANCE_NAME
# Or with specific port
./cloud-sql-proxy PROJECT_ID:REGION:INSTANCE_NAME --port=5432
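To confirm the proxy is actually accepting connections before running Django, a quick TCP probe is enough (a minimal sketch; assumes the proxy's default 127.0.0.1:5432 binding):

```python
import socket

def proxy_is_listening(host="127.0.0.1", port=5432, timeout=2.0):
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if proxy_is_listening():
        print("Cloud SQL proxy is up")
    else:
        print("Nothing on 127.0.0.1:5432 -- start cloud-sql-proxy first")
```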
Set environment variables for local dev:
export DB_NAME=mydb
export DB_USER=postgres
export DB_PASSWORD=your_password
export DEBUG=True
Run migrations locally:
# Local (with proxy running)
python manage.py migrate
# Verify connection
python manage.py dbshell
For production migrations (via Cloud Build or local with proxy):
# Option 1: Run locally with proxy
./cloud-sql-proxy PROJECT:REGION:INSTANCE &
python manage.py migrate
# Option 2: Use Cloud Build (recommended)
# See references/cloud-build-migrations.md
gunicorn.conf.py (optional, for fine-tuning):
import os
# Workers
workers = 2 # App Engine Standard limits this
threads = 4
worker_class = 'gthread'
# Timeout (App Engine has 60s limit for standard, 3600s for flexible)
timeout = 55
# Logging
accesslog = '-'
errorlog = '-'
loglevel = 'info'
# Bind (App Engine sets $PORT)
bind = f"0.0.0.0:{os.environ.get('PORT', '8080')}"
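These worker/thread numbers translate directly into a database connection budget once CONN_MAX_AGE keeps connections open; a quick sanity check (illustrative arithmetic; the ~25-connection figure for db-f1-micro is typical, but check your instance's max_connections flag):

```python
def max_db_connections(workers, threads, max_instances):
    """Worst case: every worker thread on every App Engine instance
    holds one persistent connection (CONN_MAX_AGE > 0)."""
    return workers * threads * max_instances

# 2 workers x 4 threads x 2 instances = 16 connections,
# comfortably under a db-f1-micro's typical ~25-connection limit.
print(max_db_connections(workers=2, threads=4, max_instances=2))
```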
app.yaml entrypoint options:
# Simple (recommended for most cases)
entrypoint: gunicorn -b :$PORT myproject.wsgi:application
# With config file
entrypoint: gunicorn -c gunicorn.conf.py myproject.wsgi:application
# With workers and timeout
entrypoint: gunicorn -b :$PORT -w 2 -t 55 myproject.wsgi:application
Key Points:
- Use the gthread worker class for I/O-bound Django apps

# Collect static files
python manage.py collectstatic --noinput
# Deploy
gcloud app deploy app.yaml
# Provide DB_PASSWORD via Secret Manager or an app.yaml include kept out of
# version control (gcloud app deploy has no --set-env-vars flag; App Engine
# Standard reads env vars only from app.yaml)
# View logs
gcloud app logs tail -s default
Verify deployment:
# Open in browser
gcloud app browse
# Check database connection
gcloud app logs read --service=default | grep -i database
ALWAYS DO:
- Use HOST=/cloudsql/PROJECT:REGION:INSTANCE on App Engine
- Use PORT='' (empty string) for Unix socket connections
- Set CONN_MAX_AGE for connection pooling (30-60 seconds)

NEVER DO:
- Hardcode passwords in app.yaml (use Secret Manager instead)
- Use HOST='localhost' on App Engine (must use the Unix socket path)
- Omit beta_settings.cloud_sql_instances in app.yaml
- Set CONN_MAX_AGE=None (unlimited) - can exhaust the connection pool

This skill prevents 12 documented issues:
Error: django.db.utils.OperationalError: could not connect to server: No such file or directory
Source: https://cloud.google.com/sql/docs/postgres/connect-app-engine-standard
Why It Happens: App Engine can't find the Unix socket because beta_settings.cloud_sql_instances is missing in app.yaml
Prevention: Always include beta_settings.cloud_sql_instances: "PROJECT:REGION:INSTANCE" in app.yaml
Error: django.db.utils.OperationalError: connection refused or could not connect to server: Connection refused
Source: https://cloud.google.com/sql/docs/postgres/connect-auth-proxy
Why It Happens: Cloud SQL Auth Proxy is not running or bound to wrong port
Prevention: Start cloud-sql-proxy PROJECT:REGION:INSTANCE before running Django locally. Verify it's running on port 5432.
Error: FATAL: password authentication failed for user "postgres"
Source: https://cloud.google.com/sql/docs/postgres/create-manage-users
Why It Happens: Wrong password in environment variable, or user doesn't exist in Cloud SQL instance
Prevention: Verify password with gcloud sql users list --instance=INSTANCE. Reset if needed: gcloud sql users set-password postgres --instance=INSTANCE --password=NEW_PASSWORD
Error: FATAL: too many connections for role "postgres" or remaining connection slots are reserved
Source: https://cloud.google.com/sql/docs/postgres/quotas#connection_limits
Why It Happens: Each request opens a new connection without pooling, exhausting the 25-100 connection limit
Prevention: Set CONN_MAX_AGE = 60 in Django settings. For high traffic, use PgBouncer or django-db-connection-pool.
Error: DeadlineExceededError or request terminated after 60 seconds
Source: https://cloud.google.com/appengine/docs/standard/python3/how-requests-are-handled
Why It Happens: Database query or migration takes longer than App Engine's 60-second limit
Prevention: Set Gunicorn timeout to 55 seconds. For long-running tasks, use Cloud Tasks or Cloud Run Jobs.
Error: Static files return 404, CSS/JS not loading
Source: https://cloud.google.com/appengine/docs/standard/serving-static-files
Why It Happens: Missing static/ handler in app.yaml or collectstatic not run before deploy
Prevention: Run python manage.py collectstatic --noinput before deploy. Include static handler in app.yaml.
Error: Forbidden (403) CSRF verification failed
Source: Django documentation on CSRF
Why It Happens: CSRF_TRUSTED_ORIGINS not set for appspot.com domain
Prevention: Add CSRF_TRUSTED_ORIGINS = ['https://*.appspot.com'] to settings.py
Error: django.db.utils.OperationalError: FATAL: database "mydb" does not exist
Source: https://cloud.google.com/sql/docs/postgres/create-manage-databases
Why It Happens: Database exists in Cloud SQL but DB_NAME environment variable is wrong
Prevention: Verify database name: gcloud sql databases list --instance=INSTANCE. Ensure DB_NAME matches exactly.
Error: FATAL: Cloud SQL IAM user authentication failed
Source: https://cloud.google.com/sql/docs/postgres/iam-logins
Why It Happens: App Engine service account doesn't have roles/cloudsql.instanceUser role
Prevention: Grant role: gcloud projects add-iam-policy-binding PROJECT --member="serviceAccount:PROJECT@appspot.gserviceaccount.com" --role="roles/cloudsql.instanceUser"
Error: Migrations timeout or can't run during deployment
Source: https://cloud.google.com/sql/docs/postgres/connect-build
Why It Happens: App Engine deploy doesn't provide a migration step; running in entrypoint times out
Prevention: Run migrations separately via Cloud Build, or locally with Cloud SQL Proxy before deploying.
Error: Security warning about hardcoded SECRET_KEY
Source: Django deployment checklist
Why It Happens: SECRET_KEY is in settings.py instead of environment variable or Secret Manager
Prevention: Use SECRET_KEY = os.environ.get('SECRET_KEY') and supply it via Secret Manager or an app.yaml include kept out of version control.
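A fail-fast variant of that prevention turns a missing secret into an immediate, descriptive error instead of a silently insecure default (a sketch; the helper name is ours):

```python
import os

def require_env(name):
    """Read a required environment variable, failing loudly if it is unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; supply it via app.yaml or Secret Manager"
        )
    return value

# In settings.py:
# SECRET_KEY = require_env('SECRET_KEY')
```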
Error: First request after idle period times out
Source: https://cloud.google.com/appengine/docs/standard/how-instances-are-managed
Why It Happens: App Engine instance cold start + Cloud SQL activation delay (if using "on demand" activation)
Prevention: Use App Engine warmup requests, or keep Cloud SQL instance "always on" (increases cost). Set app_engine_apis: true and add /_ah/warmup handler.
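The warmup setup described above amounts to a few lines of app.yaml plus a trivial view (a sketch based on the prevention note; the Django side must answer /_ah/warmup with HTTP 200):

```yaml
# app.yaml additions for warmup requests
app_engine_apis: true
inbound_services:
- warmup
# Django side: route '_ah/warmup' to a view that returns HttpResponse('ok')
```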
import os
from pathlib import Path
BASE_DIR = Path(__file__).resolve().parent.parent
# Security
SECRET_KEY = os.environ.get('SECRET_KEY', 'dev-key-change-in-production')
DEBUG = os.environ.get('DEBUG', 'False') == 'True'
ALLOWED_HOSTS = [
'.appspot.com',
'.run.app',
'localhost',
'127.0.0.1',
]
# CSRF for App Engine
CSRF_TRUSTED_ORIGINS = [
'https://*.appspot.com',
'https://*.run.app',
]
# Application definition
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
# Your apps here
]
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'whitenoise.middleware.WhiteNoiseMiddleware', # Static files
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'myproject.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [BASE_DIR / 'templates'],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'myproject.wsgi.application'
# Database
IS_APP_ENGINE = os.getenv('GAE_APPLICATION', None)
if IS_APP_ENGINE:
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': os.environ['DB_NAME'],
'USER': os.environ['DB_USER'],
'PASSWORD': os.environ['DB_PASSWORD'],
'HOST': f"/cloudsql/{os.environ['CLOUD_SQL_CONNECTION_NAME']}",
'PORT': '',
'CONN_MAX_AGE': 60,
'OPTIONS': {
'connect_timeout': 10,
},
}
}
else:
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': os.environ.get('DB_NAME', 'mydb'),
'USER': os.environ.get('DB_USER', 'postgres'),
'PASSWORD': os.environ.get('DB_PASSWORD', ''),
'HOST': os.environ.get('DB_HOST', '127.0.0.1'),
'PORT': os.environ.get('DB_PORT', '5432'),
'CONN_MAX_AGE': 60,
}
}
# Static files
STATIC_URL = '/static/'
STATIC_ROOT = BASE_DIR / 'static'
STATICFILES_STORAGE = 'whitenoise.storage.CompressedManifestStaticFilesStorage'
# Default primary key field type
DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'
# Logging
LOGGING = {
'version': 1,
'disable_existing_loggers': False,
'handlers': {
'console': {
'class': 'logging.StreamHandler',
},
},
'root': {
'handlers': ['console'],
'level': 'INFO',
},
'loggers': {
'django': {
'handlers': ['console'],
'level': os.getenv('DJANGO_LOG_LEVEL', 'INFO'),
'propagate': False,
},
},
}
runtime: python310
entrypoint: gunicorn -b :$PORT -w 2 -t 55 --threads 4 myproject.wsgi:application
instance_class: F2
env_variables:
DB_NAME: "mydb"
DB_USER: "postgres"
CLOUD_SQL_CONNECTION_NAME: "project-id:us-central1:instance-name"
DEBUG: "False"
beta_settings:
cloud_sql_instances: "project-id:us-central1:instance-name"
handlers:
- url: /static
static_dir: static/
secure: always
- url: /.*
script: auto
secure: always
automatic_scaling:
min_instances: 0
max_instances: 2
target_cpu_utilization: 0.65
Why these settings:
- F2 instance class allows 2 Gunicorn workers
- min_instances: 0 saves costs when idle
- target_cpu_utilization: 0.65 scales before overload

requirements.txt:
Django>=5.1,<6.0
psycopg2-binary>=2.9.9
gunicorn>=23.0.0
whitenoise>=6.7.0
import os
def get_database_config():
"""Return database config based on environment."""
is_production = os.getenv('GAE_APPLICATION', None)
if is_production:
return {
'ENGINE': 'django.db.backends.postgresql',
'NAME': os.environ['DB_NAME'],
'USER': os.environ['DB_USER'],
'PASSWORD': os.environ['DB_PASSWORD'],
'HOST': f"/cloudsql/{os.environ['CLOUD_SQL_CONNECTION_NAME']}",
'PORT': '',
'CONN_MAX_AGE': 60,
}
else:
return {
'ENGINE': 'django.db.backends.postgresql',
'NAME': os.environ.get('DB_NAME', 'mydb'),
'USER': os.environ.get('DB_USER', 'postgres'),
'PASSWORD': os.environ.get('DB_PASSWORD', ''),
'HOST': '127.0.0.1',
'PORT': '5432',
'CONN_MAX_AGE': 60,
}
DATABASES = {'default': get_database_config()}
When to use: Standard Django project needing local/production parity
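Because the switch above keys entirely off GAE_APPLICATION, the selection logic is easy to verify outside App Engine (a sketch; 's~myproject' mirrors the region-prefixed app id App Engine sets):

```python
import os

def is_app_engine(environ=None):
    """App Engine Standard sets GAE_APPLICATION to the app id (e.g. 's~myproject')."""
    environ = os.environ if environ is None else environ
    return bool(environ.get('GAE_APPLICATION'))

# Local shell (no GAE_APPLICATION): the proxy config is chosen.
assert is_app_engine({}) is False
# Deployed: the Unix-socket config is chosen.
assert is_app_engine({'GAE_APPLICATION': 's~myproject'}) is True
```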
import os

from google.cloud import secretmanager
def get_secret(secret_id, version_id='latest'):
"""Retrieve secret from Google Secret Manager."""
client = secretmanager.SecretManagerServiceClient()
project_id = os.environ.get('GOOGLE_CLOUD_PROJECT')
name = f"projects/{project_id}/secrets/{secret_id}/versions/{version_id}"
response = client.access_secret_version(request={"name": name})
return response.payload.data.decode('UTF-8')
# Usage in settings.py
if os.getenv('GAE_APPLICATION'):
SECRET_KEY = get_secret('django-secret-key')
DB_PASSWORD = get_secret('db-password')
When to use: Production deployments requiring proper secret management
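Each access_secret_version call is a network round trip, so it is common to memoize lookups per process; a minimal sketch with a stand-in fetcher (the wrapper and the fake are ours, not part of the library):

```python
from functools import lru_cache

def cached(fetch):
    """Wrap a fetch(secret_id, version_id) callable so each secret
    is fetched at most once per process."""
    return lru_cache(maxsize=None)(fetch)

# Demonstrate with a fake fetcher standing in for get_secret():
calls = []
def fake_get_secret(secret_id, version_id='latest'):
    calls.append(secret_id)
    return f"value-of-{secret_id}"

get_secret_cached = cached(fake_get_secret)
get_secret_cached('django-secret-key')
get_secret_cached('django-secret-key')
assert calls == ['django-secret-key']  # second read served from cache
```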
# For local development without Cloud SQL Auth Proxy
import os

from google.cloud.sql.connector import Connector
def get_db_connection():
connector = Connector()
return connector.connect(
"project:region:instance",
"pg8000",
user="postgres",
password=os.environ["DB_PASSWORD"],
db="mydb",
)
# In settings.py for local dev (requires the pg8000 driver; note that Django's
# stock postgresql backend does not read a 'get_conn' option, so this wiring
# needs a custom database backend -- treat it as a sketch)
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': 'mydb',
'USER': 'postgres',
'OPTIONS': {
'get_conn': get_db_connection,
},
}
}
When to use: Local development when you can't install Cloud SQL Auth Proxy
# views.py
from django.http import JsonResponse
from django.db import connection
def health_check(request):
"""Health check endpoint for App Engine."""
try:
with connection.cursor() as cursor:
cursor.execute("SELECT 1")
return JsonResponse({
'status': 'healthy',
'database': 'connected',
})
except Exception as e:
return JsonResponse({
'status': 'unhealthy',
'database': str(e),
}, status=503)
# urls.py
from django.urls import path
urlpatterns = [
path('_ah/health', health_check, name='health_check'),
]
When to use: Load balancer health checks, monitoring database connectivity
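The try/except core of that view separates cleanly from Django, which makes it unit-testable with a fake database call (a framework-agnostic sketch; the names are ours):

```python
def db_health(ping):
    """ping() should run a trivial query (e.g. SELECT 1) and raise on failure.
    Returns (body, http_status) mirroring the JsonResponse view above."""
    try:
        ping()
        return {'status': 'healthy', 'database': 'connected'}, 200
    except Exception as exc:
        return {'status': 'unhealthy', 'database': str(exc)}, 503

# Healthy backend
body, status = db_health(lambda: None)
assert status == 200

# Broken backend
def broken():
    raise RuntimeError("connection refused")
body, status = db_health(broken)
assert status == 503 and 'refused' in body['database']
```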
templates/settings_snippet.py - Copy-paste database configuration
templates/app.yaml - Complete App Engine configuration
templates/requirements.txt - Production dependencies
references/cloud-sql-proxy-setup.md - Detailed proxy installation and usage
references/iam-authentication.md - IAM-based authentication (no passwords)
references/secret-manager.md - Storing secrets properly
references/migrations-in-production.md - Running migrations safely

When Claude should load these:
- cloud-sql-proxy-setup.md when the user has local connection issues
- iam-authentication.md when setting up passwordless auth
- migrations-in-production.md before the first production deployment

Instead of password authentication, use IAM for service-to-service auth:
# Enable IAM authentication on instance
gcloud sql instances patch INSTANCE --database-flags cloudsql.iam_authentication=on
# Create IAM user
gcloud sql users create SERVICE_ACCOUNT@PROJECT.iam \
--instance=INSTANCE \
--type=CLOUD_IAM_SERVICE_ACCOUNT
# Grant connect permission
gcloud projects add-iam-policy-binding PROJECT \
--member="serviceAccount:PROJECT@appspot.gserviceaccount.com" \
--role="roles/cloudsql.instanceUser"
Django settings for IAM auth:
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': os.environ['DB_NAME'],
'USER': f"{os.environ['SERVICE_ACCOUNT']}@{os.environ['PROJECT_ID']}.iam",
'HOST': f"/cloudsql/{os.environ['CLOUD_SQL_CONNECTION_NAME']}",
'PORT': '',
# No PASSWORD needed with IAM auth
}
}
For high-traffic applications, deploy PgBouncer on Cloud Run:
# Cloud Run service for PgBouncer
# See references/pgbouncer-setup.md for full configuration
Why PgBouncer:
- CONN_MAX_AGE helps but doesn't pool across processes

Option 1: Cloud Build (Recommended)
# cloudbuild.yaml
steps:
- name: 'gcr.io/cloud-builders/gcloud'
args: ['sql', 'connect', 'INSTANCE', '--quiet']
- name: 'python:3.10'
entrypoint: 'bash'
args:
- '-c'
- |
pip install -r requirements.txt
python manage.py migrate --noinput
env:
- 'DB_NAME=mydb'
- 'DB_USER=postgres'
- 'DB_HOST=/cloudsql/PROJECT:REGION:INSTANCE'
secretEnv: ['DB_PASSWORD']
availableSecrets:
secretManager:
- versionName: projects/PROJECT/secrets/db-password/versions/latest
env: 'DB_PASSWORD'
Option 2: Local with Proxy (Simple)
./cloud-sql-proxy PROJECT:REGION:INSTANCE &
python manage.py migrate
Required:
- Django>=5.1 - Web framework
- psycopg2-binary>=2.9.9 - PostgreSQL adapter
- gunicorn>=23.0.0 - WSGI server for App Engine

Recommended:
- whitenoise>=6.7.0 - Static file serving
- python-dotenv>=1.0.0 - Local environment variables

Optional:
- google-cloud-secret-manager>=2.20.0 - Secret Manager integration
- cloud-sql-python-connector[pg8000]>=1.12.0 - Python-native Cloud SQL connector
- django-db-connection-pool>=1.2.5 - Connection pooling (alternative to CONN_MAX_AGE)

{
"dependencies": {
"Django": ">=5.1,<6.0",
"psycopg2-binary": ">=2.9.9",
"gunicorn": ">=23.0.0",
"whitenoise": ">=6.7.0"
},
"optional": {
"google-cloud-secret-manager": ">=2.20.0",
"cloud-sql-python-connector": ">=1.12.0"
}
}
This skill is based on production Django deployments on App Engine:
Issue: could not connect - No such file or directory for socket
Solution:
- Add beta_settings.cloud_sql_instances in app.yaml
- Verify the connection name (PROJECT:REGION:INSTANCE) with gcloud sql instances list

Issue: connection or permission failures in production
Solution:
- Ensure HOST uses the Unix socket path, not 127.0.0.1
- Grant the App Engine service account the Cloud SQL Client role

Issue: static files missing after deploy
Solution:
- Run python manage.py collectstatic --noinput before deploy
- Check the static/ handler in app.yaml points to the correct directory

Pre-deploy checklist:
- beta_settings.cloud_sql_instances in app.yaml
- CONN_MAX_AGE set for connection pooling
- Deploy with gcloud app deploy

Questions? Issues? Start by checking gcloud sql instances list.

Last verified: 2026-01-24 | Skill version: 1.0.0