From theclauu
Use for Snowflake operations — query via artemis connection, cutover to artemis-python-tools. Replaces /snowflake-query, /snowflake-cutover.
```sh
npx claudepluginhub artemis-xyz/theclauu --plugin theclauu
```

This skill uses the workspace's default tool permissions.
Two modes. Pick based on intent:
========================================
Execute ad-hoc Snowflake queries using SnowSQL with key pair authentication. Accepts raw SQL or natural language descriptions.
Authentication uses key pair auth via connection profiles in ~/.snowsql/config:
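A minimal profile of the shape SnowSQL expects might look like the following sketch — the account identifier, username, and key path are placeholders, not real values:

```ini
[connections.artemis]
accountname = <ACCOUNT_IDENTIFIER>
username = <USERNAME>
private_key_path = /path/to/rsa_key.p8
```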
Run a query:
```sh
snowsql -c artemis -q "YOUR SQL HERE"
```
For multi-line or complex queries, use a heredoc:
```sh
snowsql -c artemis -q "$(cat <<'EOF'
SELECT table_schema, table_name, row_count
FROM information_schema.tables
WHERE table_schema = 'PROD'
ORDER BY row_count DESC
LIMIT 10;
EOF
)"
```
If raw SQL provided:
If natural language description:
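For example, a request like "show daily row counts for the last week" might be translated into something of this shape (table and column names are illustrative placeholders, not a real schema):

```sql
-- Hypothetical translation of: "show daily row counts for the last week"
SELECT date, COUNT(*) AS row_count
FROM PC_DBT_DB.PROD.<TABLE_NAME>
WHERE date >= CURRENT_DATE - 7
GROUP BY date
ORDER BY date DESC;
```

Note this falls under the LIMIT exceptions below: it uses GROUP BY with a constrained date range.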
MUST include LIMIT for:
- SELECT * queries

Exceptions (LIMIT optional):
- GROUP BY queries with a constrained date range
- COUNT(*) queries
- INFORMATION_SCHEMA queries
- SHOW and DESCRIBE commands

NEVER execute:
- Queries that don't use fully qualified table names (DATABASE.SCHEMA.TABLE)

LIMIT guidance:
| Purpose | LIMIT |
|---|---|
| Quick sanity check | 10 |
| Pattern exploration | 100 |
| Sample analysis | 1,000 |
| Full export (rare) | User must explicitly request |
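As a sketch of how the tiers play out in practice (assumes the table has a date column; table name is a placeholder):

```sql
-- Quick sanity check: is the table populated and shaped as expected?
SELECT * FROM PC_DBT_DB.PROD.<TABLE_NAME> LIMIT 10;

-- Pattern exploration: enough rows to eyeball value distributions
SELECT * FROM PC_DBT_DB.PROD.<TABLE_NAME>
ORDER BY date DESC
LIMIT 100;
```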
```sh
snowsql -c artemis -q "<SQL_QUERY>"
```
Use -o output_format=FORMAT for different output:
- psql (default) — PostgreSQL-style tables
- csv — comma-separated
- tsv — tab-separated
- json — JSON format

Check table freshness:
```sql
SELECT MAX(date) as latest_date, MIN(date) as earliest_date, COUNT(*) as total_rows
FROM PC_DBT_DB.PROD.<TABLE_NAME>;
```
Check data by chain:
```sql
SELECT chain, COUNT(*) as row_count, MAX(date) as latest_date
FROM PC_DBT_DB.PROD.<TABLE_NAME>
GROUP BY chain
ORDER BY row_count DESC
LIMIT 50;
```
Sample recent data:
```sql
SELECT * FROM PC_DBT_DB.PROD.<TABLE_NAME>
WHERE date >= CURRENT_DATE - 7
ORDER BY date DESC
LIMIT 100;
```
Check for NULLs:
```sql
SELECT
  COUNT(*) as total,
  COUNT(<COLUMN>) as non_null,
  COUNT(*) - COUNT(<COLUMN>) as null_count,
  ROUND(100.0 * (COUNT(*) - COUNT(<COLUMN>)) / COUNT(*), 2) as null_pct
FROM PC_DBT_DB.PROD.<TABLE_NAME>;
```
Compare dev vs prod:
```sql
SELECT 'dev' as env, COUNT(*) as row_count FROM PC_DBT_DB.DEV_<USER>.<MODEL>
UNION ALL
SELECT 'prod' as env, COUNT(*) as row_count FROM PC_DBT_DB.PROD.<MODEL>;
```
Table metadata:
```sql
SELECT table_name, row_count, bytes / (1024*1024*1024) as size_gb, created, last_altered
FROM PC_DBT_DB.INFORMATION_SCHEMA.TABLES
WHERE table_schema = 'PROD' AND table_name ILIKE '%<PATTERN>%'
ORDER BY row_count DESC
LIMIT 20;
```
Column info:
```sql
SELECT column_name, data_type, is_nullable
FROM PC_DBT_DB.INFORMATION_SCHEMA.COLUMNS
WHERE table_schema = 'PROD' AND table_name = '<TABLE_NAME>'
ORDER BY ordinal_position;
```
List databases / schemas / tables:
```sh
snowsql -c artemis -q "SHOW DATABASES;"
snowsql -c artemis -q "SHOW SCHEMAS IN DATABASE PC_DBT_DB;"
snowsql -c artemis -q "SHOW TABLES IN SCHEMA PC_DBT_DB.PROD;"
snowsql -c artemis -q "DESCRIBE TABLE PC_DBT_DB.PROD.<TABLE_NAME>;"
```
Query timeout:
Large result set:
Connection issues:
- Verify ~/.snowsql/config has the connection profile
- Run snowsql -c artemis -q "SELECT 1;" to test connectivity

========================================
Migrate a project from custom Snowflake connection logic to the centralized
artemis-python-tools credential adapter pattern with RSA key-pair auth support.
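The visible core of the migration is the env-var rename to the SYSTEM_ prefix. As a sketch only — SYSTEM_SNOWFLAKE_PRIVATE_KEY is named in this skill, but the other variable names are assumptions; the actual list comes from the project's config:

```
SNOWFLAKE_ACCOUNT      -> SYSTEM_SNOWFLAKE_ACCOUNT
SNOWFLAKE_USER         -> SYSTEM_SNOWFLAKE_USER
SNOWFLAKE_PRIVATE_KEY  -> SYSTEM_SNOWFLAKE_PRIVATE_KEY
```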
Read migration-steps.md for the full procedure with code templates and commands.
- snowflake.connector.connect() calls
- SYSTEM_SNOWFLAKE_* env var naming

Before starting, confirm:
Follow each step in migration-steps.md in order:
- Audit snowflake.connector.connect() call sites, credential config, Settings classes, and test mocks
- Install artemis-python-tools with --no-deps if missing; add it to the build command, not pyproject.toml deps
- Switch to create_snowflake_connection() calls; remove dead auth methods
- Rename SNOWFLAKE_* to SYSTEM_SNOWFLAKE_*; add RSA key fields to the Settings class if using Pattern A

If issues arise after deployment:
- Old SNOWFLAKE_* env vars are still present until Step 7.4 — revert the code deploy and the old config still works
- fastapi==0.115.12 — always install with --no-deps to avoid conflicts
- An in-env private key (SYSTEM_SNOWFLAKE_PRIVATE_KEY) enables key-pair auth in environments where key files aren't on disk (Railway, containers)
- Avoid SnowflakeCredentials.from_env() — that creates a parallel config path

========================================
Consolidated from legacy claudefather skills. Pick the mode based on intent.