npx claudepluginhub dekart-xyz/geosql --plugin geosql

This skill uses the workspace's default tool permissions.
This skill uses the following CLIs:
bq for BigQuery SQL execution and cost control.
snow for Snowflake SQL execution (snow sql).
dekart for rendering maps. Used ONLY when the user explicitly asks for a map. Never run the map flow for SQL-only questions.

Before using the CLIs, verify availability if it has not been done before:
for c in bq snow dekart; do command -v $c >/dev/null && echo $c=ok || echo $c=missing; done
Follow these steps in order. Do NOT write a final query until steps 1-3 are complete.
Discover available data objects before writing any query: start with database/share availability (where applicable), then confirm schemas, tables, and columns. Always verify exact object names and column types from the warehouse metadata; do not assume from general knowledge.
When multiple tables match the entity, sample each candidate for attribute density and prefer the richer source. Richer attributes enable stronger visual encoding in Step 5 and a stronger map validation case.
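The "attribute density" comparison above can be sketched outside the warehouse. The helper below is a minimal, hypothetical illustration (the function name and row shape are not part of the skill): it measures the share of non-null attribute values in a sample of rows, so the richer candidate table can be preferred.

```python
def attribute_density(rows):
    """Share of non-null attribute values across a sample of rows (dicts).

    Illustrative only: in practice the samples come from bounded SQL
    previews against each candidate table.
    """
    if not rows:
        return 0.0
    total = sum(len(r) for r in rows)
    non_null = sum(1 for r in rows for v in r.values() if v is not None)
    return non_null / total


# Prefer the candidate whose sample carries more non-null attributes.
sparse = [{"id": 1, "name": None, "class": None},
          {"id": 2, "name": "a", "class": None}]
rich = [{"id": 1, "name": "a", "class": "rail"},
        {"id": 2, "name": "b", "class": "rail"}]
assert attribute_density(rich) > attribute_density(sparse)
```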
BigQuery:
SELECT table_name
FROM `bigquery-public-data.overture_maps.INFORMATION_SCHEMA.TABLES`
ORDER BY table_name;
SELECT column_name, data_type
FROM `bigquery-public-data.overture_maps.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = '<target_table>'
ORDER BY ordinal_position;
Snowflake:
SHOW DATABASES LIKE 'OVERTURE_MAPS__%';
If this returns no rows, stop and ask the user to install Overture Maps shares from Snowflake Marketplace before continuing.
SELECT table_catalog, table_schema, table_name
FROM OVERTURE_MAPS__TRANSPORTATION.INFORMATION_SCHEMA.TABLES
WHERE table_schema = 'CARTO'
ORDER BY table_name;
SELECT column_name, data_type
FROM OVERTURE_MAPS__TRANSPORTATION.INFORMATION_SCHEMA.COLUMNS
WHERE table_schema = 'CARTO'
AND table_name = '<TARGET_TABLE>'
ORDER BY ordinal_position;
When the user asks about a named area (city, district, country), query division_area first to discover how it is actually stored (subtype, class, naming conventions). Do not assume from general knowledge.
BigQuery:
SELECT subtype, class, names.primary, bbox.xmin, bbox.xmax, bbox.ymin, bbox.ymax
FROM `bigquery-public-data.overture_maps.division_area`
WHERE country = '<iso2>'
AND LOWER(names.primary) LIKE '%<area_name>%'
LIMIT 20;
Snowflake:
SELECT subtype,
class,
names:primary::string AS name_primary,
bbox:xmin::float AS xmin,
bbox:xmax::float AS xmax,
bbox:ymin::float AS ymin,
bbox:ymax::float AS ymax
FROM OVERTURE_MAPS__DIVISIONS.CARTO.DIVISION_AREA
WHERE country = '<ISO2>'
AND LOWER(names:primary::string) LIKE '%<area_name>%'
LIMIT 20;
Extract the exact bbox constants from the result. Use the full-precision values returned by the query; do not round or truncate them.
Use ST_INTERSECTS against the real geometry from division_area for geographic correctness; the bbox alone is rectangular and overshoots. Keep exploration queries bounded: avoid SELECT * and use LIMIT.

CRITICAL: bbox overlap filter direction. The scan gate must use the OVERLAP pattern, not containment. The feature's bbox must overlap the target area:
-- CORRECT (overlap): feature extends into our area
AND bbox.xmax >= <area_xmin> -- feature's right edge is east of area's left
AND bbox.xmin <= <area_xmax> -- feature's left edge is west of area's right
AND bbox.ymax >= <area_ymin> -- feature's top edge is above area's bottom
AND bbox.ymin <= <area_ymax> -- feature's bottom edge is below area's top
-- WRONG (containment): do NOT use this
AND bbox.xmin >= <area_xmin> -- WRONG
AND bbox.xmax <= <area_xmax> -- WRONG
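The difference between the two patterns can be checked off-warehouse. A minimal sketch of both predicates (function names and the sample coordinates are illustrative, not part of the skill): a feature that crosses the area boundary passes the overlap gate but is silently dropped by the containment gate.

```python
def bbox_overlaps(f, a):
    """Correct scan gate: feature bbox overlaps the area bbox.
    f and a are (xmin, xmax, ymin, ymax) tuples."""
    return f[1] >= a[0] and f[0] <= a[1] and f[3] >= a[2] and f[2] <= a[3]


def bbox_contained(f, a):
    """Wrong gate: requires the feature to sit fully inside the area."""
    return f[0] >= a[0] and f[1] <= a[1] and f[2] >= a[2] and f[3] <= a[3]


area = (13.088, 13.761, 52.338, 52.676)   # Berlin-like bbox from Step 2
crossing = (13.70, 13.90, 52.40, 52.50)   # segment extending past the east edge

assert bbox_overlaps(crossing, area)       # overlap keeps boundary-crossing features
assert not bbox_contained(crossing, area)  # containment silently drops them
```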
BigQuery:
WITH area AS (
SELECT geometry
FROM `bigquery-public-data.overture_maps.division_area`
WHERE country = 'DE'
AND region = 'DE-BE'
AND subtype = 'region'
AND class = 'land'
LIMIT 1
)
SELECT s.id, s.geometry
FROM `bigquery-public-data.overture_maps.segment` s
CROSS JOIN area a
WHERE s.subtype = 'rail'
-- overlap pattern: xmax >= area_xmin, xmin <= area_xmax
AND s.bbox.xmax >= 13.08834457397461
AND s.bbox.xmin <= 13.761162757873535
AND s.bbox.ymax >= 52.33823776245117
AND s.bbox.ymin <= 52.67551040649414
AND ST_INTERSECTS(s.geometry, a.geometry)
LIMIT 1000;
Snowflake:
WITH area AS (
SELECT geometry
FROM OVERTURE_MAPS__DIVISIONS.CARTO.DIVISION_AREA
WHERE country = 'DE'
AND region = 'DE-BE'
AND subtype = 'region'
AND class = 'land'
LIMIT 1
)
SELECT s.id, s.geometry
FROM OVERTURE_MAPS__TRANSPORTATION.CARTO.SEGMENT s
CROSS JOIN area a
WHERE s.subtype = 'rail'
AND s.bbox:xmax::float >= 13.08834457397461
AND s.bbox:xmin::float <= 13.761162757873535
AND s.bbox:ymax::float >= 52.33823776245117
AND s.bbox:ymin::float <= 52.67551040649414
AND ST_INTERSECTS(s.geometry, a.geometry)
LIMIT 1000;
Do NOT present the query to the user without validating it first.
Run bq query --use_legacy_sql=false --dry_run --format=json '<SQL>' to check estimated bytes.
Run COUNT(*) (or equivalent) to confirm the count is reasonable. If the count is zero, debug before presenting.
For polygon GEOGRAPHY, compute total area (for example SUM(ST_AREA(geometry))) and return units in square meters (and optionally km²).
For line GEOGRAPHY, compute total length (for example SUM(ST_LENGTH(geometry))) and return units in meters (and optionally km).
Iterate: fix issues in small steps. Do not run broad or full extraction queries unless explicitly requested. All validation must be done in SQL.
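Unit handling for the area and length checks is easy to get wrong when presenting results. A tiny sketch of the m² → km² and m → km conversions (helper names are illustrative):

```python
def fmt_area(sq_m):
    """Format an ST_AREA result in square meters, with km² alongside."""
    return f"{sq_m:,.0f} m² ({sq_m / 1e6:,.2f} km²)"


def fmt_length(m):
    """Format an ST_LENGTH result in meters, with km alongside."""
    return f"{m:,.0f} m ({m / 1e3:,.2f} km)"


assert fmt_length(1500.0) == "1,500 m (1.50 km)"
```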
Maps catch what rows cannot: misplaced points, duplicates, coverage gaps.
If user did not ask for map, answer first (SQL + results + cost), then propose map validation as a separate "Next step" section with 2-3 sentences on what to look for and which failure modes rows cannot catch.
Do not start map workflow without user request.
Do not claim visual insights until the styled snapshot is rendered and inspected; never dress row-derived facts as map observations.
If the dekart CLI is missing, ask the user to run pip install dekart-cli && dekart init and wait until the user confirms with ready, done, or ok. If unauthenticated, ask the user to run dekart init.
bq CLI

Use the bq CLI directly. Always use standard SQL and enforce a budget:
# Dry run (check cost before executing)
bq query --use_legacy_sql=false --dry_run --format=json --maximum_bytes_billed=10737418240 'SELECT ...'
# Execute
bq query --use_legacy_sql=false --format=json --maximum_bytes_billed=10737418240 --max_rows=50 'SELECT ...'
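The dry-run output can be turned into a go/no-go decision before executing. A hedged sketch: the JSON field path below (statistics.query.totalBytesProcessed) is an assumption about the bq job JSON shape, so verify it against your bq version before relying on it.

```python
import json

BUDGET_BYTES = 10 * 1024**3  # 10 GiB, matching --maximum_bytes_billed above


def within_budget(dry_run_json, budget=BUDGET_BYTES):
    """Compare a dry-run's estimated scan size against the byte budget.

    Assumes the estimate lives at statistics.query.totalBytesProcessed
    in the `bq query --dry_run --format=json` output.
    """
    job = json.loads(dry_run_json)
    estimated = int(job["statistics"]["query"]["totalBytesProcessed"])
    return estimated <= budget, estimated


# Example with a fabricated dry-run payload (structure assumed):
sample = '{"statistics": {"query": {"totalBytesProcessed": "5368709120"}}}'
ok, est = within_budget(sample)
assert ok and est == 5 * 1024**3
```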
Guardrails:
Always set --maximum_bytes_billed (default 10 GiB = 10737418240).

snow CLI

Use snow sql directly for Snowflake data and keep queries bounded.
# First verify Overture shares are installed
snow sql --query "SHOW DATABASES LIKE 'OVERTURE_MAPS__%';"
# Validate quickly with row count first
snow sql --query "WITH area AS (...) SELECT COUNT(*) FROM ...;"
# Execute preview rows (table output)
snow sql --query "WITH area AS (...) SELECT ... LIMIT 50;"
# CSV output for piping (clean stdout)
snow sql --format CSV --silent --query "WITH area AS (...) SELECT ... LIMIT 50000;"
Guardrails:
Validate with COUNT(*) first.
Keep queries bounded (bbox + ST_INTERSECTS + LIMIT) unless the user explicitly asks for a full export.
For piping, use --format CSV --silent.
If SHOW DATABASES LIKE 'OVERTURE_MAPS__%' returns no rows, ask the user to install Overture Maps from Snowflake Marketplace, then retry.

dekart CLI

Use this to execute the map validation step from SQL query output. Use only if the dekart CLI is available.
The CLI stores map artifacts in this hierarchy:
report: top-level map container.
dataset: one data layer slot inside a report.
file: uploaded data artifact attached to a dataset.

Control plane: create report -> create dataset -> create file. The CLI provides an upload wrapper that performs multipart upload and completion.
Discover commands with dekart --help, dekart tools --help, dekart call --help, dekart upload-file --help.
Wait for the user to confirm with ready, done, or ok. If the user declines or is silent, stop; do not export CSV, do not create reports.
--max_rows is mandatory because the BigQuery CLI defaults to 100 rows when omitted.
Avoid 2>&1 when output is redirected to .csv.
BigQuery: bq query ... --format=csv --max_rows=50000 'SELECT ...' | dekart upload-file --stdin --file-id <file_id> --name result.csv --mime-type text/csv
Snowflake: snow sql --format CSV --silent --query "<SQL with LIMIT 50000>" | dekart upload-file --stdin --file-id <file_id> --name result.csv --mime-type text/csv
List available commands with dekart tools.
Record the report_id and dataset_id returned by the control plane.
Alternatively, upload from a file: dekart upload-file --file /tmp/result.csv --file-id <file_id>. Use the complete payload/status returned by dekart upload-file and wait until the status is completed.
Render the snapshot: dekart snapshot --report-id <report_id> --out /tmp/<report_id>-snapshot.png
Never run dekart init or dekart config on your own. Ask the user to re-run dekart init if needed; on auth errors, ask the user to run dekart init and confirm before retrying.
If the snapshot fails (snapshot timeout), ask the user to enable local snapshots:
dekart snapshot-local install
Then retry the snapshot with dekart snapshot --report-id <report_id>.
Ensure layer.config.dataId points to the report dataset id.
Use report_url from the create-report response when available; fall back to report_path if report_url is missing.

After upload, review the map snapshot and tune the layer. These rules override Claude's default styling instincts. Full reference: references/map-styling.md.
Non-obvious rules:
Points 0.5-4 px, line stroke 0.5-1.5 px, polygon borders 0.5-1 px hairline.
Do not clamp LIMIT below 50k unless cost forces it.
When uncertain about a specific pixel value or palette, read references/map-styling.md.
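The pixel ranges above can be enforced mechanically. A small hedged helper (names are illustrative, not a dekart API) that clamps layer sizes into the stated ranges:

```python
# Ranges taken from the styling rules above: points 0.5-4 px,
# line stroke 0.5-1.5 px, polygon borders 0.5-1 px.
RANGES = {"point": (0.5, 4.0), "line": (0.5, 1.5), "polygon_border": (0.5, 1.0)}


def clamp_size(kind, px):
    """Clamp a requested pixel size into the allowed range for the layer kind."""
    lo, hi = RANGES[kind]
    return min(max(px, lo), hi)


assert clamp_size("point", 10) == 4.0            # oversized points pulled back
assert clamp_size("line", 0.1) == 0.5            # hairline floor
assert clamp_size("polygon_border", 0.8) == 0.8  # in-range values untouched
```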
Use H3 when the user requests spatial aggregation, heatmaps, density, or cell-based rollups.
Namespace by location:
jslibs.h3.* (US)
jslibs.eu_h3.* (EU)

Functions:
jslibs.h3.ST_H3(<point>, <resolution>) - point to cell
jslibs.h3.ST_H3_POLYFILLFROMGEOG(<polygon>, <resolution>) - polygon fill
jslibs.h3.ST_H3_BOUNDARY(<h3_index>) - cell boundary for visualization

Cost rules:
Filter with WHERE + a hardcoded bbox first, then compute H3.
Return h3 + aggregate metrics before adding boundaries.
Run COUNT(*) previews before geometry-heavy ST_H3_BOUNDARY output.
Use resolution 7-9 for city scale.

Error handling:
bq query unavailable or auth fails: return exact fix commands only, no auto-install.
snow sql unavailable or auth fails: ask the user to install/configure snow, then retry using snow sql.
Overture shares missing (SHOW DATABASES LIKE 'OVERTURE_MAPS__%' returns no rows): ask the user to install Overture data from Snowflake Marketplace, then continue.
Always validate cost before executing (dry_run for BigQuery, COUNT(*)/bounded preview for Snowflake).