Connect to and inspect data sources. Use this skill when you need to verify data access, inspect table schemas, check row counts, or understand the structure of a dataset before performing analysis.
Connect to data sources to inspect schemas, verify access, and check row counts before analysis. Triggers when you need to understand dataset structure or validate that data is readable.
Install via the plugin marketplace:

```
/plugin marketplace add argythana/python-ml-skills
/plugin install argythana-python-ml-skills@argythana/python-ml-skills
```

This skill inherits all available tools. When active, it can use any tool Claude has access to.
Files:

- `pyproject.toml`
- `src/skill_data_connector/__init__.py`
- `src/skill_data_connector/connect.py`

Connect to data sources and retrieve basic information about datasets.
## data-connect - Inspect Data Source

Connects to a data source and returns schema and summary information.
```bash
# Basic usage (outputs to stdout)
data-connect --source <path>

# Save to file
data-connect --source <path> --output report.md
```
Arguments:
- `--source` (required): Path to a data file or connection string
- `--output`: Output file path (default: stdout)
- `--type`: Override source type detection (`parquet`, `csv`, `json`)

The script produces a markdown report with:
```markdown
# Data Connection Report

- **source**: data/sales.parquet
- **type**: parquet
- **row_count**: 1,234,567
- **column_count**: 15
- **file_size**: 45.2 MB

## Columns

| Column | Type |
|--------|------|
| id | INTEGER |
| date | DATE |
| amount | DOUBLE |
| category | VARCHAR |
```
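For reference, here is a minimal sketch of how values like these could be gathered for a Parquet source, assuming DuckDB is available. The `inspect_parquet` helper is hypothetical and the actual `connect.py` may be implemented differently:

```python
# Hypothetical sketch, not the actual connect.py implementation.
import os
import duckdb

def inspect_parquet(path: str) -> dict:
    """Collect row count, column count, file size, and column types for a Parquet file."""
    con = duckdb.connect()  # in-memory connection
    # DESCRIBE returns rows of (column_name, column_type, ...) with DuckDB
    # type names such as INTEGER, DATE, DOUBLE, VARCHAR.
    columns = con.execute(f"DESCRIBE SELECT * FROM read_parquet('{path}')").fetchall()
    row_count = con.execute(f"SELECT COUNT(*) FROM read_parquet('{path}')").fetchone()[0]
    return {
        "source": path,
        "type": "parquet",
        "row_count": row_count,
        "column_count": len(columns),
        "file_size_mb": round(os.path.getsize(path) / 1024**2, 1),
        "columns": [(name, dtype) for name, dtype, *_ in columns],
    }

info = inspect_parquet("data/sales.parquet")  # path taken from the example report above
print(f"{info['row_count']:,} rows, {info['column_count']} columns")
```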
The connector auto-detects source type from file extension:
- `.parquet` - Apache Parquet files
- `.csv` - CSV files (auto-detects delimiter)
- `.json`, `.jsonl` - JSON files
- `.db`, `.duckdb` - DuckDB database files
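A minimal sketch of this kind of extension-based detection (the `detect_source_type` helper and its mapping are illustrative; the real `connect.py` may differ):

```python
# Hypothetical sketch of extension-based source type detection.
from pathlib import Path

EXTENSION_MAP = {
    ".parquet": "parquet",
    ".csv": "csv",
    ".json": "json",
    ".jsonl": "json",
    ".db": "duckdb",
    ".duckdb": "duckdb",
}

def detect_source_type(source: str, override: str | None = None) -> str:
    """Return the source type, honoring an explicit --type override when given."""
    if override:
        return override
    suffix = Path(source).suffix.lower()
    if suffix not in EXTENSION_MAP:
        raise ValueError(f"Unsupported source type for {source!r}")
    return EXTENSION_MAP[suffix]

print(detect_source_type("data/sales.parquet"))          # parquet
print(detect_source_type("exports/events.txt", "csv"))   # csv (explicit override)
```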