From clari-pack
Provides reference architecture for Clari revenue intelligence integrations: API export pipelines, Snowflake/BigQuery schemas, Airflow orchestration, analytics, and alerting for forecast platforms.
npx claudepluginhub jeremylongshore/claude-code-plugins-plus-skills --plugin clari-pack
Production architecture for Clari revenue intelligence integrations: export pipeline design, data warehouse schema, analytics layer, and alerting.
Provides production readiness checklist for Clari API integrations: authentication, data export pipelines, warehouse setup, scheduling, monitoring, rollback.
┌──────────────┐      ┌─────────────────┐      ┌──────────────────┐
│  Clari App   │      │  Clari Export   │      │  Data Warehouse  │
│    (SaaS)    │─────▶│    API (v4)     │─────▶│  (Snowflake/BQ)  │
└──────────────┘      └─────────────────┘      └────────┬─────────┘
                                                        │
                      ┌─────────────────┐      ┌────────▼─────────┐
                      │     Change      │      │   Analytics /    │
                      │    Detection    │─────▶│    Dashboard     │
                      └────────┬────────┘      │ (Looker/Metabase)│
                               │               └──────────────────┘
                      ┌────────▼────────┐
                      │     Alerts      │
                      │  (Slack/Email)  │
                      └─────────────────┘
clari-data-platform/
├── src/
│ ├── clari_client.py # API client wrapper
│ ├── export_pipeline.py # ETL pipeline
│ ├── change_detector.py # Forecast change tracking
│ ├── models.py # Data models
│ └── config.py # Environment config
├── dags/
│ └── clari_export_dag.py # Airflow DAG
├── sql/
│ ├── schema.sql # Warehouse table definitions
│ ├── merge.sql # Upsert logic
│ └── analytics/
│ ├── forecast_accuracy.sql
│ ├── pipeline_coverage.sql
│ └── rep_performance.sql
├── tests/
│ ├── fixtures/ # Sample API responses
│ ├── test_pipeline.py
│ └── test_change_detector.py
├── scripts/
│ ├── run_export.sh
│ └── validate_schema.py
└── monitoring/
├── alerts.yaml # Alert rules
└── dashboard.json # Grafana/Looker config
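Because Clari exposes no webhooks, `change_detector.py` works by comparing consecutive snapshots keyed by owner and period. A sketch of that comparison — output fields mirror the `clari_forecast_changes` columns, and the 10% threshold is an illustrative default, not a pack-mandated value:

```python
def detect_changes(previous: dict[tuple, float], current: dict[tuple, float],
                   threshold_pct: float = 10.0) -> list[dict]:
    """Compare two forecast snapshots keyed by (owner_email, time_period).

    Returns one record per key whose forecast moved by at least
    `threshold_pct` percent, shaped like clari_forecast_changes rows.
    """
    changes = []
    for key, curr_amount in current.items():
        prev_amount = previous.get(key)
        if prev_amount is None or prev_amount == 0:
            continue  # no baseline to compare against
        change_pct = (curr_amount - prev_amount) / prev_amount * 100
        if abs(change_pct) >= threshold_pct:
            owner_email, time_period = key
            changes.append({
                "owner_email": owner_email,
                "time_period": time_period,
                "previous_amount": prev_amount,
                "current_amount": curr_amount,
                "change_pct": round(change_pct, 2),
            })
    return changes
```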
-- Core tables
CREATE TABLE clari_forecasts (
id BIGINT GENERATED ALWAYS AS IDENTITY,
owner_name VARCHAR NOT NULL,
owner_email VARCHAR NOT NULL,
forecast_amount DECIMAL(15,2),
quota_amount DECIMAL(15,2),
crm_total DECIMAL(15,2),
crm_closed DECIMAL(15,2),
adjustment_amount DECIMAL(15,2),
time_period VARCHAR NOT NULL,
forecast_name VARCHAR NOT NULL,
exported_at TIMESTAMP NOT NULL,
PRIMARY KEY (owner_email, time_period, forecast_name, exported_at)
);
-- Change tracking
CREATE TABLE clari_forecast_changes (
id BIGINT GENERATED ALWAYS AS IDENTITY,
owner_email VARCHAR NOT NULL,
time_period VARCHAR NOT NULL,
previous_amount DECIMAL(15,2),
current_amount DECIMAL(15,2),
    change_pct DECIMAL(7,2),  -- wide enough for swings above +/-999%
detected_at TIMESTAMP NOT NULL
);
-- Analytics views
CREATE VIEW v_forecast_accuracy AS
SELECT
    f.time_period,
    f.owner_name,
    f.forecast_amount,
    f.crm_closed AS actual_closed,
    ROUND((1 - ABS(f.forecast_amount - f.crm_closed) / NULLIF(f.forecast_amount, 0)) * 100, 1) AS accuracy_pct
FROM clari_forecasts f
WHERE f.exported_at = (
    SELECT MAX(f2.exported_at)
    FROM clari_forecasts f2
    WHERE f2.time_period = f.time_period
);
| Decision | Choice | Rationale |
|---|---|---|
| Export frequency | Daily | Balances freshness vs API load |
| Data format | JSON export | Structured, easy to parse |
| Pipeline orchestration | Airflow | Retry, monitoring, DAG visualization |
| Change detection | Snapshot comparison | Clari has no real-time webhooks |
| Warehouse | Snowflake | SQL analytics, dbt compatibility |
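Airflow supplies task-level retries, but individual export calls can still be wrapped with exponential backoff so a transient API error does not fail the whole task. A generic sketch (not a documented Clari client API; the injectable `sleep` is there so tests run without real delays):

```python
import time

def with_backoff(fn, retries: int = 3, base_delay: float = 1.0,
                 sleep=time.sleep):
    """Call fn(), retrying transient failures with exponential backoff.

    Retries up to `retries` times, sleeping base_delay * 2**attempt
    between attempts; the final failure is re-raised.
    """
    for attempt in range(retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == retries:
                raise
            sleep(base_delay * (2 ** attempt))
```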
This completes the Clari skill pack. Start with clari-install-auth for new integrations.