From jaganpro-sf-skills-7
Manages Salesforce Data Cloud connections: connector discovery, metadata inspection, connection testing, source object/database browsing, and new source setup.
npx claudepluginhub jaganpro/sf-skills

This skill uses the workspace's default tool permissions.
Use this skill when the user needs **source connection work**: connector discovery, connection metadata, connection testing, source-object browsing, connector schema inspection, or connector-specific setup payloads for external sources.
Related skills in the same plugin:
- Orchestrates Salesforce Data Cloud multi-phase pipelines (connect→prepare→harmonize→segment→act), manages data spaces/kits, and troubleshoots cross-phase workflows via sf data360 CLI.
- Creates and manages connections in SAP Datasphere to SAP systems (S/4HANA, BW, ECC), cloud databases (BigQuery, Redshift, Azure SQL), storage, and streaming for views, data flows, replication, federation, ETL.
- Guides Salesforce Data Cloud (2025) integration patterns and architecture: data ingestion from 200+ sources, harmonization, identity resolution, real-time activation, zero-copy querying.
Use sf-datacloud-connect when the work involves:
- `sf data360 connection *` commands

Delegate elsewhere when the user is working outside source-connection scope (pipeline orchestration, SAP Datasphere, or architecture guidance; see the related skills listed above).
Ask for or infer: the connect task, the connector type, and the target org alias.
Run the connect-phase diagnostic:

node ~/.claude/skills/sf-datacloud/scripts/diagnose-org.mjs -o <org> --phase connect --json

Notes:
- Append `2>/dev/null` for standard usage.
- `connection list` requires `--connector-type`.
- For `connection test`, pass `--connector-type` when resolving a non-Salesforce connection by name.
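As a sketch, the diagnostic call above can be wrapped so that empty or malformed JSON output fails loudly instead of being swallowed by `2>/dev/null`. The script path and flags come from this doc; the `diagnose_connect` wrapper itself is illustrative, not part of the skill.

```shell
# Illustrative wrapper around the connect-phase diagnostic. Pipes the
# output through a JSON parser so silent failures (e.g. a wrong org
# alias producing empty or non-JSON output) are surfaced.
diagnose_connect() {
  local org="$1"
  node ~/.claude/skills/sf-datacloud/scripts/diagnose-org.mjs \
      -o "$org" --phase connect --json 2>/dev/null \
    | python3 -m json.tool > /dev/null \
    || echo "connect-phase diagnose failed for org: $org"
}
```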
sf data360 connection connector-list -o <org> 2>/dev/null
sf data360 data-stream list -o <org> 2>/dev/null
sf data360 connection list -o <org> --connector-type SalesforceDotCom 2>/dev/null
sf data360 connection list -o <org> --connector-type REDSHIFT 2>/dev/null
sf data360 connection list -o <org> --connector-type SNOWFLAKE 2>/dev/null
sf data360 connection get -o <org> --name <connection> 2>/dev/null
sf data360 connection objects -o <org> --name <connection> 2>/dev/null
sf data360 connection fields -o <org> --name <connection> 2>/dev/null
sf data360 connection schema-get -o <org> --name <connection-id> 2>/dev/null
sf data360 connection test -o <org> --name <connection> --connector-type <type> 2>/dev/null
sf data360 connection create -o <org> -f connection.json 2>/dev/null
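Because `connection list` is always scoped to one connector type, finding a connection by name alone means probing each type in turn. A minimal sketch: the connector-type list below is illustrative, not exhaustive, and `find_connection` is a hypothetical helper, not a CLI command.

```shell
# Hypothetical helper: probe each known connector type for a connection
# name, since "sf data360 connection list" has no global list-all mode.
# Prints the matching connector type and returns 0, or returns 1 if the
# name is not found under any probed type.
find_connection() {
  local name="$1" org="$2" type
  for type in SalesforceDotCom REDSHIFT SNOWFLAKE; do
    if sf data360 connection list -o "$org" --connector-type "$type" 2>/dev/null \
        | grep -qi -- "$name"; then
      echo "$type"
      return 0
    fi
  done
  return 1
}
```

The printed type can then be fed straight to `connection test --connector-type`.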
Use the phase-owned examples before inventing a payload from scratch:
- examples/connections/heroku-postgres.json
- examples/connections/redshift.json
- examples/connections/sharepoint-unstructured.json
- examples/connections/snowflake-connection.json
- examples/connections/ingest-api-connection.json
- examples/connections/ingest-api-schema.json

Typical Ingestion API setup flow:
sf data360 connection create -o <org> -f examples/connections/ingest-api-connection.json 2>/dev/null
sf data360 connection schema-upsert -o <org> --name <connector-id> -f examples/connections/ingest-api-schema.json 2>/dev/null
sf data360 connection schema-get -o <org> --name <connector-id> 2>/dev/null
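For orientation, the first step's payload can be sketched like this. Only the three credential keys are guaranteed by the notes in this doc; every other key name here (`name`, `connectorType`, the name/value pairing) is an assumption, so treat examples/connections/ingest-api-connection.json as authoritative.

```shell
# Write a hypothetical Ingestion API connection payload and sanity-check
# it before handing it to "sf data360 connection create -f". Only
# clientId/clientSecret/tokenEndpoint in the credentials array are
# confirmed by the doc; the other key names are guesses.
cat > connection.json <<'EOF'
{
  "name": "My_Ingest_API",
  "connectorType": "IngestApi",
  "credentials": [
    { "name": "clientId",      "value": "<client-id>" },
    { "name": "clientSecret",  "value": "<client-secret>" },
    { "name": "tokenEndpoint", "value": "https://login.example.com/oauth/token" }
  ]
}
EOF
# Cheap local check: the CLI rejects malformed JSON anyway, but failing
# fast here is quicker than a round trip to the org.
python3 -m json.tool connection.json > /dev/null && echo "payload is valid JSON"
```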
If no example payload fits your connector, create the connection in the UI, then inspect it directly:
sf api request rest "/services/data/v66.0/ssot/connections/<id>" -o <org>
Gotchas:
- `connection list` has no true global "list all" mode; query by connector type.
- `connection test` may need `--connector-type` for name resolution when the source is not a default Salesforce connector.
- The Ingestion API connection payload carries clientId, clientSecret, and tokenEndpoint in the credentials array and does not require a parameters array.
- Run `connection schema-get` to confirm that `connection schema-upsert` has uploaded the object schema.

Report back with:

Connect task: <inspect / create / test / update>
Connector type: <SalesforceDotCom / REDSHIFT / SNOWFLAKE / SPUnstructuredDocument / IngestApi / ...>
Target org: <alias>
Commands: <key commands run>
Verification: <passed / partial / blocked>
Next step: <prepare phase or connector follow-up>