This skill should be used when the user mentions "UCO", "use case", or "use case object" in any context — including requests to find, list, count, filter, update, add, or ask questions about UCOs. Trigger on phrases like "show my UCOs", "how many UCOs", "what UCOs does this account have", "update a UCO", "create a UCO", "UCO status", "UCO health", "stale UCOs", "UCOs for [account/AE/SA]", or any question about use case lifecycle stages.
Install via: npx claudepluginhub randypitcherii/rpw-agent-marketplace --plugin rpw-working
This skill uses the workspace's default tool permissions.
Manage Salesforce Use Case Objects (UCOs) for Databricks Field Engineering. Covers querying, filtering, updating, and creating UCOs via the `sf` CLI.
This skill requires Salesforce authentication (see the salesforce-authentication skill) to be available in the user's environment. For bulk analysis, see the databricks-query skill. Before any Salesforce operation, verify authentication using the salesforce-authentication skill:
sf org display 2>/dev/null | grep "Connected Status"
If not connected, run: sf org login web --instance-url=https://<your-instance>.salesforce.com/
Always pass -o <username> to all sf commands. Get the username from:
sf org display --json | jq -r '.result.username'
Store it and reuse: SF_USER=$(sf org display --json | jq -r '.result.username')
Determine the current user's Salesforce identity and role to drive correct account filtering:
# Get user ID and role
sf data query -o "$SF_USER" --query "SELECT Id, Name, Title, UserRole.Name FROM User WHERE Username = '$SF_USER'" --json
Role inference rules: use Title and UserRole.Name to decide whether the user is an SA or an AE, then apply the matching account filter below.
Store the user ID: MY_ID=<Id from query>
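The lookup above can be captured into shell variables for the filters that follow (a sketch; MY_ROLE is a hypothetical variable name, and jq is assumed, as elsewhere in this skill):

```shell
# Run the identity query once and extract Id and role
QUERY="SELECT Id, Name, Title, UserRole.Name FROM User WHERE Username = '$SF_USER'"
ME=$(sf data query -o "$SF_USER" --query "$QUERY" --json)
MY_ID=$(echo "$ME" | jq -r '.result.records[0].Id')
MY_ROLE=$(echo "$ME" | jq -r '.result.records[0].UserRole.Name')
```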
The default scope is always all open UCOs (U2–U5) on the user's accounts, regardless of who the UCO is assigned to. Account transitions happen frequently at Databricks — if you're on an account, those UCOs are your responsibility even if assigned to someone else.
For SAs — accounts where you are the primary SA:
Account__r.Last_SA_Engaged__c = '<MY_ID>'
For AEs — accounts you own:
Account__r.OwnerId = '<MY_ID>'
When presenting results, say "here are your active UCOs" — don't expose the technical filter details unless the user asks.
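With the role resolved, the right default filter can be picked automatically (a sketch; matching SA roles on the substring "Solution" is an assumption about role naming in your org, and MY_ROLE/MY_ID come from the identity query above):

```shell
# Select the account filter by role (assumption: SA role names contain "Solution")
case "$MY_ROLE" in
  *Solution*) FILTER="Account__r.Last_SA_Engaged__c = '$MY_ID'" ;;  # primary SA on the account
  *)          FILTER="Account__r.OwnerId = '$MY_ID'" ;;             # AE-owned account
esac
sf data query -o "$SF_USER" --query "SELECT Id, Name, Stages__c, Account__r.Name FROM UseCase__c WHERE $FILTER AND Stages__c IN ('U2', 'U3', 'U4', 'U5') ORDER BY Account__r.Name, Stages__c"
```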
Query gotchas:
- Use <> instead of != (zsh escapes != as \!=, breaking queries)
- Always pass -o <username> (no default org is set)
- To filter across account fields in bulk, use a semi-join: Account__c IN (SELECT Id FROM Account WHERE ...)

Example: all open UCOs on accounts where you are the primary SA:
sf data query -o "$SF_USER" --query "SELECT Id, Name, Stages__c, Account__r.Name, SAOwner__r.Name, Implementation_Status__c FROM UseCase__c WHERE Account__r.Last_SA_Engaged__c = '$MY_ID' AND Stages__c IN ('U2', 'U3', 'U4', 'U5') ORDER BY Account__r.Name, Stages__c"
# First get AE's user ID
sf data query -o "$SF_USER" --query "SELECT Id FROM User WHERE Name = 'Alex Example'" --json | jq -r '.result.records[0].Id'
# Then filter accounts by AE owner
sf data query -o "$SF_USER" --query "SELECT Id, Name, Stages__c, Account__r.Name, SAOwner__r.Name, Implementation_Status__c FROM UseCase__c WHERE Account__r.Last_SA_Engaged__c = '$MY_ID' AND Account__r.OwnerId = '<AE_ID>' AND Stages__c IN ('U2', 'U3', 'U4', 'U5') ORDER BY Account__r.Name, Stages__c"
sf data query -o "$SF_USER" --query "SELECT Id, Name, Stages__c, Implementation_Status__c, SAOwner__r.Name FROM UseCase__c WHERE Account__r.Name LIKE '%Acme Corp%' AND Stages__c IN ('U2', 'U3', 'U4', 'U5')"
sf data query -o "$SF_USER" --query "SELECT Id, Name, Stages__c, Account__r.Name, SAOwner__r.Name FROM UseCase__c WHERE Name LIKE '%<SEARCH_TERM>%' AND Stages__c IN ('U2', 'U3', 'U4', 'U5')"
sf data query -o "$SF_USER" --query "SELECT Id, Name, Stages__c, Account__r.Name, SAOwner__r.Name, Implementation_Status__c, Demand_Plan_Next_Steps__c, Full_Production_Date__c, Implementation_Start_Date__c, Implementation_Notes__c, UseCaseInPlan__c, DSA__r.Name FROM UseCase__c WHERE Id = '<UCO_ID>'"
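For readability, the JSON output of a detail query can be flattened into columns (a sketch; assumes jq and column are installed, and uses a subset of the fields from the query above):

```shell
# Flatten selected fields into a tab-separated table
sf data query -o "$SF_USER" --query "SELECT Name, Stages__c, Implementation_Status__c FROM UseCase__c WHERE Id = '<UCO_ID>'" --json \
  | jq -r '.result.records[] | [.Name, .Stages__c, .Implementation_Status__c] | @tsv' \
  | column -t -s "$(printf '\t')"
```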
Always read the current state before updating next steps (to preserve existing content).
sf data update record -o "$SF_USER" --sobject UseCase__c --record-id <UCO_ID> \
--values "Implementation_Status__c=Green"
# Valid: Green, Yellow, Red
sf data update record -o "$SF_USER" --sobject UseCase__c --record-id <UCO_ID> \
--values "Stages__c=U3"
# Valid: U1, U2, U3, U4, U5, U6, Lost, Disqualified
# 1. Read current next steps
CURRENT=$(sf data query -o "$SF_USER" --query "SELECT Demand_Plan_Next_Steps__c FROM UseCase__c WHERE Id = '<UCO_ID>'" --json | jq -r '.result.records[0].Demand_Plan_Next_Steps__c // ""')
# 2. Prepend new entry (newest first, format: Mon-DD - INITIALS - update text)
# Note: bash keeps \n literal inside double quotes; build the value with printf to get a real newline
NEW=$(printf 'Feb-21 - XX - <UPDATE>\n%s' "$CURRENT")
sf data update record -o "$SF_USER" --sobject UseCase__c --record-id <UCO_ID> \
  --values "Demand_Plan_Next_Steps__c='$NEW'"
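The read-then-prepend sequence can be wrapped in a reusable helper that also stamps today's date (a sketch; prepend_next_steps is a hypothetical name, and it assumes jq and the Mon-DD date format used above):

```shell
# Prepend a dated entry to a UCO's next steps, preserving existing content
prepend_next_steps() {
  uco_id=$1; initials=$2; update=$3
  current=$(sf data query -o "$SF_USER" --query "SELECT Demand_Plan_Next_Steps__c FROM UseCase__c WHERE Id = '$uco_id'" --json \
    | jq -r '.result.records[0].Demand_Plan_Next_Steps__c // ""')
  new=$(printf '%s - %s - %s\n%s' "$(date +%b-%d)" "$initials" "$update" "$current")
  sf data update record -o "$SF_USER" --sobject UseCase__c --record-id "$uco_id" \
    --values "Demand_Plan_Next_Steps__c='$new'"
}
```

Usage: prepend_next_steps <UCO_ID> XX 'Kickoff scheduled'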
sf data update record -o "$SF_USER" --sobject UseCase__c --record-id <UCO_ID> \
--values "Implementation_Start_Date__c=2026-03-01 Full_Production_Date__c=2026-06-01"
sf data update record -o "$SF_USER" --sobject UseCase__c --record-id <UCO_ID> \
  --values "Implementation_Status__c=Green Full_Production_Date__c=2026-06-01 Demand_Plan_Next_Steps__c='Feb-21 - XX - <UPDATE>\n<EXISTING>'"
# Note: bash keeps \n literal inside double quotes; build the next-steps value with printf to get a real newline
sf data create record -o "$SF_USER" --sobject UseCase__c \
--values "Name='<UCO Name>' Account__c=<ACCOUNT_ID> Stages__c=U2 Implementation_Status__c=Green"
To find an account ID:
sf data query -o "$SF_USER" --query "SELECT Id, Name FROM Account WHERE Name LIKE '%<ACCOUNT_NAME>%'" --json
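Lookup and creation can be chained in one pass (a sketch; "Acme Corp" and the UCO name are placeholders, and jq is assumed):

```shell
# Resolve the account ID, then create the UCO against it
ACCOUNT_ID=$(sf data query -o "$SF_USER" \
  --query "SELECT Id FROM Account WHERE Name LIKE '%Acme Corp%'" --json \
  | jq -r '.result.records[0].Id')
sf data create record -o "$SF_USER" --sobject UseCase__c \
  --values "Name='Acme streaming PoC' Account__c=$ACCOUNT_ID Stages__c=U2 Implementation_Status__c=Green"
```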
references/uco-fields.md — Full field reference, legacy vs active fields, stage definitions, next steps format
references/soql-patterns.md — Common query patterns, bulk filtering, historical analysis via Logfood