npx claudepluginhub refractionpoint/lc-ai --plugin lc-advanced-skills

This skill is limited to using the following tools:
A comprehensive, dynamic assistant for LimaCharlie adapter lifecycle management. This skill researches adapter configurations from multiple sources and helps you create, validate, deploy, and troubleshoot adapters for any data source.
Prerequisites: Run `/init-lc` to initialize LimaCharlie context.
All LimaCharlie operations use the limacharlie CLI directly:
limacharlie <noun> <verb> --oid <oid> --output yaml [flags]
For command help and discovery: limacharlie <command> --ai-help
| Rule | Wrong | Right |
|---|---|---|
| CLI Access | Call MCP tools or spawn api-executor | Use Bash("limacharlie ...") directly |
| Output Format | --output json | --output yaml (more token-efficient) |
| Filter Output | Pipe to jq/yq | Use --filter JMESPATH to select fields |
| LCQL Queries | Write query syntax manually | Use limacharlie ai generate-query first |
| Timestamps | Calculate epoch values | Use date +%s or date -d '7 days ago' +%s |
| OID | Use org name | Use UUID (call limacharlie org list if needed) |
Use this skill when:
Common scenarios:
This skill is truly dynamic - it researches adapter configurations from multiple sources in real-time:
The skill then guides you through creating, validating, and deploying adapter configurations with proper parsing rules and credential management.
| Category | Description | Examples |
|---|---|---|
| External Adapter | Cloud-managed syslog, webhook, or API receivers | syslog, webhook, custom API |
| Cloud Sensor | Cloud-to-cloud SaaS integrations | Okta, CrowdStrike, O365, AWS S3, Azure Event Hub |
| On-prem/USP Adapter | Binary deployments with local configuration | file, syslog receiver, Kubernetes pods |
This is the key capability - the skill researches ANY data source dynamically, not just predefined ones.
Search local documentation:
Glob("./docs/limacharlie/doc/Sensors/Adapters/Adapter_Types/*{keyword}*.md")
Check GitHub usp-adapters repository (use API at root - adapters are NOT in a subdirectory):
WebFetch(
url="https://api.github.com/repos/refractionPOINT/usp-adapters/contents",
prompt="List all available adapter directories from the JSON response"
)
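For reference, the same lookup can be sketched in a plain shell session (this assumes curl is available; the grep/sed parsing below is a rough illustration for this sketch, not a full JSON parser):

```shell
# The real call (network access required):
# curl -s https://api.github.com/repos/refractionPOINT/usp-adapters/contents
#
# The API returns a JSON array of entries. Directory names can be pulled out
# with grep/sed; demonstrated here on a trimmed sample response:
sample='[{"name":"okta","type":"dir"},{"name":"syslog","type":"dir"},{"name":"README.md","type":"file"}]'
printf '%s\n' "$sample" \
  | grep -o '"name":"[^"]*"' \
  | sed 's/"name":"\([^"]*\)"/\1/'
# → okta, syslog, README.md (one per line)
```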
If a native adapter exists, first list files in the adapter directory, then fetch the main source:
# Step 1: List files to find the main config file
WebFetch(
url="https://api.github.com/repos/refractionPOINT/usp-adapters/contents/{adapter}",
prompt="List all .go files in this adapter directory"
)
# Step 2: Fetch the config (usually client.go, but some adapters differ - e.g., sentinelone uses s1.go)
WebFetch(
url="https://raw.githubusercontent.com/refractionPOINT/usp-adapters/master/{adapter}/client.go",
prompt="Extract all configuration fields from the Config struct"
)
ALWAYS research the external product to understand its capabilities:
WebSearch("{product name} API documentation")
WebSearch("{product name} webhook integration")
WebSearch("{product name} audit logs export")
Extract:
If NO native adapter exists, determine how to connect:
Then map out:
Use AskUserQuestion to understand the user's needs:
Execute the dynamic research strategy above to gather all relevant information about:
Get organizations:
limacharlie org list --output yaml
List existing External Adapters:
limacharlie external-adapter list --oid <oid> --output yaml
List existing Cloud Sensors:
limacharlie cloud-adapter list --oid <oid> --output yaml
Get existing configuration (if modifying):
limacharlie external-adapter get --key <adapter-name> --oid <oid> --output yaml
# or for cloud sensors:
limacharlie cloud-adapter get --key <sensor-name> --oid <oid> --output yaml
Build the configuration based on research:
client_options:
identity:
oid: "<organization-id>"
installation_key: "<installation-key>" # or use hive://secret/...
platform: "text|json|carbon_black|gcp|..."
sensor_seed_key: "<unique-identifier>"
hostname: "<adapter-hostname>"
mapping:
parsing_grok:
message: '%{PATTERN:field} ...' # For text platform
event_type_path: "field/path"
event_time_path: "timestamp/path"
sensor_hostname_path: "host/path"
Use Hive secrets for sensitive values:
apikey: "hive://secret/okta-api-key"
client_secret: "hive://secret/azure-client-secret"
Check existing secrets:
limacharlie secret list --oid <oid> --output yaml
indexing:
- events_included: ["*"]
path: "src_ip"
index_type: "ip"
- events_included: ["*"]
path: "user/email"
index_type: "user"
Supported index types: file_hash, file_path, file_name, domain, ip, user, service_name, package_name
Validate parsing rules before deployment:
# Write adapter config with mapping and sample data to a YAML file:
cat > /tmp/usp-test.yaml << 'EOF'
parsing_grok:
message: "%{TIMESTAMP_ISO8601:timestamp} %{WORD:action} ..."
event_type_path: action
event_time_path: timestamp
sample_data:
- "2024-01-15T10:30:00Z LOGIN user@example.com"
EOF
limacharlie usp validate \
--platform text \
--input-file /tmp/usp-test.yaml \
--oid <oid> --output yaml
Review validation results:
For complex parsing, invoke the parsing-helper skill:
Skill("parsing-helper")
For local testing before production deployment:
Skill("test-limacharlie-adapter")
Deploy External Adapter:
# Write configuration to a temp file first
cat > /tmp/adapter-config.yaml << 'EOF'
<full-configuration-yaml>
EOF
limacharlie external-adapter set --key <adapter-name> --input-file /tmp/adapter-config.yaml --oid <oid> --output yaml
Deploy Cloud Adapter:
cat > /tmp/cloud-adapter-config.yaml << 'EOF'
<full-configuration-yaml>
EOF
limacharlie cloud-adapter set --key <sensor-name> --input-file /tmp/cloud-adapter-config.yaml --oid <oid> --output yaml
For On-prem Adapters, generate deployment artifacts:
YAML Configuration:
syslog:
client_options:
identity:
installation_key: "<IID>"
oid: "<OID>"
platform: text
sensor_seed_key: "<unique-key>"
hostname: "<hostname>"
mapping:
parsing_grok:
message: '%{PATTERN:field} ...'
event_type_path: "..."
port: 514
is_udp: true
CLI Command:
./lc_adapter syslog \
client_options.identity.installation_key=<IID> \
client_options.identity.oid=<OID> \
client_options.platform=text \
client_options.sensor_seed_key=<key> \
"client_options.mapping.parsing_grok.message=%{PATTERN:field} ..." \
port=514 \
is_udp=true
Docker Command:
docker run -d --rm -p 514:514/udp refractionpoint/lc-adapter syslog \
client_options.identity.installation_key=<IID> \
client_options.identity.oid=<OID> \
client_options.platform=text \
client_options.sensor_seed_key=<key> \
port=514 \
is_udp=true
Verify deployment:
limacharlie sensor list --selector "iid == \`<installation-key-iid>\`" --oid <oid> --output yaml
Troubleshoot if data is not appearing:
limacharlie external-adapter get --key <adapter-name> --oid <oid> --output yaml
limacharlie org errors --oid <oid> --output yaml
Look for errors with component names containing the adapter name.
Verify sensor exists in sensor list
Query for recent events:
# First calculate timestamps dynamically
start=$(date -d '1 hour ago' +%s) && end=$(date +%s)
limacharlie event list --sid <sensor-id> --start $start --end $end --oid <oid> --output yaml
Check whether events are parsed correctly (watch for event_type: "unknown_event" with only a text field).
Offer D&R rule creation:
Skill("detection-engineering")
For auditing adapters across multiple organizations, spawn parallel sub-agents:
Task(
subagent_type="lc-essentials:multi-org-adapter-auditor",
prompt="Audit adapters for organization: {org_name} ({oid})
Return:
- List of external adapters with status
- List of cloud sensors with status
- Any adapters with errors
- Configuration issues detected"
)
| Adapter Type | Platform | Deployment | Key Configuration |
|---|---|---|---|
| syslog | text | External/On-prem | port, is_udp, parsing_grok |
| webhook | json | Cloud Sensor | secret, client_options |
| okta | json | Cloud Sensor | apikey, url |
| s3 | varies | Cloud Sensor/On-prem | bucket_name, access_key, secret_key, prefix |
| azure_event_hub | varies | Cloud Sensor/On-prem | connection_string |
| office365 | json | Cloud Sensor | domain, tenant_id, publisher_id, client_id, client_secret, endpoint |
| falconcloud | json | Cloud Sensor/On-prem | client_id, client_secret |
| pubsub | varies | Cloud Sensor/On-prem | project_id, subscription_id, service_account_creds |
| file | varies | On-prem | file_path, backfill |
IMPORTANT: Always check the usp-adapters repo for authoritative field definitions:
https://github.com/refractionPOINT/usp-adapters
Each adapter directory has a main source file (usually client.go) with a *Config struct defining all valid fields.
Every adapter config wraps client_options containing identity, platform, and sensor_seed_key.
User: "I want to ingest Okta system logs into LimaCharlie"
Workflow:
Research Phase:
./docs/limacharlie/doc/Sensors/Adapters/Adapter_Types/adapter-types-okta.md
Configuration:
sensor_type: "okta"
okta:
apikey: "hive://secret/okta-api-key"
url: "https://your-company.okta.com"
client_options:
identity:
oid: "<oid>"
installation_key: "<iid>"
hostname: "okta-system-logs"
platform: "json"
sensor_seed_key: "okta-logs-sensor"
mapping:
sensor_hostname_path: "client/device"
event_type_path: "eventType"
event_time_path: "published"
External Setup Instructions:
limacharlie secret set okta-api-key --oid <oid>
Deploy: limacharlie cloud-adapter set (as in the Deploy Cloud Adapter step above)
User: "Set up a syslog adapter for our firewall. Sample log: <134>Nov 15 12:30:45 fw01 ACCEPT TCP 192.168.1.100:54321 10.0.0.5:443"
Workflow:
Analyze log format: Priority, syslog timestamp, hostname, action, protocol, src:port dst:port
Generate Grok pattern:
parsing_grok:
message: '<%{INT:priority}>%{SYSLOGTIMESTAMP:timestamp} %{HOSTNAME:host} %{WORD:action} %{WORD:protocol} %{IP:src_ip}:%{NUMBER:src_port} %{IP:dst_ip}:%{NUMBER:dst_port}'
Validate with limacharlie usp validate using the sample log
Deploy as External Adapter:
cat > /tmp/firewall-adapter.yaml << 'EOF'
adapter_type: syslog
port: 514
is_udp: true
client_options:
identity:
oid: "<oid>"
installation_key: "<iid>"
platform: text
sensor_seed_key: firewall-logs
hostname: firewall-syslog
mapping:
parsing_grok:
message: "<%{INT:priority}>%{SYSLOGTIMESTAMP:timestamp} %{HOSTNAME:host} %{WORD:action} %{WORD:protocol} %{IP:src_ip}:%{NUMBER:src_port} %{IP:dst_ip}:%{NUMBER:dst_port}"
event_type_path: action
event_time_path: timestamp
sensor_hostname_path: host
EOF
limacharlie external-adapter set --key firewall-syslog --input-file /tmp/firewall-adapter.yaml --oid <oid> --output yaml
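Since Grok patterns compile down to regular expressions, the pattern's overall shape can be roughly sanity-checked offline before running limacharlie usp validate. The ERE below is a hand-written approximation of the Grok pattern above for this sketch, not output from any LimaCharlie tooling:

```shell
# Rough offline check: does the sample log have the shape the Grok pattern expects?
sample='<134>Nov 15 12:30:45 fw01 ACCEPT TCP 192.168.1.100:54321 10.0.0.5:443'
# Approximate ERE equivalent of:
# <%{INT}>%{SYSLOGTIMESTAMP} %{HOSTNAME} %{WORD} %{WORD} %{IP}:%{NUMBER} %{IP}:%{NUMBER}
re='^<[0-9]+>[A-Z][a-z]{2} +[0-9]+ [0-9:]+ [^ ]+ [A-Z]+ [A-Z]+ [0-9.]+:[0-9]+ [0-9.]+:[0-9]+$'
if printf '%s' "$sample" | grep -Eq "$re"; then
  echo "pattern shape matches"
else
  echo "pattern shape does NOT match"
fi
# → pattern shape matches
```

This only confirms the log's structure; field extraction and timestamp parsing still need the real validation step.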
User: "I want to connect this monitoring tool that sends webhooks, I don't know if LimaCharlie supports it"
Workflow:
Ask clarifying questions:
Research the product:
WebSearch("{product name} webhook documentation")
WebSearch("{product name} webhook payload format")
Read LimaCharlie webhook adapter docs:
Read("./docs/limacharlie/doc/Sensors/Adapters/Adapter_Types/adapter-types-webhook.md")
Build integration plan:
Deploy and provide webhook URL to user
User: "My Azure Event Hub adapter isn't receiving data"
Workflow:
Get current configuration:
limacharlie cloud-adapter get --key azure-event-hub --oid <oid> --output yaml
Check for errors in last_error field of sys_mtd
Common issues:
Research Azure-side setup:
WebSearch("Azure Event Hub diagnostic settings configuration")
Provide fix guidance based on error analysis
Installation Key vs IID: For adapters, use the IID (UUID format like e9a3bcdf-efa2-47ae-b6df-579a02f3a54d), not the full base64 installation key
DATESTAMP vs TIMESTAMP_ISO8601:
YYYY-MM-DD HH:MM:SS → Use %{TIMESTAMP_ISO8601}
MM-DD-YY HH:MM:SS → Use %{DATESTAMP}
Jan 15 12:30:45 → Use %{SYSLOGTIMESTAMP}
Grok field name: Always use message as the key in parsing_grok for text platform:
parsing_grok:
message: '%{PATTERN:field}' # Correct
Azure Event Hub connection string: Must include EntityPath=<event-hub-name> for consumer applications
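The expected shape is sketched below with placeholder values; the EntityPath segment is the requirement called out above:

```yaml
connection_string: "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy-name>;SharedAccessKey=<key>;EntityPath=<event-hub-name>"
```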
Credential storage: Always use hive://secret/<secret-name> for sensitive values in production
Unparsed events: If you see event_type: "unknown_event" with only a text field, parsing is not configured
Timestamp calculations: Always use Bash to calculate epoch timestamps dynamically:
start=$(date -d '1 hour ago' +%s) && end=$(date +%s)
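A few common lookback windows computed the same way (GNU date syntax; on macOS/BSD the equivalent is date -v-1H +%s, date -v-7d +%s, etc.):

```shell
# Epoch ranges for common lookback windows (GNU date).
now=$(date +%s)
hour_ago=$(date -d '1 hour ago' +%s)
week_ago=$(date -d '7 days ago' +%s)
echo "last hour: --start $hour_ago --end $now"
echo "last week: --start $week_ago --end $now"
```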
parsing-helper - For complex Grok pattern generation and validation
test-limacharlie-adapter - For local testing before production deployment
detection-engineering - For creating D&R rules on adapter data
For more details:
./docs/limacharlie/doc/Sensors/Adapters/adapter-usage.md
./docs/limacharlie/doc/Sensors/Adapters/adapter-deployment.md
./docs/limacharlie/doc/Sensors/Adapters/Adapter_Types/