From ai-brain-starter
Ingests recent Slack channel messages into the vault as daily markdown files in External Inputs/Slack. Auto-creates Decision Log stubs for keywords like incident, outage, and pricing. Invoke via /ingest-slack <channel> [--days N] or a natural-language ingest request.
Install: `npx claudepluginhub adelaidasofia/ai-brain-starter`

This skill uses the workspace's default tool permissions.
Ingests recent Slack messages into the vault as markdown the graphify pipeline can read and the rest of the AI Brain Starter substrate (decision log, session-close cascade, hooks) can act on.
This is the first connector in a pattern. Adding the next external source (Notion, Jira, email) means writing a new normalizer, not a new architecture.
Use for:
- /ingest-slack <channel-name> (with or without --days N)

Do NOT use for:
- sending messages to Slack (use slack_send_message directly)
- reading a single thread ad hoc (use slack_read_thread)

Slack MCP tools used:
- slack_search_channels
- slack_read_channel
- slack_read_thread

Outputs:
- Vault file: External Inputs/Slack/<channel-name>/<YYYY-MM-DD>.md (## per parent message, threads as ### sub-sections)
- Decision stubs: ⚙️ Meta/Decisions/<YYYY-MM-DD>-slack-<channel>-<sha8(message_ts)>.md

The skill is a thin orchestrator. The actual ingestion runs in Python at ~/.claude/skills/ingest-slack/ingest.py. The skill assembles the Slack MCP tool calls, hands the raw payloads to ingest.py, and the script does the normalization, file write, and edge-case scan.
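The hand-off from orchestrator to script can be sketched as below. The key names in the payload are hypothetical; the real schema is whatever ingest.py expects on stdin.

```python
import time

def build_payload(channel_id, channel_name, messages, days=7, now=None):
    """Drop messages older than the N-day window and shape the stdin payload."""
    now = time.time() if now is None else now
    oldest = now - days * 86400  # Unix timestamp for now - N days
    recent = [m for m in messages if float(m["ts"]) >= oldest]
    return {
        "channel": channel_name,   # hypothetical key names; the real
        "channel_id": channel_id,  # schema is defined by ingest.py
        "days": days,
        "messages": recent,
    }
```

The skill would then serialize this with json.dumps and pipe it to ingest.py's stdin via a subprocess call.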
When invoked:
1. Parse arguments: the channel name and --days N (optional, default 7).
2. Call slack_search_channels with the channel name. If zero results, report "channel not found" and stop. If multiple, ask the user to disambiguate.
3. Call slack_read_channel with the resolved channel ID and a limit of 100. Compute oldest as the Unix timestamp for now - N days.
4. For each parent message with reply_count > 0, call slack_read_thread with the parent ts.
5. Hand the raw payloads to ingest.py as JSON on stdin.
6. ingest.py writes the vault file, the decision stubs, and prints a summary.

The vault file at External Inputs/Slack/<channel>/<YYYY-MM-DD>.md has frontmatter:
```yaml
---
type: external-input
source: slack
channel: <channel-name>
channel_id: <Cxxxx>
date_range: <YYYY-MM-DD>..<YYYY-MM-DD>
message_count: <int>
ingested_at: <ISO 8601 timestamp>
---
```
Body is chronological. Each parent message is a ## YYYY-MM-DD HH:MM <author> section with the message body as its content. Thread replies become ### YYYY-MM-DD HH:MM <author> sub-sections.
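The body layout above can be sketched as follows. The message dict shape (ts, author, text, replies) is an assumption for illustration; the real field names come from the normalized Slack payload.

```python
from datetime import datetime, timezone

def fmt_heading(level, ts, author):
    """Format a ## or ### heading as 'YYYY-MM-DD HH:MM <author>' (UTC assumed)."""
    stamp = datetime.fromtimestamp(float(ts), tz=timezone.utc).strftime("%Y-%m-%d %H:%M")
    return f"{level} {stamp} {author}"

def render_body(parents):
    """Render parents chronologically as ## sections, replies as ### sub-sections."""
    lines = []
    for msg in sorted(parents, key=lambda m: float(m["ts"])):
        lines.append(fmt_heading("##", msg["ts"], msg["author"]))
        lines.append(msg["text"])
        for reply in msg.get("replies", []):
            lines.append(fmt_heading("###", reply["ts"], reply["author"]))
            lines.append(reply["text"])
    return "\n\n".join(lines)
```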
Decision Log stubs created at ⚙️ Meta/Decisions/<date>-slack-<channel>-<sha8>.md carry frontmatter that matches the existing decision schema so the aggregator picks them up cleanly.
Re-running /ingest-slack <channel> --days N on the same calendar day overwrites the same vault file. The decision stub filenames hash the Slack message timestamp, so the same source message produces the same stub filename across re-runs. Re-runs do not duplicate; they refresh.
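The stable stub filename can be sketched as below, assuming sha8 means the first eight hex digits of SHA-256 over the raw Slack ts string; the actual hash ingest.py uses may differ.

```python
import hashlib

def stub_path(date, channel, message_ts):
    """Same Slack message ts -> same stub filename, so re-runs never duplicate."""
    sha8 = hashlib.sha256(message_ts.encode()).hexdigest()[:8]
    return f"⚙️ Meta/Decisions/{date}-slack-{channel}-{sha8}.md"
```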
A successful run produces:
- External Inputs/Slack/<channel>/<date>.md
- ⚙️ Meta/Decisions/<date>-slack-<channel>-<sha8>.md (one per detected edge case)
- a summary line: Wrote N messages to <path>. Detected K edge cases. Stubs at: <paths>.

If the channel resolves but contains no messages in the date range, write the file anyway with message_count: 0 so re-runs are still idempotent and the absence is recorded.
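The edge-case scan and summary line can be sketched as follows. The keyword list comes from the skill description; the matching logic in ingest.py may be richer than a plain substring check.

```python
KEYWORDS = ("incident", "outage", "pricing")  # keywords from the skill description

def scan_edge_cases(messages):
    """Return messages whose text mentions a Decision Log keyword."""
    return [m for m in messages if any(k in m["text"].lower() for k in KEYWORDS)]

def summary(n_messages, path, stub_paths):
    """Build the one-line run summary printed by ingest.py."""
    return (f"Wrote {n_messages} messages to {path}. "
            f"Detected {len(stub_paths)} edge cases. "
            f"Stubs at: {', '.join(stub_paths) or 'none'}.")
```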