By sun-lab-nbb
Provides microcontroller discovery, extraction configuration, log processing, and pipeline orchestration skills for ataraxis-communication-interface. Includes MCP bindings for hardware discovery, manifest management, and batch data processing.
npx claudepluginhub sun-lab-nbb/ataraxis --plugin communication

Complete reference for ExtractionConfig parameters, generation from manifest, validation, and lifecycle. Covers the full extraction configuration data model, MCP tools for reading, writing, and validating configs, event code semantics, and config lifecycle workflow. Use when creating, reading, writing, or validating extraction configurations for the log processing pipeline.
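The real ExtractionConfig data model is documented by the skill itself; the field names below are hypothetical, and the sketch only illustrates the create / write / read / validate lifecycle the skill covers:

```python
import json
from dataclasses import dataclass, asdict


# Hypothetical fields -- consult the skill for the real ExtractionConfig
# parameters; this only illustrates the config lifecycle workflow.
@dataclass
class ExtractionConfig:
    source_id: int    # DataLogger source the archive came from
    module_type: int  # hardware module family to extract
    event_codes: list  # event codes to keep during extraction

    def validate(self) -> None:
        if self.source_id < 0:
            raise ValueError("source_id must be non-negative")
        if not self.event_codes:
            raise ValueError("at least one event code is required")


def write_config(cfg: ExtractionConfig, path: str) -> None:
    cfg.validate()
    with open(path, "w") as f:
        json.dump(asdict(cfg), f, indent=2)


def read_config(path: str) -> ExtractionConfig:
    with open(path) as f:
        cfg = ExtractionConfig(**json.load(f))
    cfg.validate()  # validate on read as well as on write
    return cfg
```

Validating on both write and read means a config hand-edited on disk still gets checked before the pipeline consumes it.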
Documents the input data format required by the log processing pipeline: NPZ log archives produced by DataLogger, source ID semantics, microcontroller manifest system, archive internal message layout, and communication protocol. Use when the user asks about log archive format, source IDs, DataLogger output, or why processing fails due to missing or malformed archives.
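Assuming DataLogger archives are standard NumPy `.npz` containers (the internal message layout itself is the skill's subject), a quick first inspection of an archive might look like:

```python
import numpy as np


def inspect_archive(path):
    # List the arrays stored in an .npz log archive and their shapes --
    # a first sanity check before diagnosing missing or malformed archives.
    with np.load(path, allow_pickle=False) as archive:
        return {name: archive[name].shape for name in archive.files}
```

If `np.load` raises here, the file is likely not a valid NPZ archive at all, which points at the "malformed archive" failure mode the skill describes.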
Complete reference for log processing output data formats, feather file discovery, output verification, event distribution analysis, and interpretation guidance. Use when evaluating log processing results, or when the user asks about extracted event data, timing statistics, or microcontroller data quality.
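Feather file discovery itself needs nothing beyond the standard library; assuming the pipeline writes one `.feather` file per extracted stream (the exact output layout is what the skill documents), discovery reduces to a recursive glob:

```python
from pathlib import Path


def discover_feather_outputs(root: str) -> list:
    # Recursively collect .feather outputs, sorted for deterministic iteration.
    return sorted(Path(root).rglob("*.feather"))
```

Reading the discovered files would then go through a real feather reader such as `pandas.read_feather` or `pyarrow.feather.read_table`.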
Orchestrates batch log processing via the ataraxis-communication-interface MCP server: archive discovery, batch preparation, job execution, progress monitoring, cancellation, and error recovery. Use when processing microcontroller log archives, extracting hardware module and kernel data, or managing batch processing jobs.
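The MCP server manages jobs itself, but the batch loop it implements follows a common pattern; a minimal stand-alone sketch with concurrent workers, a progress callback, and per-archive error recovery could be:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed


def run_batch(archives, process_one, max_workers=4, on_progress=None):
    """Process archives concurrently; record per-archive errors instead of aborting."""
    results, errors = {}, {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(process_one, a): a for a in archives}
        for done, future in enumerate(as_completed(futures), start=1):
            archive = futures[future]
            try:
                results[archive] = future.result()
            except Exception as exc:  # error recovery: keep going, report later
                errors[archive] = exc
            if on_progress:
                on_progress(done, len(archives))
    return results, errors
```

Collecting failures rather than raising on the first one mirrors the skill's error-recovery emphasis: a single malformed archive should not sink the whole batch.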
Diagnoses and resolves ataraxis-communication-interface MCP server connectivity issues. Covers environment verification, command availability, Python version checks, dependency validation, and conda/pip/uv environment configuration. Use when MCP tools are unavailable, when the server fails to start, when the user reports connection issues, or when starting a session that requires MCP tools.
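The version and dependency checks the skill performs can be sketched with the standard library alone; the minimum version and package list below are hypothetical placeholders, not the project's real requirements:

```python
import importlib.util
import sys


def check_environment(min_version=(3, 10), packages=("numpy",)):
    # Hypothetical requirements -- read the project's own metadata for the
    # real minimum Python version and dependency list.
    problems = []
    if sys.version_info < min_version:
        want = ".".join(map(str, min_version))
        problems.append(f"Python {sys.version.split()[0]} is older than {want}")
    for pkg in packages:
        if importlib.util.find_spec(pkg) is None:
            problems.append(f"missing package: {pkg}")
    return problems
```

An empty return means the interpreter that would launch the MCP server at least satisfies the stated requirements; a non-empty one points at the conda/pip/uv environment to fix.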
Guides creation and configuration of MicroControllerInterface, ModuleInterface, and MQTTCommunication instances for microcontroller communication. Covers MicroControllerInterface initialization and lifecycle, ModuleInterface subclassing with command and parameter sending, MQTTCommunication setup, system ID allocation, and DataLogger integration. Use when writing code that creates MicroControllerInterface or MQTTCommunication instances, or when you need to understand the AXCI API.
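The shape of a ModuleInterface subclass can be illustrated structurally with a stand-in base class; every name and signature below is hypothetical, and the real AXCI API documented by this skill may differ:

```python
from abc import ABC, abstractmethod


# Stand-in for the real ModuleInterface base class -- the actual AXCI
# constructor arguments and hook names may differ; consult the skill.
class ModuleInterface(ABC):
    def __init__(self, module_type: int, module_id: int) -> None:
        self.module_type = module_type  # hardware module family code
        self.module_id = module_id      # unique instance within that family

    @abstractmethod
    def process_received_data(self, message) -> None:
        """Handle a data message sent by the microcontroller module."""


class ValveInterface(ModuleInterface):
    """Hypothetical subclass for a single valve module."""

    def __init__(self) -> None:
        super().__init__(module_type=1, module_id=1)
        self.last_state = None

    def process_received_data(self, message) -> None:
        self.last_state = message  # cache the most recent valve state
```

The pattern to take away is the split of responsibilities: the base class owns identity (type and ID codes), while each subclass implements module-specific message handling.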
Guides use of ataraxis-communication-interface MCP tools for microcontroller discovery, MQTT broker verification, manifest management, log archive assembly, and recording discovery. Use when discovering connected microcontrollers, testing MQTT connectivity, managing manifests, or assembling log archives.
End-to-end orchestration guide for the ataraxis-communication-interface data acquisition and analysis pipeline. Covers canonical phase ordering with handoff conditions, multi-controller planning with DataLogger topology, and decision trees for hardware, configuration, and processing setup. Use when planning a full data collection workflow, setting up multi-controller systems, or deciding between MCP and code.