```bash
npx claudepluginhub ruvnet/ruview --plugin ruview
```
What RuView can sense, and how to run each one. Assumes you have either the Docker demo (simulated CSI) or a live ESP32 sink (see `ruview-quickstart` / `ruview-hardware-setup`).
| Application | What it does | Entry point |
|---|---|---|
| Presence / occupancy | Detect people through walls, count them, track entries/exits (trained model + PIR fusion, ~0.012 ms latency) | sensing-server live mode; examples/environment/ |
| Vital signs | Breathing 6–30 BPM (bandpass 0.1–0.5 Hz), heart rate 40–120 BPM (bandpass 0.8–2.0 Hz), contactless while sleeping/sitting | wifi-densepose-vitals crate (ADR-021); examples/medical/ |
| Activity recognition | Walking, sitting, gestures, falls — from temporal CSI patterns | RuvSense gesture.rs (DTW), pose_tracker.rs; scripts/gait-analyzer.js |
| Pose estimation | 17 COCO keypoints via WiFlow architecture; dual-modal webcam+WiFi fusion demo | cargo run -p wifi-densepose-sensing-server + pose-fusion demo (ADR-059); see ruview-model-training to train |
| Sleep monitoring | Overnight monitoring, sleep-stage classification, apnea screening | examples/sleep/; scripts/apnea-detector.js |
| Environment mapping | RF fingerprinting identifies rooms, detects moved furniture, spots new objects | sensing-server --build-index env; RuvSense field_model.rs, cross_room.rs |
| Mass Casualty Assessment (MAT) | Disaster survivor detection — find people in rubble/smoke | wifi-densepose-mat crate; docs/wifi-mat-user-guide.md; examples/medical/ |
| 3D point cloud (optional fusion) | Camera depth (MiDaS) + WiFi CSI + mmWave radar → unified spatial model (~22 ms, 19K+ pts/frame) | scripts/mmwave_fusion_bridge.py; ADR-094 (GitHub Pages deploy) |
| Novel RF apps | Passive radar, material classification, device fingerprinting, mincut person-counting | scripts/passive-radar.js, material-classifier.js, device-fingerprint.js, mincut-person-counter.js (ADR-077/078) |
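The vital-signs row above works by bandpass-filtering the CSI amplitude time series in the breathing band (0.1–0.5 Hz, i.e. 6–30 BPM) and picking the dominant spectral peak. A minimal sketch of that idea — not the actual `wifi-densepose-vitals` code; the sample rate, filter order, and synthetic signal are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 20.0  # assumed CSI sample rate in Hz (real deployments vary)

def breathing_bpm(csi_amplitude, fs=FS):
    """Estimate breathing rate from a CSI amplitude time series.

    Bandpass 0.1-0.5 Hz (6-30 BPM), then take the dominant FFT peak.
    """
    b, a = butter(4, [0.1, 0.5], btype="band", fs=fs)
    filtered = filtfilt(b, a, csi_amplitude - np.mean(csi_amplitude))
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.5)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0  # Hz -> breaths per minute

# Synthetic demo: 0.25 Hz breathing (15 BPM) buried in noise
t = np.arange(0, 60, 1.0 / FS)
sig = 1.0 + 0.1 * np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(len(t))
print(round(breathing_bpm(sig)))  # ~15
```

The same bandpass-then-peak pattern applies to heart rate with the 0.8–2.0 Hz band (40–120 BPM); the real pipeline additionally has to suppress body-motion artifacts that swamp the cardiac signal.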
```bash
# Docker demo — everything, simulated CSI
docker run -p 3000:3000 ruvnet/wifi-densepose:latest   # http://localhost:3000

# Live sensing server (consumes ESP32 UDP CSI)
cd v2 && cargo run -p wifi-densepose-sensing-server

# Live RF room scan (Cognitum Seed on :5006)
node scripts/rf-scan.js --port 5006
node scripts/snn-csi-processor.js --port 5006

# Embed a trained model + build an environment index
cd v2
cargo run -p wifi-densepose-sensing-server -- --model model.rvf --embed
cargo run -p wifi-densepose-sensing-server -- --model model.rvf --build-index env

# Python live demo
python examples/ruview_live.py

# Spectrogram / graph visualisers
node scripts/csi-spectrogram.js
node scripts/csi-graph-visualizer.js
```
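The live server consumes ESP32 CSI over UDP. If you want to inspect that stream yourself before pointing the server at it, a minimal listener looks roughly like this — the port number and the frame layout (little-endian float32 subcarrier amplitudes) are illustrative assumptions, not the real ESP32 sink format:

```python
import socket
import struct

PORT = 5566  # assumed; use whatever port your ESP32 sink streams to

def parse_frame(payload):
    """Interpret a UDP payload as little-endian float32 subcarrier amplitudes.

    The real ESP32 CSI frame layout differs; this is an illustrative stand-in.
    """
    n = len(payload) // 4
    return struct.unpack(f"<{n}f", payload[: n * 4])

def listen(port=PORT, max_frames=10):
    """Print a summary line for each of the first max_frames CSI packets."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    for _ in range(max_frames):
        payload, addr = sock.recvfrom(4096)
        amps = parse_frame(payload)
        print(f"{addr[0]}: {len(amps)} subcarriers, mean amp "
              f"{sum(amps) / len(amps):.2f}")

# listen()  # blocks until max_frames packets arrive
```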
- Model training: see `ruview-model-training`.
- MAT (disaster response): see `docs/wifi-mat-user-guide.md`.
- Multistatic / multi-node sensing: see `ruview-advanced-sensing`, optionally + Cognitum Seed.
- Examples: `examples/environment/` · `examples/medical/` · `examples/sleep/` · `examples/stress/` · `examples/happiness-vector/` · `examples/ruview_live.py` — each has a README.
- `README.md` — feature matrix, latency/throughput numbers
- `docs/user-guide.md`, `docs/wifi-mat-user-guide.md`
- `v2/crates/wifi-densepose-signal/src/ruvsense/` (14 modules)