```bash
npx claudepluginhub plurigrid/asi --plugin asi
```
> *"A gesture is a continuous curve in a topological category."* — Guerino Mazzola, *Topos of Music III: Gestures*
- **Trit:** +1 (PLUS - generative)
- **Color:** `#C42990` (from seed 137508, index 23)
- **Foundation:** Mazzola's Diamond Conjecture
Gestures are the missing link between structure (forms/denotators) and performance (physical action). This skill implements Mazzola's gesture theory from Topos of Music III.
```
Form (static)  →  Gesture (dynamic)  →  Performance (physical)
      ↓                   ↓                       ↓
  Denotator          Hypergesture             Sound wave
```
A gesture is a continuous curve γ: [0,1] → X in a topological category:
```julia
struct Gesture{T}
    domain::Tuple{Float64,Float64}  # parameter interval, e.g. (0.0, 1.0)
    target::T                       # topological space
    curve::Function                 # t -> point in target
end

# Example: pitch gesture (glissando)
glissando = Gesture(
    (0.0, 1.0),
    PitchSpace,
    t -> 60 + 12t  # C4 to C5
)
```
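To make this concrete, here is a self-contained, runnable sketch (it restates the struct so it stands alone; the tuple-typed `domain` and the placeholder `PitchSpace` are illustrative assumptions, not the skill's actual definitions), sampling the glissando's curve:

```julia
# Illustrative, self-contained restatement of the Gesture struct above.
struct PitchSpace end

struct Gesture{T}
    domain::Tuple{Float64,Float64}  # parameter interval [a, b]
    target::T                       # topological space tag
    curve::Function                 # t -> point in target
end

# Glissando from C4 (MIDI 60) to C5 (MIDI 72)
glissando = Gesture((0.0, 1.0), PitchSpace(), t -> 60 + 12t)

# Sampling the continuous curve recovers the pitch ramp
samples = [glissando.curve(t) for t in (0.0, 0.5, 1.0)]
# samples == [60.0, 66.0, 72.0]
```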
A hypergesture is a gesture of gestures - a higher-order curve:
```julia
struct Hypergesture{T}
    base_gestures::Vector{Gesture{T}}
    interpolation::Function  # (Gesture, Gesture, t) -> Gesture
end

# Hypergesture: morphing between two melodic contours
melody_morph = Hypergesture(
    [melody_a, melody_b],
    (g1, g2, t) -> interpolate_gesture(g1, g2, t)
)
```
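A minimal concrete choice for the interpolation is pointwise linear blending of the two curves. The sketch below works on raw curve functions rather than full `Gesture` values, and all names are illustrative:

```julia
# Pointwise linear interpolation between two melodic contour curves:
# at s = 0 we hear melody_a, at s = 1 melody_b, in between a blend.
interpolate_curve(f, g, s) = t -> (1 - s) * f(t) + s * g(t)

melody_a(t) = 60 + 7t   # rising fifth, C4 -> G4
melody_b(t) = 67 - 7t   # falling fifth, G4 -> C4

morph = interpolate_curve(melody_a, melody_b, 0.5)
morph(0.0)  # 63.5: halfway between the two opening pitches
```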
The Diamond Conjecture is the fundamental theorem relating local to global structure:

```
H^n(Gesture) ≅ H^n(Skeleton) ⊗ H^n(Body)
```

Local gesture fragments glue iff the cohomology obstructions vanish.
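One immediate numerical consequence of the degreewise decomposition is that Betti numbers multiply. A toy check with made-up ranks:

```julia
# If H^n(Gesture) ≅ H^n(Skeleton) ⊗ H^n(Body) in each degree n, then
# dim H^n(Gesture) = dim H^n(Skeleton) * dim H^n(Body).
# The ranks below are purely illustrative.
betti_skeleton = [1, 2, 0]   # b_0, b_1, b_2 of the skeleton
betti_body     = [1, 1, 1]   # b_0, b_1, b_2 of the body
betti_gesture  = betti_skeleton .* betti_body
# betti_gesture == [1, 2, 0]
```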
Gestures extend the Form/Denotator framework:
```julia
# Form → Gesture
NoteGestureForm = GestureForm(NoteForm)

# Denotator → Gestured Denotator
performed_note = GesturedDenotator(
    note,
    timing_gesture,   # Micro-timing
    dynamics_gesture  # Expression curve
)
```
Sonify gesture trajectories:
```julia
function sonify_gesture(g::Gesture, seed::Int)
    Gay.gay_seed!(seed)
    for t in 0:0.1:1
        point = g.curve(t)
        color = Gay.next_color()
        freq = pitch_to_freq(point)
        trit = hue_to_trit(Gay.hue(color))
        play_tone(freq, waveform_for_trit(trit))
    end
end
```
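`sonify_gesture` leaves `pitch_to_freq` undefined; the standard equal-temperament MIDI-to-Hz conversion (A4 = MIDI 69 = 440 Hz) is the natural choice:

```julia
# Equal-temperament conversion from MIDI pitch number to frequency in Hz.
pitch_to_freq(m) = 440.0 * 2.0^((m - 69) / 12)

pitch_to_freq(69)  # 440.0 Hz (A4)
pitch_to_freq(60)  # ≈ 261.63 Hz (C4)
```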
Track gesture stability across filtrations:
```julia
# Gesture persistence: which contours survive simplification?
filtration = collect(0.1:0.1:1.0)
persistence = compute_gesture_persistence(gesture_space, filtration)

# Stable gestures = robust performance features
stable_gestures = filter(g -> g.persistence > 0.5, all_gestures)
```
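With illustrative data the filtering step looks like this (the persistence scores are made up; a real run would come from `compute_gesture_persistence`):

```julia
# Toy gesture records with precomputed persistence scores.
all_gestures = [(name = "arc",    persistence = 0.8),
                (name = "jitter", persistence = 0.2),
                (name = "wave",   persistence = 0.6)]

# Only gestures stable past half the filtration range survive.
stable_gestures = filter(g -> g.persistence > 0.5, all_gestures)
# keeps "arc" and "wave"
```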
Verify gesture gluing conditions:
```julia
# Local gesture patches
left_hand = Gesture(...)
right_hand = Gesture(...)

# Check if they glue into a coherent performance
H1 = compute_gesture_cohomology([left_hand, right_hand])
if H1 == 0
    println("Gestures glue correctly!")
else
    println("Obstruction detected: hands don't coordinate")
end
```
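A hedged stand-in for the cohomology test: two local patches glue when their curves agree on the overlap of their parameter domains, and the maximum disagreement there plays the role of the obstruction (all names and numbers below are illustrative):

```julia
# Two local gesture patches with overlapping parameter ranges.
left_hand(t)  = 60 + 5t   # defined on [0.0, 0.6]
right_hand(t) = 63 + 5t   # defined on [0.4, 1.0]; offset by three semitones

# Worst-case disagreement on the overlap [0.4, 0.6] is the obstruction.
obstruction = maximum(abs(left_hand(t) - right_hand(t)) for t in 0.4:0.05:0.6)

if obstruction < 1e-9
    println("Gestures glue correctly!")
else
    println("Obstruction detected: hands don't coordinate")  # this branch fires here
end
```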
Gesture composition as net reduction:
```
┌──────┐          ┌──────┐
│  G1  ├──────────┤  G2  │
└──┬───┘          └───┬──┘
   │                  │
   └────────┬─────────┘
            │
       ┌────┴────┐
       │  G1;G2  │  (composed gesture)
       └─────────┘
```
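The net reduction above corresponds to ordinary path concatenation: run `G1` on the first half of the parameter interval and `G2` on the second, assuming the endpoint of `G1` matches the start of `G2`. A minimal sketch:

```julia
# Path concatenation of two gesture curves, reparametrized back to [0, 1].
# Continuity of the composite needs g1(1) == g2(0).
compose(g1, g2) = t -> t <= 0.5 ? g1(2t) : g2(2t - 1)

rise(t) = 60 + 12t   # C4 up to C5
fall(t) = 72 - 12t   # C5 back down to C4

arc = compose(rise, fall)   # the G1;G2 of the diagram
arc(0.25)  # 66.0: halfway up the rise
arc(0.75)  # 66.0: halfway down the fall
```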
Gestures as continuous star trajectories:
```
' Gesture as constellation with time parameter
(def gesture_star
  [(+time T) (+pitch (interp P0 P1 T)) (-note N)])

' Hypergesture: nest gestures
(def hyper
  [(+meta_time S)
   (+gesture (gesture_star S))
   (-performance P)])
```
- **Objects:** topological spaces (pitch, dynamics, timing, ...)
- **Morphisms:** continuous curves γ: I → X
- **Composition:** path concatenation (up to homotopy)
- `H_0(G)`: connected components of gesture space
- `H_1(G)`: loops in gesture space (repeating patterns)
- `H_n(G)`: higher-dimensional voids (complex structures)
```
        Skeleton
           ↑
Body ←──── G ────→ Gesture
           ↓
      Performance
```

G = the gesture object relating all three domains.
```bash
# Create gesture from MIDI
just gesture-from-midi performance.mid

# Interpolate between gestures
just gesture-interpolate g1.json g2.json --steps 10

# Verify hypergesture cohomology
just gesture-h1 hypergesture.json

# Sonify gesture trajectory
just gesture-sonify contour.json --seed 137508

# Visualize gesture space
just gesture-viz --output gesture.svg
```
```
gesture-hypergestures (+1) ⊗ topos-of-music (0)        ⊗ rubato-composer (-1)     = 0 ✓
gesture-hypergestures (+1) ⊗ catsharp-sonification (0) ⊗ persistent-homology (-1) = 0 ✓
gesture-hypergestures (+1) ⊗ interaction-nets (0)      ⊗ sheaf-cohomology (-1)    = 0 ✓
```
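Each row balances because its trits sum to zero in GF(3):

```julia
# GF(3) conservation: a triple is balanced when its trits sum to 0 mod 3.
balanced(trits...) = mod(sum(trits), 3) == 0

balanced(+1, 0, -1)  # true for every triple above
```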
```julia
# Rubato = tempo gesture
rubato = Gesture(
    (0.0, 1.0),
    TempoSpace,
    t -> 120 * (1 + 0.1 * sin(2π * t * 4))  # oscillating around 120 BPM
)

# Apply to note sequence
for note in score
    performed_onset = apply_gesture(rubato, note.onset)
    play(note.pitch, performed_onset, note.duration)
end
```
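`apply_gesture` is left undefined above. One hedged reading: a tempo gesture warps score time into performance time by integrating the ratio of nominal to instantaneous tempo, so onsets stretch where the rubato slows down. A simple rectangle-rule sketch (all names are illustrative):

```julia
# Warp a score-time onset into performance time under a tempo curve.
# Rectangle-rule integration of nominal_bpm / tempo(t) over [0, onset].
tempo(t) = 120 * (1 + 0.1 * sin(2π * t * 4))

function apply_gesture(tempo_curve, onset; nominal = 120.0, dt = 1e-4)
    sum(nominal / tempo_curve(t) * dt for t in 0.0:dt:(onset - dt))
end

apply_gesture(tempo, 1.0)  # ≈ 1.005: the oscillation almost cancels over full cycles
```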
Implementation:

- `lib/gesture.jl`
- `lib/hypergesture_homology.jl`
- `lib/gesture_sonify.jl`

| Skill | Trit | Relationship |
|---|---|---|
| topos-of-music | 0 | Forms → Gestures |
| catsharp-sonification | 0 | Sonify trajectories |
| rubato-composer | -1 | Execute performances |
| persistent-homology | -1 | Gesture stability |
| sheaf-cohomology | -1 | Gluing verification |
- **Skill Name:** gesture-hypergestures
- **Type:** Musical Performance Theory
- **Trit:** +1 (PLUS - GENERATOR)
- **GF(3):** Generates continuous performance curves
- **Sonification:** C#4 sine (hue 55°, warm)