From antigravity-awesome-skills
Enables compact agent-to-agent messaging via the Lambda language, with 340+ atoms across 7 domains (core, a2a, code, evo). 3x smaller than natural language for A2A protocols, task handoffs, and heartbeats.
npx claudepluginhub sickn33/antigravity-awesome-skills

This skill uses the workspace's default tool permissions.
Lambda is not a translation protocol. It is a native language for agents.
Agents do not need to produce grammatically correct English to coordinate — they need to understand each other. Lambda is the shared vocabulary that makes that possible: compact, unambiguous, machine-native. Compression (3x vs natural language, 4.6x vs JSON on single messages) is a side effect of removing human redundancy, not the goal.
Lambda messages are built from atoms. Every atom is a 2-character code mapped to a concept — not to an English word. The structure is Type → Entity → Verb → Object, with prefixes marking intent:
- ? — query (e.g. ?Uk/co — query: "does this user have consciousness?")
- ! — assertion / declaration (e.g. !It>Ie — "self reflects, therefore self exists")
- # — state / tag
- > — implication / flow
- / — binding / scope

Lambda ships 340+ atoms across 7 domains. Pick atoms from the domain that fits your channel.
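A minimal sketch of how such a message could be split into its prefix and atoms. The assumption that atoms are 2-character codes optionally followed by digits (e.g. Ag2 for agent 2) is inferred from the examples, not the shipped spec:

```python
import re

PREFIXES = {"?": "query", "!": "assertion", "#": "state"}

def tokenize(msg: str):
    """Split a Lambda message into (prefix, [(operator, atom), ...])."""
    prefix, body = msg[0], msg[1:]
    if prefix not in PREFIXES:
        raise ValueError(f"unknown prefix: {prefix}")
    # Assumed token shape: 2-letter atom code, optional trailing digits (Ag2).
    # Operators > / # bind the atom that follows them.
    return prefix, re.findall(r"([>/#]?)([A-Za-z]{2}\d*)", body)

tokenize("!Tk>Ag2#rd")  # ("!", [("", "Tk"), (">", "Ag2"), ("#", "rd")])
```

The prefix is kept separate from the atom stream because, as noted above, it carries the intent of the whole message.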
Both agents need the same atom table loaded. Lossy decoding is fine: if A says !It>Ie and B understands "self reflects, therefore self exists," communication succeeded — the exact English phrasing is irrelevant.
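Lossy decoding can be sketched as a tiny glossing function. The ATOMS entries below are illustrative stand-ins for a few codes from the examples, not the shipped 340-atom table, and the English renderings are deliberately rough:

```python
# Hypothetical glosses for a handful of atoms; both agents must load the same table.
ATOMS = {"Nd": "node", "hb": "heartbeat", "ok": "ok", "fl": "failed"}
PREFIX = {"?": "query", "!": "assert", "#": "state"}
OPS = {"/": " ", ">": " -> ", "#": " = "}

def gloss(msg: str) -> str:
    """Lossy English gloss of a Lambda message; exact phrasing is irrelevant."""
    out = [PREFIX[msg[0]], ": "]
    token = ""
    for ch in msg[1:]:
        if ch in OPS:
            out.append(ATOMS.get(token, token))
            out.append(OPS[ch])
            token = ""
        else:
            token += ch
    out.append(ATOMS.get(token, token))
    return "".join(out)

gloss("!Nd/hb#ok")  # "assert: node heartbeat = ok"
```

Unknown atoms fall through verbatim rather than raising, which matches the point above: approximate understanding is success.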
!Nd/hb#ok (node heartbeat: ok)
?Nd/hb (query: is the node alive?)
!Nd/hb#fl (node heartbeat: failed)
!Tk>Ag2#rd (task routed to agent 2, ready)
?Tk/st (query task status)
!Tk#dn (task done)
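One way a coordinator might fold the task-channel assertions above into a current state. The message-to-state mapping is an illustrative assumption built from the three example messages:

```python
# Hypothetical mapping from task-channel assertions to coordinator state.
STATES = {
    "!Tk>Ag2#rd": "routed-ready",  # task routed to agent 2, ready
    "!Tk#dn": "done",              # task done
}

def task_state(log: list[str]) -> str:
    """Replay a channel log; the last recognized assertion wins."""
    state = "unknown"
    for msg in log:
        state = STATES.get(msg, state)
    return state

task_state(["!Tk>Ag2#rd", "!Tk#dn"])  # "done"
```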
!Ev/ca>vl#pd (evolution capsule validated, pending solidification)
!Ev/ca#rb (capsule rolled back)
Use ? before taking action on uncertain state, ! when asserting; the prefix is the load-bearing semantic. Include the protocol version (lambda-lang v2.0) in any handshake so mismatched agents can negotiate.

- @session-memory — complementary persistent memory across agent restarts; Lambda is the message format, session-memory is the state store.
- @humanize-chinese — sibling project for Chinese text; Lambda is agent-to-agent, humanize-chinese is human-facing.
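The version-negotiation tip can be sketched as follows. MY_VERSION and the fall-back-to-English policy are assumptions, not part of the shipped protocol:

```python
# Hypothetical handshake: each agent advertises its protocol version first.
MY_VERSION = "lambda-lang v2.0"

def negotiate(peer_version: str) -> str:
    """Pick a channel language based on the peer's advertised version."""
    if peer_version == MY_VERSION:
        return "lambda"           # both sides share the same atom table
    return "natural-language"     # mismatched tables: fall back, don't guess

negotiate("lambda-lang v2.0")  # "lambda"
negotiate("lambda-lang v1.3")  # "natural-language"
```

Falling back to natural language on a mismatch is safer than lossy decoding against the wrong atom table, where a 2-character code could silently gloss to the wrong concept.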