From ritual-dapp-skills

Provides Data Availability storage patterns for Ritual dApps using StorageRef for off-chain providers such as GCS, HuggingFace, and Pinata. Covers credentials, error handling, and DKMS encryption for precompiles such as multimodal, FHE, LLM, and agents.

npx claudepluginhub ritual-foundation/ritual-dapp-skills --plugin ritual-dapp-skills
Several Ritual precompiles read from or write to external storage during execution. The chain itself does not store large blobs — generated images, conversation history, FHE ciphertexts, and agent artifacts all live off-chain on storage providers (GCS, HuggingFace, Pinata). The executor handles all storage I/O inside the TEE.
This skill covers the shared DA patterns. For precompile-specific ABI fields and usage, see the corresponding skill (ritual-dapp-multimodal, ritual-dapp-llm, ritual-dapp-agents).
The shared patterns are:
- StorageRef everywhere DA appears: a (platform, path, keyRef) tuple.
- encryptedSecrets for credential delivery; keyRef links a DA field to its credential.
- hasError on result decode paths (callbacks and settled async outputs).

| Precompile | Address | DA Fields | Read/Write | Skill |
|---|---|---|---|---|
| Image Call | 0x0818 | outputStorageRef (field 17) | Write (upload generated image) | ritual-dapp-multimodal |
| Audio Call | 0x0819 | outputStorageRef (field 17) | Write (upload generated audio) | ritual-dapp-multimodal |
| Video Call | 0x081A | outputStorageRef (field 17) | Write (upload generated video) | ritual-dapp-multimodal |
| FHE | 0x0807 | inputStorageRef (field 17), outputStorageRef (field 18) | Read + Write (ciphertext I/O) | ritual-dapp-precompiles |
| LLM Call | 0x0802 | convoHistory (field 29) | Read + Write (conversation JSONL, plaintext) | ritual-dapp-llm |
| Sovereign Agent | 0x080C | convoHistory (14), output (15), skills[] (16), systemPrompt (17) | Read + Write (DKMS-encrypted) | ritual-dapp-agents |
| Persistent Agent | 0x0820 | daConfig (15), soulRef (16), agentsRef (17), userRef (18), memoryRef (19), identityRef (20), toolsRef (21), openclawConfigRef (22) | Read + Write (DKMS-encrypted) | ritual-dapp-agents |
Every DA field uses the same type: a StorageRef tuple of three strings.
// ABI type: (string, string, string)
type StorageRef = [
  string, // platform — storage provider identifier
  string, // path — platform-specific object path
  string, // keyRef — placeholder name in encryptedSecrets for credentials
];
How keyRef works: keyRef points to the credential value inside encryptedSecrets. For example, if keyRef = 'GCS_CREDS', then encryptedSecrets should include a GCS_CREDS entry with GCS credentials in the expected format.
Empty ref: ['', '', ''] means "no storage reference" — the executor skips this field.
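The tuple and empty-ref conventions above can be captured in a couple of small helpers. This is an illustrative sketch only — the `isEmptyRef` and `makeRef` names are my own, not part of any Ritual SDK:

```typescript
// Hypothetical helpers for the StorageRef convention described above.
type StorageRef = [platform: string, path: string, keyRef: string];

// ['', '', ''] means "no storage reference" — the executor skips the field.
const EMPTY_REF: StorageRef = ['', '', ''];

function isEmptyRef(ref: StorageRef): boolean {
  return ref.every((part) => part === '');
}

function makeRef(platform: string, path: string, keyRef = ''): StorageRef {
  return [platform, path, keyRef];
}
```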
import { encodeAbiParameters, parseAbiParameters } from 'viem';

// StorageRef is encoded as a tuple of three strings
const storageRef = ['gcs', 'outputs/image.png', 'GCS_CREDS'] as const;

// In the full precompile ABI, use the tuple type:
// { type: 'tuple', components: [
//   { name: 'platform', type: 'string' },
//   { name: 'path', type: 'string' },
//   { name: 'keyRef', type: 'string' },
// ]}
// Or with the parseAbiParameters shorthand:
const encoded = encodeAbiParameters(
  parseAbiParameters('(string, string, string)'),
  [storageRef],
);
// StorageRef value passed as one tuple field in a larger precompile payload
// Struct definition (declare at file or contract level)
struct StorageRef {
    string platform;
    string path;
    string keyRef;
}

// StorageRef value passed as one tuple field in a larger precompile payload
StorageRef memory outRef = StorageRef({
    platform: "gcs",
    path: "outputs/image.png",
    keyRef: "GCS_CREDS"
});

// Example: include outRef in your full abi.encode payload
bytes memory input = abi.encode(
    // ... other fields ...
    outRef
);
| Platform | platform value | path format | keyRef value | Auth Required |
|---|---|---|---|---|
| Google Cloud Storage | 'gcs' | Object path within bucket (bucket is in credential JSON) | Key name in encryptedSecrets holding {service_account_json, bucket} JSON | Yes |
| HuggingFace | 'hf' | org/repo/path/to/file (slash-separated, first two segments are repo ID) | Key name holding HF access token string | Yes |
| Pinata (IPFS) | 'pinata' | CID (empty for first upload, CID for subsequent) | Key name holding {jwt, gateway_url} JSON | Yes |
The path is an object key within the GCS bucket. The bucket name is specified inside the credential JSON (the bucket field), not in the StorageRef path.
// GCS StorageRef — object path only, bucket is in credentials
const gcsRef: StorageRef = ['gcs', 'multimodal/outputs/image_001.png', 'GCS_CREDS'];
// The GCS_CREDS value in encryptedSecrets must be a JSON string with two fields:
// '{"service_account_json":"{\"type\":\"service_account\",...}","bucket":"my-bucket"}'
//
// service_account_json: the full GCP service account key file JSON (stringified)
// bucket: the GCS bucket name
Slash-separated: the first two segments are the HF repo ID, the rest is the file path within the repo.
// org/repo/path/to/file — first two segments = repo ID, rest = file path
const hfRef: StorageRef = ['hf', 'my-org/agent-data/configs/SOUL.md', 'HF_TOKEN'];
// repo ID = "my-org/agent-data", file path = "configs/SOUL.md"
// Repo-root ref (no file subpath) — use for listing/downloading entire repo
const hfRepoRef: StorageRef = ['hf', 'my-org/agent-configs', 'HF_TOKEN'];
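The "first two segments are the repo ID" rule can be expressed as a tiny parser. A sketch with an assumed helper name (`splitHfPath` is not an official API):

```typescript
// Split an HF StorageRef path into repo ID and file path, per the rule
// "first two segments = repo ID, rest = file path within the repo".
function splitHfPath(path: string): { repoId: string; filePath: string } {
  const segments = path.split('/');
  if (segments.length < 2) throw new Error(`invalid HF path: ${path}`);
  return {
    repoId: segments.slice(0, 2).join('/'), // e.g. "my-org/agent-data"
    filePath: segments.slice(2).join('/'),  // "" for repo-root refs
  };
}
```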
Pinata is append-only (content-addressed). Each upload returns a new CID. For the first upload, pass an empty path. For subsequent uploads, pass the CID from the previous response.
// First call — empty path, CID returned after upload
const pinataFirst: StorageRef = ['pinata', '', 'DA_PINATA_JWT'];
// Subsequent call — pass CID from previous response
const pinataNext: StorageRef = ['pinata', 'QmXyzAbc123...', 'DA_PINATA_JWT'];
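The append-only flow above — empty path on first upload, previous CID thereafter — can be wrapped in a small helper. Hypothetical sketch; `nextPinataRef` is not a Ritual API:

```typescript
type StorageRef = [platform: string, path: string, keyRef: string];

// Build the Pinata ref for the next call: empty path if nothing has been
// uploaded yet, otherwise the CID returned by the previous upload.
function nextPinataRef(prevCid: string | null, keyRef = 'DA_PINATA_JWT'): StorageRef {
  return ['pinata', prevCid ?? '', keyRef];
}
```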
Content is embedded directly in the path field. No storage I/O occurs — the executor reads the path string as the content. Read-only (never uploaded to).
const inlineRef: StorageRef = ['inline', 'You are a helpful DeFi agent.', ''];
Storage credentials are delivered inside encryptedSecrets, encrypted with the executor's ECIES public key. The keyRef field in each StorageRef tells the executor which key in the decrypted secrets map holds the credential.
| Platform | Credential Format | Example encryptedSecrets JSON |
|---|---|---|
| GCS | JSON object with service_account_json and bucket | {"GCS_CREDS": "{\"service_account_json\":\"{\\\"type\\\":\\\"service_account\\\",...}\",\"bucket\":\"my-bucket\"}"} |
| HuggingFace | HF access token string (plain, not JSON) | {"HF_TOKEN": "hf_abc123..."} |
| Pinata | JSON object with jwt and gateway_url | {"PINATA_CREDS": "{\"jwt\":\"eyJ...\",\"gateway_url\":\"https://my-gateway.mypinata.cloud/ipfs\"}"} |
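The table's formats are easy to get wrong because the GCS value is JSON whose service_account_json field is itself a stringified JSON document (double encoding). A sketch of building the plaintext secrets map before encryption — helper names are illustrative, not from any SDK:

```typescript
// GCS: JSON wrapper with a stringified service-account key inside.
function gcsCredential(serviceAccountJson: string, bucket: string): string {
  return JSON.stringify({ service_account_json: serviceAccountJson, bucket });
}

// Pinata: JSON wrapper with jwt and gateway_url.
function pinataCredential(jwt: string, gatewayUrl: string): string {
  return JSON.stringify({ jwt, gateway_url: gatewayUrl });
}

// HF is just the raw token string — no JSON wrapper.
const secrets = {
  GCS_CREDS: gcsCredential('{"type":"service_account"}', 'my-bucket'),
  HF_TOKEN: 'hf_abc123',
  PINATA_CREDS: pinataCredential('eyJ...', 'https://my-gateway.mypinata.cloud/ipfs'),
};
```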
import { encrypt, ECIES_CONFIG } from 'eciesjs';
import { encodeAbiParameters, hexToBytes, bytesToHex } from 'viem';
// MANDATORY: 12-byte nonce for Ritual ECIES
ECIES_CONFIG.symmetricNonceLength = 12;
// Combine storage credentials with any other secrets (e.g., LLM API key)
const gcsServiceAccountJson = process.env.GCS_DA_SERVICE_ACCOUNT_JSON!;
const gcsBucket = process.env.GCS_DA_BUCKET!;
const secretsJson = JSON.stringify({
ANTHROPIC_API_KEY: 'sk-ant-...',
GCS_CREDS: JSON.stringify({
service_account_json: gcsServiceAccountJson,
bucket: gcsBucket,
}),
});
// Encrypt to executor's public key (from TEEServiceRegistry)
const encryptedBuffer = encrypt(executorPublicKey.slice(2), Buffer.from(secretsJson));
const encryptedSecrets = [`0x${encryptedBuffer.toString('hex')}` as `0x${string}`];
ECIES nonce length = 12. Both eciesjs (TypeScript) and eciespy (Python) default to 16-byte nonces; the executor uses 12. A mismatched nonce length causes silent decryption failure — the transaction mines but DA operations fail with no error message. Set ECIES_CONFIG.symmetricNonceLength = 12 (TypeScript) or ECIES_CONFIG.symmetric_nonce_length = 12 (Python) before any encrypt() call.
import { createPublicClient, http, defineChain } from 'viem';
const ritualChain = defineChain({
id: 1979,
name: 'Ritual Chain',
nativeCurrency: { name: 'RITUAL', symbol: 'RITUAL', decimals: 18 },
rpcUrls: { default: { http: [process.env.RITUAL_RPC_URL!] } },
});
const publicClient = createPublicClient({ chain: ritualChain, transport: http() });
const TEE_SERVICE_REGISTRY = '0x9644e8562cE0Fe12b4deeC4163c064A8862Bf47F' as const;
// Use the capability for your precompile:
// 0 = HTTP_CALL (agents), 1 = LLM, 7 = IMAGE_CALL (multimodal)
const services = await publicClient.readContract({
address: TEE_SERVICE_REGISTRY,
abi: [{
name: 'getServicesByCapability',
type: 'function',
stateMutability: 'view',
inputs: [{ name: 'capability', type: 'uint8' }, { name: 'activeOnly', type: 'bool' }],
outputs: [{ type: 'tuple[]', components: [
{ name: 'node', type: 'tuple', components: [
{ name: 'paymentAddress', type: 'address' },
{ name: 'teeAddress', type: 'address' },
{ name: 'teeType', type: 'uint8' },
{ name: 'publicKey', type: 'bytes' },
{ name: 'endpoint', type: 'string' },
{ name: 'certPubKeyHash', type: 'bytes32' },
{ name: 'capability', type: 'uint8' },
]},
{ name: 'isValid', type: 'bool' },
{ name: 'workloadId', type: 'bytes32' },
]}],
}] as const,
functionName: 'getServicesByCapability',
args: [7, true], // 7 = IMAGE_CALL capability
});
const executor = services[0];
const executorAddress = executor.node.teeAddress;
const executorPublicKey = executor.node.publicKey as `0x${string}`;
Storage credentials in encryptedSecrets are visible on-chain (as ciphertext). Anyone can copy and replay them. To prevent unauthorized reuse, bind credentials to your contract using SecretsAccessControl:
import { keccak256, toBytes } from 'viem';
const SECRETS_AC = '0xf9BF1BC8A3e79B9EBeD0fa2Db70D0513fecE32FD' as const;
const secretsHash = keccak256(toBytes(encryptedSecrets[0]));
const currentBlock = await publicClient.getBlockNumber();
await walletClient.writeContract({
address: SECRETS_AC,
abi: [{
name: 'grantAccess',
type: 'function',
stateMutability: 'nonpayable',
inputs: [
{ name: 'delegate', type: 'address' },
{ name: 'secretsHash', type: 'bytes32' },
{ name: 'expiresAt', type: 'uint256' },
{ name: 'policy', type: 'tuple', components: [
{ name: 'allowedDestinations', type: 'string[]' },
{ name: 'allowedMethods', type: 'string[]' },
{ name: 'allowedPaths', type: 'string[]' },
{ name: 'allowedQueryParams', type: 'string[]' },
{ name: 'allowedHeaders', type: 'string[]' },
{ name: 'secretLocation', type: 'string' },
{ name: 'bodyFormat', type: 'string' },
]},
],
outputs: [],
}] as const,
functionName: 'grantAccess',
args: [
consumerContractAddress,
secretsHash,
currentBlock + 50_000n,
{ allowedDestinations: [], allowedMethods: [], allowedPaths: [],
allowedQueryParams: [], allowedHeaders: [], secretLocation: '', bodyFormat: '' },
],
});
See ritual-dapp-secrets for the full delegation pattern.
From an app-builder perspective, DA follows a simple lifecycle:
1. Prepare request inputs: set StorageRef values and build encryptedSecrets.
2. Submit the precompile call.
3. Receive a success result, or
4. Receive an error result with hasError=true and an errorMessage.

Multimodal executors upload to GCS and return gs:// URIs. Browsers cannot fetch gs:// directly — convert to HTTPS before displaying:
function gsUriToHttps(uri: string): string {
if (uri.startsWith('gs://')) {
return uri.replace('gs://', 'https://storage.googleapis.com/');
}
return uri;
}
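A complementary sketch (hypothetical helper, not from any SDK): pull the bucket and object key back out of a gs:// URI, e.g. if you want to build a signed URL instead of relying on public access:

```typescript
// Parse "gs://bucket/object/key" into its bucket and object components.
function parseGsUri(uri: string): { bucket: string; object: string } {
  if (!uri.startsWith('gs://')) throw new Error(`not a gs:// URI: ${uri}`);
  const rest = uri.slice('gs://'.length);
  const slash = rest.indexOf('/');
  if (slash === -1) return { bucket: rest, object: '' };
  return { bucket: rest.slice(0, slash), object: rest.slice(slash + 1) };
}
```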
When storage operations fail (bad credentials, upload failure, network/provider issues), the request still resolves on-chain with a structured error result instead of hanging silently.
- The result decodes with hasError=true.
- errorMessage explains the failure reason.
- The error fee is a constant 500,000,000,000 wei (0.0000005 ETH), which is much smaller than a successful generation fee.
- Treat hasError=true as a terminal state and surface the message to users.

| Precompile | Common DA failure points |
|---|---|
| Image (0x0818) | Credential validation failure, output upload failure |
| Audio (0x0819) | Credential validation failure, output upload failure |
| Video (0x081A) | Credential validation failure, output upload failure |
| FHE (0x0807) | Input/output storage credential failure, output upload failure |
| LLM (0x0802) | Convo-history credential failure, convo-history upload failure |
All DA-using precompiles settle with a constant error fee of 500,000,000,000 wei (0.0000005 ETH) when hasError=true: this small amount is deducted from escrow, far less than successful generation pricing.
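The fee arithmetic is easy to sanity-check with a minimal bigint wei-to-ETH formatter (illustrative only; viem's formatEther does the same job in production code):

```typescript
// Format a wei amount as a decimal ETH string, trimming trailing zeros.
function weiToEth(wei: bigint): string {
  const whole = wei / 10n ** 18n;
  const frac = (wei % 10n ** 18n).toString().padStart(18, '0').replace(/0+$/, '');
  return frac === '' ? whole.toString() : `${whole}.${frac}`;
}

// Constant DA error fee from the text above
const DA_ERROR_FEE = 500_000_000_000n; // wei
```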
import { decodeAbiParameters } from 'viem';

// In your Phase 2 callback handler (e.g., onImageReady):
function onImageReady(jobId: `0x${string}`, responseData: `0x${string}`) {
  const [hasError, , outputUri, , , , , , errorMessage] = decodeAbiParameters(
[
{ type: 'bool' }, { type: 'bytes' }, { type: 'string' },
{ type: 'bytes32' }, { type: 'bool' }, { type: 'uint32' },
{ type: 'uint32' }, { type: 'uint32' }, { type: 'string' },
],
responseData,
);
if (hasError) {
// DA error — errorMessage contains the reason
// Common messages:
// "Failed to decrypt secrets: ..."
// "Failed to create storage client: ..."
// "Storage credentials invalid: ..."
// "Phase 2 processing failed after N retries: ..."
console.error('DA error:', errorMessage);
return;
}
// Success — use outputUri
}
// Solidity callback
address constant ASYNC_DELIVERY = 0x5A16214fF555848411544b005f7Ac063742f39F6;
function onImageReady(bytes32 jobId, bytes calldata responseData) external {
require(msg.sender == ASYNC_DELIVERY, "unauthorized callback sender");
(bool hasError, , , , , , , , string memory errorMsg) = abi.decode(
responseData,
(bool, bytes, string, bytes32, bool, uint32, uint32, uint32, string)
);
if (hasError) {
emit MediaFailed(jobId, errorMsg);
return;
}
// Process success...
}
Before generation completes: the callback fires with hasError=true and no usable output URI.

After generation but before persistence: the callback still settles with hasError=true, and the dApp should ask users to retry.

Agent precompiles (Sovereign Agent, Persistent Agent) use DKMS (Decentralized Key Management) to derive a per-sender encryption key inside the TEE. All agent DA content is encrypted at rest.
LLM, multimodal (image/audio/video), and FHE precompiles store output in plaintext. Only Sovereign Agent and Persistent Agent use DKMS-derived encryption. LLM conversation history, generated images, audio, video, and FHE ciphertexts are written directly to the storage provider without an additional encryption layer.
The dkms_encrypted: prefix

Skills and system prompts for agents can be pre-encrypted to the agent's DA public key. This keeps agent instructions private on the storage provider.

To use this pattern:
- Encrypt the content to the agent's DA public key (from the DKMS precompile, 0x081B).
- Set keyRef to dkms_encrypted:<credential> on the StorageRef.

const systemPrompt: StorageRef = [
'hf',
'my-org/workspace/system.encrypted.md',
'dkms_encrypted:HF_TOKEN', // prefix tells executor to DKMS-decrypt after download
];
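The prefix convention can be handled with two small string helpers. A sketch under the assumption that everything after the colon is the plain credential key name, as the example above suggests — the helper names are hypothetical:

```typescript
// 'dkms_encrypted:HF_TOKEN' → DKMS-decrypt after download, credential key 'HF_TOKEN'
const DKMS_PREFIX = 'dkms_encrypted:';

function isDkmsEncrypted(keyRef: string): boolean {
  return keyRef.startsWith(DKMS_PREFIX);
}

// The credential key used to look up encryptedSecrets, with the prefix stripped.
function credentialKey(keyRef: string): string {
  return isDkmsEncrypted(keyRef) ? keyRef.slice(DKMS_PREFIX.length) : keyRef;
}
```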
Two different encryption targets:
- encryptedSecrets is encrypted to the executor's public key (from TEEServiceRegistry.node.publicKey).
- dkms_encrypted: content is encrypted to the agent's DA public key (from DKMS precompile 0x081B).

These are different keys for different purposes.
See ritual-dapp-agents for the full DKMS agent pattern.
This is the most common DA failure. Phase 1 succeeds but the Phase 2 callback never comes.
Checklist:
- Is outputStorageRef non-empty? An empty ['', '', ''] means no storage configured — the executor cannot upload the output.
- Is keyRef correct? It must match a key in the decrypted encryptedSecrets JSON exactly.
- Is the credential format right for the platform? GCS expects JSON with service_account_json and bucket. HF expects a token string. Pinata expects a JSON payload containing jwt and gateway_url.
- Did you set ECIES_CONFIG.symmetricNonceLength = 12? Wrong nonce length = executor cannot decrypt secrets = silent failure.
- Did you encrypt to the right executor public key, fetched via TEEServiceRegistry.getServicesByCapability()?
- Is the wallet lock long enough? lockUntil must be > current block + ttl. Locks expire fast — 5,000 blocks is only ~29 min at Ritual's ~350ms conservative baseline. Use 100,000+ for development.

Settled with hasError=true

The executor detected a DA problem and settled with an error. Read the errorMessage field:
| Error Message Pattern | Cause | Fix |
|---|---|---|
| Failed to decrypt secrets: ... | ECIES decryption failed | Check nonce length (12), public key correctness |
| Failed to create storage client: ... | Credential format invalid for platform | Check credential JSON format per platform table above |
| Storage credentials invalid: ... | Credentials are well-formed but rejected by the storage provider | Check SA permissions, token expiry, bucket existence |
| Phase 2 processing failed after N retries: ... | Upload succeeded partially or network issues | Retry with fresh credentials; check storage provider status |
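Several of the checklist items above (empty refs, keyRef/secrets mismatch) can be caught in code before anything is encrypted or submitted. A preflight sketch — `missingCredentials` is a hypothetical helper, and it assumes inline and empty refs need no credential:

```typescript
type StorageRef = [platform: string, path: string, keyRef: string];

// Return the keyRefs that do NOT resolve to a key in the plaintext secrets
// map. Run this on your secrets object before ECIES-encrypting it.
function missingCredentials(
  refs: StorageRef[],
  secrets: Record<string, string>,
): string[] {
  const missing: string[] = [];
  for (const [platform, , keyRef] of refs) {
    if (platform === '' || platform === 'inline' || keyRef === '') continue;
    const key = keyRef.replace(/^dkms_encrypted:/, ''); // strip DKMS prefix
    if (!(key in secrets)) missing.push(keyRef);
  }
  return missing;
}
```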
For Sovereign/Persistent agents:
- Include storage credentials in encryptedSecrets. Agent DA requires credentials alongside the LLM API key.
- Ensure the convoHistory and output StorageRefs are non-empty. Empty refs = no persistence between calls.

| Item | Value |
|---|---|
| StorageRef type | (string platform, string path, string keyRef) |
| Supported platforms | gcs, hf, pinata, inline |
| Credential delivery | Via encryptedSecrets (ECIES-encrypted to executor public key) |
| ECIES nonce length | 12 (both TypeScript and Python default to 16 — must override) |
| DA error fee | 500,000,000,000 wei (constant, applies to Image/Audio/Video/FHE) |
| DA error detection | hasError field in response payload |
| DKMS DA encryption | Per-sender derived key, content encrypted at rest (agents only) |
| TEEServiceRegistry | 0x9644e8562cE0Fe12b4deeC4163c064A8862Bf47F |
| SecretsAccessControl | 0xf9BF1BC8A3e79B9EBeD0fa2Db70D0513fecE32FD |
| RitualWallet | 0x532F0dF0896F353d8C3DD8cc134e8129DA2a3948 |