From laguagu-claude-code-nextjs-skills
Builds AI agents, chatbots, tool integrations, streaming apps, and structured outputs with Vercel AI SDK v6. Covers ToolLoopAgent, generateText, streamText, smoothStream, and provider tools.
```bash
npx claudepluginhub joshuarweaver/cascade-code-languages-misc-1 --plugin laguagu-claude-code-nextjs-skills
```

This skill uses the workspace's default tool permissions.
Use this skill when developing AI-powered features using Vercel AI SDK v6 (`ai` package).
Docs location: bundled in `node_modules/ai/docs/`. In Bun/pnpm/Yarn workspace monorepos, deps aren't hoisted; use `apps/*/node_modules/ai/docs/` or `packages/*/node_modules/ai/docs/` instead.
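Probing those locations can be automated; a minimal sketch using only Node's `path` module (`candidateDocsPaths` and the `workspaces` parameter are hypothetical names, not part of the SDK):

```typescript
import * as path from "node:path";

// Candidate locations for the bundled AI SDK docs, in probe order:
// the hoisted root first, then each workspace's own node_modules
// (for non-hoisted Bun/pnpm/Yarn workspace layouts).
// `workspaces` is a list of workspace dirs relative to the repo root,
// e.g. ["apps/web", "packages/agents"].
function candidateDocsPaths(repoRoot: string, workspaces: string[] = []): string[] {
  const suffix = path.join("node_modules", "ai", "docs");
  return [
    path.join(repoRoot, suffix),
    ...workspaces.map((w) => path.join(repoRoot, w, suffix)),
  ];
}
```

Pair this with an `fs.existsSync` check over the returned list to find the first layout that actually exists.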
```bash
bun add ai @ai-sdk/openai zod # or @ai-sdk/anthropic, @ai-sdk/google, etc.
```
| Function | Purpose |
|---|---|
| `generateText` | Non-streaming text generation (+ structured output with `Output`) |
| `streamText` | Streaming text generation (+ structured output with `Output`) |
v6 Note: `generateObject`/`streamObject` are deprecated. Use `generateText`/`streamText` with `output: Output.object({ schema })` instead.
```ts
import { generateText, Output } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { z } from "zod";

const { output } = await generateText({
  model: anthropic("claude-sonnet-4-6"),
  output: Output.object({
    schema: z.object({
      sentiment: z.enum(["positive", "neutral", "negative"]),
      topics: z.array(z.string()),
    }),
  }),
  prompt: "Analyze this feedback...",
});
```
Output types: `Output.object()`, `Output.array()`, `Output.choice()`, `Output.json()`
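The resolved `output` is a plain object typed by the Zod schema, so downstream code can consume it directly. A minimal sketch of that consumption with no API call (the `Feedback` type mirrors the schema above; `summarizeFeedback` is a hypothetical helper):

```typescript
// Mirrors the Zod schema from the example above.
type Feedback = {
  sentiment: "positive" | "neutral" | "negative";
  topics: string[];
};

// Hypothetical consumer: flatten the structured result for logging.
function summarizeFeedback(output: Feedback): string {
  return `${output.sentiment}: ${output.topics.join(", ")}`;
}
```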
```ts
import { ToolLoopAgent, tool, stepCountIs } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { z } from "zod";

const myAgent = new ToolLoopAgent({
  model: anthropic("claude-sonnet-4-6"),
  instructions: "You are a helpful assistant.",
  tools: {
    getData: tool({
      description: "Fetch data from API",
      inputSchema: z.object({
        query: z.string(),
      }),
      execute: async ({ query }) => {
        return { result: "data" };
      },
    }),
  },
  stopWhen: stepCountIs(20),
});

// Usage
const { text } = await myAgent.generate({ prompt: "Hello" });
const stream = myAgent.stream({ prompt: "Hello" });
```
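Conceptually, `stopWhen` is a predicate consulted after each model/tool step. A rough illustration of that control flow in plain TypeScript (this is not the SDK's implementation; the real `stopWhen` receives richer step data than a bare count):

```typescript
// Illustrative stop condition over the step count, in the spirit
// of stepCountIs(max).
type StopCondition = (state: { stepCount: number }) => boolean;

const stepCountIsSketch = (max: number): StopCondition =>
  ({ stepCount }) => stepCount >= max;

// Hypothetical loop runner: perform steps until the condition fires,
// returning how many steps ran.
function runLoop(stop: StopCondition, doStep: () => void): number {
  let stepCount = 0;
  while (!stop({ stepCount })) {
    doStep();
    stepCount++;
  }
  return stepCount;
}
```

`stepCountIs(20)` in the agent above bounds the tool loop the same way: without it, a model that keeps requesting tools could loop indefinitely.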
```ts
// app/api/chat/route.ts
import { createAgentUIStreamResponse } from "ai";
import { myAgent } from "@/agents/my-agent";

export async function POST(request: Request) {
  const { messages } = await request.json();
  return createAgentUIStreamResponse({
    agent: myAgent,
    uiMessages: messages,
  });
}
```
```ts
import { createAgentUIStreamResponse, smoothStream } from "ai";

return createAgentUIStreamResponse({
  agent: myAgent,
  uiMessages: messages,
  experimental_transform: smoothStream({
    delayInMs: 15,
    chunking: "word", // "word" | "line" | "none"
  }),
});
```
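The `chunking: "word"` option releases buffered text one word at a time rather than in raw model deltas. A rough sketch of that chunking rule (illustrative only, not the SDK's implementation):

```typescript
// Split buffered text into word-sized chunks, each keeping its
// trailing whitespace so that concatenating the chunks reproduces
// the original buffer exactly.
function chunkByWord(buffer: string): string[] {
  return buffer.match(/\S+\s*/g) ?? [];
}
```

With `smoothStream`, each such chunk is then emitted after `delayInMs`, which is what produces the even, typewriter-like pacing.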
"use client";
import { useChat } from "@ai-sdk/react";
import { DefaultChatTransport } from "ai";
import { useState } from "react";
export function Chat() {
const [input, setInput] = useState("");
const { messages, sendMessage, status } = useChat({
transport: new DefaultChatTransport({
api: "/api/chat",
}),
});
return (
<>
{messages.map((msg) => (
<div key={msg.id}>
{msg.parts.map((part, i) =>
part.type === "text" ? <span key={i}>{part.text}</span> : null
)}
</div>
))}
<form
onSubmit={(e) => {
e.preventDefault();
if (input.trim()) {
sendMessage({ text: input });
setInput("");
}
}}
>
<input
value={input}
onChange={(e) => setInput(e.target.value)}
disabled={status !== "ready"}
/>
<button type="submit" disabled={status !== "ready"}>
Send
</button>
</form>
</>
);
}
v6 Note: `useChat` no longer manages input state internally. Use `useState` for controlled inputs.
For detailed information, see the bundled docs in `node_modules/ai/docs/`. For the latest information, see the AI SDK docs.