Runs the t1code TUI for AI-assisted coding in the terminal: prompt LLMs via OpenAI-compatible APIs, view syntax-highlighted code, and navigate with the keyboard. Requires Bun.

```bash
npx claudepluginhub joshuarweaver/cascade-ai-ml-agents-misc-1 --plugin aradotso-trending-skills-37
```

This skill uses the workspace's default tool permissions.
---
name: t1code-terminal-ui
description: AI-powered terminal coding assistant (T3Code in your terminal) using OpenTUI
triggers:
- "use t1code in my terminal"
- "run t1code TUI"
- "set up terminal AI coding assistant"
- "t1code configuration"
- "bunx t1code"
- "T3Code terminal version"
- "t1code LLM coding TUI"
- "install t1code globally"
---
# t1code Terminal UI Skill
> Skill by [ara.so](https://ara.so) — Daily 2026 Skills collection.
## What is t1code?
t1code is a terminal user interface (TUI) for AI-assisted coding, inspired by T3Code (by @t3dotgg and @juliusmarminge). It brings an LLM-powered coding assistant directly into your terminal using the OpenTUI framework. It supports models via API (similar to Codex/Claude/OpenAI-compatible endpoints) and runs entirely in the terminal.
---
## Installation
### Run instantly (no install)

```bash
bunx @maria_rcks/t1code
```

### Install globally

```bash
bun add -g @maria_rcks/t1code
```

After a global install, run with:

```bash
t1code
```

### Run from source

```bash
git clone https://github.com/maria-rcks/t1code.git
cd t1code
bun install
bun dev:tui
```
## Command Reference

| Command | Description |
|---|---|
| `bunx @maria_rcks/t1code` | Run t1code without installing |
| `bun add -g @maria_rcks/t1code` | Install globally |
| `t1code` | Launch the TUI (if globally installed) |
| `bun dev:tui` | Run in dev mode from source |
| `bun install` | Install dependencies from source |
---
## Configuration

t1code uses environment variables for API keys and model configuration. Set these before running:

```bash
# OpenAI-compatible API key
export OPENAI_API_KEY=your_api_key_here

# Optional: custom base URL for OpenAI-compatible APIs (e.g. local Ollama, Together, etc.)
export OPENAI_BASE_URL=https://api.openai.com/v1

# Optional: specify default model
export OPENAI_MODEL=gpt-4o
```

When developing from source, you can instead place these in a `.env` file in the project root (Bun loads `.env` files automatically):

```dotenv
OPENAI_API_KEY=your_api_key_here
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_MODEL=gpt-4o
```
---
## Usage

Once launched, t1code presents a terminal UI. Common TUI interactions:

| Key | Action |
|---|---|
| `Enter` | Submit prompt |
| `Ctrl+C` | Exit t1code |
| `Tab` | Switch focus between panels |
| `Ctrl+L` | Clear the chat/output |
| Arrow keys | Scroll through output |
---
## Project Structure

```text
t1code/
├── src/
│   ├── tui/      # OpenTUI components and layout
│   ├── llm/      # LLM API client logic
│   ├── config/   # Configuration loading
│   └── index.ts  # Entry point
├── assets/
│   └── repo/
├── package.json
└── bun.lockb
```
```ts
// src/index.ts - typical entry pattern
import { startTUI } from "./tui";
import { loadConfig } from "./config";

const config = loadConfig();
await startTUI(config);
```
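A `loadConfig` like the one imported above can simply read the documented environment variables. A minimal sketch under that assumption (the `Config` shape and defaults here are illustrative, not t1code's actual internals):

```typescript
// Illustrative config loader: reads the env vars documented above,
// falling back to the same defaults the rest of this guide uses.
interface Config {
  apiKey: string;
  baseUrl: string;
  model: string;
}

function loadConfig(env: Record<string, string | undefined> = process.env): Config {
  const apiKey = env.OPENAI_API_KEY;
  if (!apiKey) {
    throw new Error("OPENAI_API_KEY is required");
  }
  return {
    apiKey,
    baseUrl: env.OPENAI_BASE_URL ?? "https://api.openai.com/v1",
    model: env.OPENAI_MODEL ?? "gpt-4o",
  };
}
```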
```ts
// src/llm/client.ts - example OpenAI-compatible fetch
const userPrompt = "Explain this function"; // e.g. text taken from the TUI input

const response = await fetch(`${process.env.OPENAI_BASE_URL}/chat/completions`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: process.env.OPENAI_MODEL ?? "gpt-4o",
    messages: [
      { role: "system", content: "You are a helpful coding assistant." },
      { role: "user", content: userPrompt },
    ],
    stream: true,
  }),
});
```
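With `stream: true`, the response body is a server-sent-events stream: each event is a line beginning with `data: ` that carries a JSON chunk, terminated by `data: [DONE]`. A minimal consumer sketch (the helper name `readDeltas` is ours, not t1code's):

```typescript
// Collect the assistant's text from an OpenAI-style SSE response body.
// Illustrative helper, not part of t1code's actual client.
async function readDeltas(body: ReadableStream<Uint8Array>): Promise<string> {
  const decoder = new TextDecoder();
  const reader = body.getReader();
  let buffer = "";
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.startsWith("data: ")) continue;
      const payload = line.slice(6).trim();
      if (payload === "[DONE]") return text;
      const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
      if (delta) text += delta;
    }
  }
  return text;
}
```

In the TUI you would typically append each delta to the output panel as it arrives rather than waiting for the full string.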
---
## Provider Examples

### Local models via Ollama

```bash
# Start Ollama with a code model
ollama run codellama

# Point t1code to local Ollama
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_API_KEY=ollama
export OPENAI_MODEL=codellama
bunx @maria_rcks/t1code
```

### Together AI

```bash
export OPENAI_API_KEY=$TOGETHER_API_KEY
export OPENAI_BASE_URL=https://api.together.xyz/v1
export OPENAI_MODEL=codellama/CodeLlama-34b-Instruct-hf
bunx @maria_rcks/t1code
```
### One-off run

```bash
OPENAI_API_KEY=your_key bunx @maria_rcks/t1code
```

### Persistent setup

```bash
# Add to ~/.bashrc or ~/.zshrc
export OPENAI_API_KEY=your_key
export OPENAI_MODEL=gpt-4o

# Then just run
t1code
```
---
## Extending the TUI

```tsx
// src/tui/MyPanel.tsx - OpenTUI component pattern (.tsx since it uses JSX)
import { Box, Text } from "opentui";

export function MyPanel({ content }: { content: string }) {
  return (
    <Box border="single" padding={1}>
      <Text>{content}</Text>
    </Box>
  );
}
```

```ts
// src/tui/keybindings.ts - custom keybinding
import { onKey } from "opentui";
import { resetChat } from "./chat"; // wherever your chat state lives

onKey("ctrl+r", () => {
  // Reload/reset logic
  resetChat();
});
```
---
## Troubleshooting

### `bunx` not found

Install Bun: https://bun.sh

```bash
curl -fsSL https://bun.sh/install | bash
```

### API errors

Check that:

- `OPENAI_API_KEY` is set correctly
- `OPENAI_BASE_URL` points to a valid OpenAI-compatible endpoint

Test the endpoint directly:

```bash
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```

### `bun dev:tui` fails with module errors

```bash
# Clean install
rm -rf node_modules bun.lockb
bun install
bun dev:tui
```
### Provider does not support streaming

If your provider's `/chat/completions` endpoint does not support streaming, set `stream: false` in the request body.
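With `stream: false`, the whole completion arrives as a single JSON object in the standard chat-completions shape, so no SSE parsing is needed. A small extraction sketch (types and helper name are illustrative):

```typescript
// With stream: false, the reply is one JSON object; pull out the message text.
// Shape follows the OpenAI chat-completions response format.
interface ChatResponse {
  choices: { message: { role: string; content: string } }[];
}

function extractText(res: ChatResponse): string {
  return res.choices[0]?.message.content ?? "";
}
```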