Apply specs, generate code, implement handlers with test-driven iteration
Orchestrates the complete build pipeline: applies schema changes, generates type-safe code, implements API handlers with test-driven iteration, and integrates everything. Use this when you need to build a feature end-to-end from spec to working implementation.
/plugin marketplace add claude-market/marketplace
/plugin install specforge@claude-market/marketplace
Apply spec changes, run code generation, implement handlers, then test and iterate until everything works.
The build process is orchestrated in phases:
Read the SpecForge configuration from CLAUDE.md to identify the database plugin:
# Extract database plugin from CLAUDE.md
grep "database:" CLAUDE.md | cut -d: -f2 | tr -d ' '
Use the database plugin's migration skill:
Invoke the {database-plugin}/migrations-expert skill with:
- action: "apply-migrations"
- migrations_dir: "migrations/"
- database_url: from environment or docker-compose
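As an illustration only, for a sqlx-based SQLite plugin the migration step might reduce to something like the following sketch (the actual mechanics belong to the database plugin):
// Sketch: apply pending migrations with sqlx (assumes a sqlx-based database plugin)
use sqlx::sqlite::SqlitePoolOptions;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // DATABASE_URL comes from the environment or docker-compose, as noted above
    let database_url = std::env::var("DATABASE_URL")?;
    let pool = SqlitePoolOptions::new().connect(&database_url).await?;

    // Applies everything under migrations/ that has not been applied yet
    sqlx::migrate!("./migrations").run(&pool).await?;
    Ok(())
}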
Steps:
Use the openapi-expert skill to validate the spec:
Invoke openapi-expert skill with:
- action: "validate-spec"
- spec_path: "spec/openapi.yaml"
Validation checks:
Identify the codegen plugin from CLAUDE.md:
grep "codegen:" CLAUDE.md | cut -d: -f2 | tr -d ' '
Invoke the codegen plugin's generation skill:
Use {codegen-plugin}/codegen-expert skill with:
- action: "generate-from-schema"
- schema_dir: "migrations/"
- output_dir: "{backend-path}/src/generated/db"
- database_url: from environment
This generates database models and typed query functions.
Example for rust-sql codegen:
// Generated: backend/src/generated/db/models.rs
#[derive(Debug, Clone, sqlx::FromRow)]
pub struct Order {
    pub id: i64,
    pub user_id: i64,
    pub total_cents: i64,
    pub status: OrderStatus,
    pub created_at: chrono::DateTime<chrono::Utc>,
}

#[derive(Debug, Clone, sqlx::Type)]
#[sqlx(type_name = "TEXT")]
pub enum OrderStatus {
    Pending,
    Completed,
    Cancelled,
}
// Generated: backend/src/generated/db/queries.rs
pub async fn get_orders_by_user_id(
    pool: &SqlitePool,
    user_id: i64,
    limit: i64,
    offset: i64,
) -> Result<Vec<Order>, sqlx::Error> {
    sqlx::query_as!(
        Order,
        r#"
        SELECT id, user_id, total_cents, status as "status: OrderStatus",
               created_at
        FROM orders
        WHERE user_id = ?
        ORDER BY created_at DESC
        LIMIT ? OFFSET ?
        "#,
        user_id,
        limit,
        offset
    )
    .fetch_all(pool)
    .await
}
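The handler example later in this document also calls a generated count query for pagination; a sketch of what that function might look like, assuming it follows the same generated pattern:
// Generated (sketch): backend/src/generated/db/queries.rs
pub async fn count_orders_by_user_id(
    pool: &SqlitePool,
    user_id: i64,
) -> Result<i64, sqlx::Error> {
    let row = sqlx::query!(
        r#"SELECT COUNT(*) AS "count: i64" FROM orders WHERE user_id = ?"#,
        user_id
    )
    .fetch_one(pool)
    .await?;
    Ok(row.count)
}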
Identify the backend plugin:
grep "backend:" CLAUDE.md | cut -d: -f2 | tr -d ' '
Invoke backend plugin's OpenAPI type generation:
Use {backend-plugin}/openapi-types-expert skill with:
- action: "generate-types"
- spec: "spec/openapi.yaml"
- output: "{backend-path}/src/generated/api"
This generates request/response types and parameter structs from the OpenAPI spec.
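For a Rust backend, the generated API types used by the handler example below might look like this sketch (names such as OrdersResponse and PaginationParams are taken from that handler; the exact output depends on the plugin's generator):
// Sketch: backend/src/generated/api/mod.rs (assumed shape, inferred from the handler below)
use serde::{Deserialize, Serialize};

use crate::generated::db::Order; // assumes the DB model also derives Serialize

#[derive(Debug, Serialize)]
pub struct OrdersResponse {
    pub data: Vec<Order>,
    pub pagination: Pagination,
}

#[derive(Debug, Serialize)]
pub struct Pagination {
    pub page: i64,
    pub limit: i64,
    pub total: i64,
    pub total_pages: i64,
}

#[derive(Debug, Deserialize)]
pub struct PaginationParams {
    #[serde(default = "default_page")]
    pub page: i64,
    #[serde(default = "default_limit")]
    pub limit: i64,
}

fn default_page() -> i64 {
    1
}

fn default_limit() -> i64 {
    20
}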
If frontend plugin is configured:
Use {frontend-plugin}/codegen-expert skill with:
- action: "generate-client"
- spec: "spec/openapi.yaml"
- output: "{frontend-path}/src/api"
Read the OpenAPI spec and extract all endpoints:
const spec = parseYAML("spec/openapi.yaml");
const endpoints = [];
for (const [path, methods] of Object.entries(spec.paths)) {
for (const [method, operation] of Object.entries(methods)) {
if (["get", "post", "put", "patch", "delete"].includes(method)) {
endpoints.push({
path,
method: method.toUpperCase(),
operationId: operation.operationId,
summary: operation.summary,
description: operation.description,
parameters: operation.parameters || [],
requestBody: operation.requestBody,
responses: operation.responses,
complexity: assessComplexity(operation),
});
}
}
}
For each endpoint, determine complexity to select the appropriate agent model: simple CRUD endpoints are assigned to Haiku, complex endpoints to Sonnet.
Group endpoints by complexity and spawn handler agents in parallel (max 5 concurrent):
const simpleEndpoints = endpoints.filter((e) => e.complexity === "simple");
const complexEndpoints = endpoints.filter((e) => e.complexity === "complex");
// Process in batches of 5
const batches = chunk([...simpleEndpoints, ...complexEndpoints], 5);
for (const batch of batches) {
const results = await Promise.all(
batch.map((endpoint) => {
const agentModel = endpoint.complexity === "simple" ? "haiku" : "sonnet";
const backendPlugin = getBackendPlugin(); // from CLAUDE.md
return invokeAgent(`${backendPlugin}/handler-agent`, {
model: agentModel,
endpoint: endpoint,
generated_db_types: `${backendPath}/src/generated/db`,
generated_api_types: `${backendPath}/src/generated/api`,
patterns: getBackendPatterns(backendPlugin),
});
})
);
// Track results for Phase 4
handlerResults.push(...results);
}
Each handler agent receives:
Context:
{
  "endpoint": {
    "path": "/api/users/{id}/orders",
    "method": "GET",
    "description": "...",
    "parameters": [...],
    "responses": {...}
  },
  "generated_db_types": "backend/src/generated/db",
  "generated_api_types": "backend/src/generated/api",
  "patterns": "Framework-specific patterns documentation"
}
Instructions:
1. Import generated database models and queries
2. Import generated API request/response types
3. Implement handler following framework patterns
4. Use ONLY generated queries - NO manual SQL
5. Handle all error cases from OpenAPI spec
6. Add logging for debugging
7. Return handler file path when complete
Example Handler Implementation (Rust/Axum):
// backend/src/handlers/orders.rs
use axum::{
    extract::{Path, Query, State},
    http::StatusCode,
    response::IntoResponse,
    Json,
};

// Import generated types
use crate::generated::db::{get_orders_by_user_id, get_user_by_id, count_orders_by_user_id};
use crate::generated::api::{OrdersResponse, Pagination, PaginationParams};
use crate::error::ApiError;
use crate::AppState;

pub async fn get_user_orders(
    State(state): State<AppState>,
    Path(user_id): Path<i64>,
    Query(params): Query<PaginationParams>,
) -> Result<impl IntoResponse, ApiError> {
    // 1. Validate user exists
    let _user = get_user_by_id(&state.db, user_id)
        .await?
        .ok_or(ApiError::NotFound("User not found".to_string()))?;

    // 2. Calculate offset from page
    let limit = params.limit.min(100); // Enforce max
    let offset = (params.page - 1) * limit;

    // 3. Get orders using generated query
    let orders = get_orders_by_user_id(&state.db, user_id, limit, offset).await?;

    // 4. Get total count for pagination
    let total = count_orders_by_user_id(&state.db, user_id).await?;

    // 5. Build response
    Ok(Json(OrdersResponse {
        data: orders,
        pagination: Pagination {
            page: params.page,
            limit,
            total,
            total_pages: (total + limit - 1) / limit,
        },
    }))
}
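The handler above leans on a crate-local ApiError type that is not generated; a minimal sketch of what it might look like, assuming axum and sqlx:
// Sketch: backend/src/error.rs (assumed, not generated; shape inferred from the handler above)
use axum::{
    http::StatusCode,
    response::{IntoResponse, Response},
    Json,
};
use serde_json::json;

#[derive(Debug)]
pub enum ApiError {
    NotFound(String),
    Database(sqlx::Error),
}

// Lets handlers use `?` on generated query results
impl From<sqlx::Error> for ApiError {
    fn from(err: sqlx::Error) -> Self {
        ApiError::Database(err)
    }
}

impl IntoResponse for ApiError {
    fn into_response(self) -> Response {
        let (status, message) = match self {
            ApiError::NotFound(msg) => (StatusCode::NOT_FOUND, msg),
            ApiError::Database(_) => (StatusCode::INTERNAL_SERVER_ERROR, "internal error".to_string()),
        };
        (status, Json(json!({ "error": message }))).into_response()
    }
}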
For each implemented handler, run a test-driven iteration loop:
for (const handler of handlerResults) {
  let success = false;
  let iterationCount = 0;
  let testResult = null;
  const MAX_ITERATIONS = 3;

  while (!success && iterationCount < MAX_ITERATIONS) {
    // 1. Generate and run tests
    testResult = await invokeAgent(`${backendPlugin}/test-agent`, {
      model: "haiku",
      handler_path: handler.path,
      endpoint: handler.endpoint,
      generated_types: handler.generated_types,
    });

    if (testResult.status === "passed") {
      success = true;
      console.log(`✓ ${handler.endpoint.path} tests passed`);
      break;
    }

    // 2. Diagnose issues
    const diagnostic = await invokeAgent(`${codegenPlugin}/diagnostics-agent`, {
      model: "sonnet",
      handler_path: handler.path,
      errors: testResult.errors,
      compile_errors: testResult.compile_errors,
      test_failures: testResult.test_failures,
      generated_code_path: handler.generated_types,
    });

    // 3. Apply fixes
    await applyFixes(diagnostic.fixes);
    iterationCount++;
  }

  if (!success) {
    console.error(
      `✗ Failed to get ${handler.endpoint.path} working after ${MAX_ITERATIONS} iterations`
    );
    console.error("Last errors:", testResult.errors);

    // Ask user for guidance
    const userDecision = await askUser({
      question: `Handler for ${handler.endpoint.path} failed tests. What should I do?`,
      options: [
        "Skip for now and continue",
        "Try again with more iterations",
        "Show me the errors and let me fix it",
        "Abort the build",
      ],
    });

    handleUserDecision(userDecision);
  }
}
The test agent should:
- Generate unit tests
- Generate integration tests (see the sketch after this list)
- Run the tests and collect results
- Observe runtime behavior
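A sketch of the kind of integration test the test agent might produce for the orders endpoint, assuming axum, tower, and a hypothetical test_app() helper that builds the router against a seeded test database:
// Sketch: backend/tests/orders_integration.rs (illustrative; `test_app()` is a hypothetical helper)
use axum::{
    body::Body,
    http::{Request, StatusCode},
};
use tower::ServiceExt; // for `oneshot`

#[tokio::test]
async fn get_user_orders_returns_200_for_existing_user() {
    // Build the router with a seeded test database (helper not shown here)
    let app = test_app().await;

    let response = app
        .oneshot(
            Request::builder()
                .uri("/api/users/1/orders?page=1&limit=10")
                .body(Body::empty())
                .unwrap(),
        )
        .await
        .unwrap();

    assert_eq!(response.status(), StatusCode::OK);
}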
When tests fail, the diagnostics agent analyzes compile errors, test failures, and runtime errors.
Diagnostic Output:
{
  "status": "errors_found",
  "issues": [
    {
      "type": "compile_error",
      "location": "backend/src/handlers/orders.rs:45",
      "message": "expected i64, found i32",
      "fix": "Change parameter type to i64 to match generated query signature"
    },
    {
      "type": "test_failure",
      "test": "test_get_user_orders_pagination",
      "message": "assertion failed: total_pages == 3",
      "fix": "Pagination calculation is incorrect. Use (total + limit - 1) / limit"
    }
  ],
  "fixes": [
    {
      "file": "backend/src/handlers/orders.rs",
      "changes": [...]
    }
  ]
}
Once all handlers pass tests, wire them to the router:
Use {backend-plugin}/integration-expert skill with:
- action: "wire-handlers"
- handlers: [list of handler file paths]
- output: "{backend-path}/src/main" or router file
Example Router Integration (Rust/Axum):
// backend/src/main.rs
mod handlers;
mod generated;
mod error;

use axum::{
    routing::{get, post, patch},
    Router,
};

// Shared application state passed to handlers via `State` (referenced as `crate::AppState`)
#[derive(Clone)]
pub struct AppState {
    pub db: sqlx::SqlitePool,
}

#[tokio::main]
async fn main() {
    // Database connection
    let db = setup_database().await;
    let app_state = AppState { db };

    // Router with all handlers
    let app = Router::new()
        .route("/api/users/:id/orders", get(handlers::orders::get_user_orders))
        .route("/api/orders", post(handlers::orders::create_order))
        .route("/api/orders/:id", get(handlers::orders::get_order))
        .route("/api/orders/:id", patch(handlers::orders::update_order_status))
        .with_state(app_state);

    // Start server
    let listener = tokio::net::TcpListener::bind("0.0.0.0:3000").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
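The smoke test below curls a /health endpoint that is not part of the spec-generated handlers; if the backend plugin does not already provide one, a minimal sketch looks like:
// Sketch: a plain health-check handler (not generated from the OpenAPI spec)
async fn health() -> &'static str {
    "ok"
}

// Registered alongside the generated routes, e.g.:
//     .route("/health", get(health))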
# Start services
docker-compose up -d
# Wait for health checks
docker-compose exec api curl http://localhost:3000/health
# Run integration test suite
docker-compose exec api cargo test --test integration
Use validation tools to ensure API matches spec:
# Generate Prism mock server from OpenAPI spec
npx @stoplight/prism-cli mock spec/openapi.yaml
# Run contract tests
docker-compose exec api cargo test --test contract_tests
Provide a comprehensive build report:
✓ Build Complete!
Phase 1: Spec Changes
✓ Applied 1 database migration (003_add_orders.sql)
✓ Validated OpenAPI spec
Phase 2: Code Generation
✓ Generated database models (Order, OrderItem)
✓ Generated query functions (5 functions)
✓ Generated API types (OrdersResponse, CreateOrderRequest)
Phase 3: Handler Implementation
✓ Implemented 4 handlers (2 simple, 2 complex)
- GET /api/users/{id}/orders (Haiku)
- GET /api/orders/{id} (Haiku)
- POST /api/orders (Sonnet)
- PATCH /api/orders/{id} (Sonnet)
Phase 4: Testing
✓ Generated 12 unit tests
✓ Generated 4 integration tests
✓ All tests passing
Phase 5: Integration
✓ Wired handlers to router
✓ Verified against OpenAPI spec
Build Statistics:
- Time: 45 minutes
- Tokens used: 48,000
- Cost: ~$0.45
- Files modified: 8
- Tests created: 16
Next Steps:
1. Run `/specforge:test` for full test suite
2. Run `/specforge:validate` to validate everything
3. Start services: docker-compose up -d
4. Test manually: curl http://localhost:3000/api/users/1/orders
If migration fails:
If codegen fails:
If handler agent fails:
If tests still fail after 3 iterations:
Build state is saved in `.specforge/build-state.json`.