Use when modifying existing Bknd schema. Covers renaming entities, renaming fields, changing field types, altering field constraints, handling destructive changes, data migration strategies, and the sync workflow.
```bash
npx claudepluginhub cameronapak/bknd-expert --plugin bknd-research-skills
```

This skill uses the workspace's default tool permissions.
Modify existing schema in Bknd: rename entities/fields, change field types, or alter constraints.
This skill covers changing schema that already exists (for creating new entities, see `bknd-create-entity`). Schema changes are made in `bknd.config.ts`. Bknd's schema sync detects differences between your code and the database. Some changes are safe; others cause data loss.
Warning: Bknd has no native rename operation. Renaming an entity is treated as DROP old + CREATE new, which means DATA LOSS.
```ts
// Step 1: Add new entity alongside old
const schema = em({
  // OLD - will be removed later
  posts: entity("posts", {
    title: text().required(),
    content: text(),
  }),
  // NEW - desired name
  articles: entity("articles", {
    title: text().required(),
    content: text(),
  }),
});
```
```ts
// Step 2: Migrate data (run once via script or CLI)
const api = app.getApi();
const oldData = await api.data.readMany("posts", { limit: 10000 });
for (const item of oldData.data) {
  await api.data.createOne("articles", {
    title: item.title,
    content: item.content,
  });
}
```
```ts
// Step 3: Remove old entity from schema
const schema = em({
  articles: entity("articles", {
    title: text().required(),
    content: text(),
  }),
});
```
```bash
# Step 4: Sync with force to drop old table
npx bknd sync --force
```
Warning: Bknd treats a field rename as drop + create, which means DATA LOSS for that column.
```ts
// Step 1: Add new field alongside old
const schema = em({
  users: entity("users", {
    name: text(), // OLD - will be removed
    full_name: text(), // NEW - desired name
  }),
});
```
```ts
// Step 2: Migrate data
const api = app.getApi();
const users = await api.data.readMany("users", { limit: 10000 });
for (const user of users.data) {
  if (user.name && !user.full_name) {
    await api.data.updateOne("users", user.id, {
      full_name: user.name,
    });
  }
}
```
```ts
// Step 3: Remove old field
const schema = em({
  users: entity("users", {
    full_name: text(),
  }),
});
```
```bash
# Step 4: Sync with force to drop old column
npx bknd sync --force
```
Type changes are risky. Some conversions work; others fail or truncate.
Safe conversions:

| From | To | Notes |
|---|---|---|
| text | text (with different constraints) | Usually safe |
| number | text | Safe (numbers become strings) |
| boolean | number | Safe (0/1 values) |
| boolean | text | Safe ("true"/"false") |
Risky conversions:

| From | To | Risk |
|---|---|---|
| text | number | Fails if non-numeric data |
| text | boolean | Fails if not "true"/"false"/0/1 |
| text | date | Fails if not a valid date format |
| json | text | May truncate; loses structure |
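Risky conversions can be pre-checked before syncing. A minimal sketch of such a check on values already read via the data API; the helper itself is hypothetical, not part of Bknd:

```typescript
// Return the values that would not survive a text -> number conversion.
// Number("") coerces to 0, so empty strings are checked explicitly.
function findNonNumeric(values: string[]): string[] {
  return values.filter((v) => v.trim() === "" || Number.isNaN(Number(v)));
}

findNonNumeric(["12", "abc", "3.5", ""]); // ["abc", ""]
```

`Number()` is deliberately used instead of `parseFloat()`: `parseFloat("12abc")` returns `12`, which would hide partially numeric values.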
```ts
// Step 1: Add new field with new type
const schema = em({
  products: entity("products", {
    price: text(), // OLD - string prices
    price_cents: number(), // NEW - integer cents
  }),
});
```
```ts
// Step 2: Transform and migrate data
const api = app.getApi();
const products = await api.data.readMany("products", { limit: 10000 });
for (const product of products.data) {
  if (product.price && !product.price_cents) {
    const cents = Math.round(parseFloat(product.price) * 100);
    if (!Number.isFinite(cents)) continue; // skip unparseable prices instead of writing NaN
    await api.data.updateOne("products", product.id, {
      price_cents: cents,
    });
  }
}
```
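Transformation logic like this is worth isolating and testing before the migration runs against real data. A minimal sketch; the helper is hypothetical, not a Bknd API:

```typescript
// Hypothetical helper: convert a price string such as "19.99" to integer cents.
// Returns null for unparseable input so the caller can skip or log the record.
function toCents(price: string): number | null {
  const n = parseFloat(price);
  return Number.isFinite(n) ? Math.round(n * 100) : null;
}

toCents("19.99"); // 1999
toCents("abc");   // null
```

`Math.round` also absorbs floating-point noise, e.g. `"0.1"` becomes `10` rather than `10.000000000000002`.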
```ts
// Step 3: Remove old field, rename new if desired
const schema = em({
  products: entity("products", {
    price_cents: number(),
  }),
});
```
Adding `.required()`: fails if existing records have null values.
```ts
// Before
entity("users", {
  email: text(), // Optional
});

// After
entity("users", {
  email: text().required(), // Now required
});
```
Safe approach:
```ts
// Step 1: Fill nulls with a default
const api = app.getApi();
const usersWithNull = await api.data.readMany("users", {
  where: { email: { $isnull: true } },
});
for (const user of usersWithNull.data) {
  await api.data.updateOne("users", user.id, {
    email: "unknown@example.com",
  });
}

// Step 2: Now safely add .required()
```
Adding `.unique()`: fails if duplicate values already exist.
```ts
// Before
entity("users", {
  username: text(),
});

// After
entity("users", {
  username: text().unique(),
});
```
Safe approach:
```ts
// Check for duplicates via raw SQL or manual inspection
// Resolve duplicates by updating or deleting
// Then add the .unique() constraint
```
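The duplicate check can also be done through the data API. A sketch under the assumption that the whole table fits in one `readMany` page; the helper is hypothetical, not part of Bknd:

```typescript
// Hypothetical helper: group rows by a field and keep only groups with 2+ rows.
function findDuplicates<T>(rows: T[], key: keyof T): Map<unknown, T[]> {
  const groups = new Map<unknown, T[]>();
  for (const row of rows) {
    const list = groups.get(row[key]) ?? [];
    list.push(row);
    groups.set(row[key], list);
  }
  for (const [value, list] of groups) {
    if (list.length < 2) groups.delete(value); // Map iterators tolerate deleting the current entry
  }
  return groups;
}
```

Usage: `findDuplicates((await api.data.readMany("users", { limit: 10000 })).data, "username")` returns a map from each duplicated username to its conflicting rows.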
Removing constraints is generally safe:
```ts
// Before
entity("users", {
  email: text().required().unique(),
});

// After - loosening constraints is safe
entity("users", {
  email: text(), // Now optional, non-unique
});
```
```bash
# See what sync would do without applying
npx bknd sync
```

Output shows pending changes.
```bash
# Applies only additive changes
npx bknd sync

# WARNING: This will drop tables/columns
npx bknd sync --force

# Specifically enables drop operations
npx bknd sync --drop
```
Error: Cannot convert column type from X to Y
Fix: Use the migration approach: create a new field, copy the data, then drop the old one.
Error: Column contains null values, cannot add NOT NULL
Fix: Update all null values to non-null first, then re-sync.
Error: Duplicate values exist for column
Fix: Remove duplicates before adding unique constraint.
Problem: Renamed an entity or field directly and lost all data.
Fix: The data is gone; restore from a backup. Use the migration approach next time.
Problem: --force doesn't seem to apply changes.
Fix: Check sync output for actual errors. May be validation issue, not permission.
For complex migrations, create a standalone script:
```ts
// scripts/migrate-schema.ts
import { App } from "bknd";

async function migrate() {
  const app = new App({
    connection: { url: process.env.DB_URL! },
  });
  await app.build();
  const api = app.getApi();

  console.log("Starting migration...");

  // Read all records from old structure
  const records = await api.data.readMany("old_entity", { limit: 100000 });
  console.log(`Found ${records.data.length} records`);

  // Transform and insert into new structure
  let migrated = 0;
  for (const record of records.data) {
    await api.data.createOne("new_entity", {
      // Transform fields as needed
      new_field: record.old_field,
    });
    migrated++;
    if (migrated % 100 === 0) {
      console.log(`Migrated ${migrated}/${records.data.length}`);
    }
  }

  console.log("Migration complete!");
  process.exit(0);
}

migrate().catch((err) => {
  console.error(err);
  process.exit(1); // exit non-zero so failures are visible to CI/shell
});
```
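The serial `createOne` loop above is simple but slow on large tables. A hedged sketch of a batching helper that issues writes in parallel chunks; it is pure TypeScript and assumes nothing about Bknd's APIs:

```typescript
// Process items in fixed-size chunks, running each chunk's writes in parallel.
// Returns the number of items processed.
async function inBatches<T>(
  items: T[],
  size: number,
  fn: (item: T) => Promise<unknown>,
): Promise<number> {
  let done = 0;
  for (let i = 0; i < items.length; i += size) {
    const chunk = items.slice(i, i + size);
    await Promise.all(chunk.map(fn)); // one chunk in flight at a time
    done += chunk.length;
  }
  return done;
}
```

For example, `await inBatches(records.data, 50, (r) => api.data.createOne("new_entity", { new_field: r.old_field }))`. Keep the chunk size modest so the database is not overwhelmed.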
Run with:
```bash
npx bun scripts/migrate-schema.ts
# or
npx ts-node scripts/migrate-schema.ts
```
```bash
# 1. Check sync status
npx bknd sync

# 2. Verify schema in debug output
npx bknd schema --pretty
```
```ts
const api = app.getApi();

// Verify the field exists by querying
const result = await api.data.readMany("entity_name", { limit: 1 });
console.log(result.data[0]); // Check field names/values
```
DO:
- Run `npx bknd sync` before forcing

DON'T:
- Use `--force` without previewing first
- Add `.required()` to fields with null data
- Add `.unique()` to fields with duplicates