# Firestore Migration Skill

```bash
npx machina-cli add skill JanSzewczyk/claude-plugins/db-migration --openclaw
```
Generate safe, idempotent migration scripts for Firestore data changes. This skill helps you evolve your database schema without data loss.
## Context

This skill creates migration scripts for:
- Adding new fields to existing documents
- Renaming or restructuring fields
- Transforming data formats
- Removing deprecated fields
- Backfilling computed values
- Splitting or merging collections
## Migration Principles

- **Idempotent**: Running twice produces the same result
- **Resumable**: Can continue from where it left off if interrupted
- **Dry-run first**: Always preview changes before applying
- **Batched**: Process documents in batches to avoid timeouts
- **Logged**: Track progress and errors for debugging
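The batching principle can be sketched as a small pure helper. This is illustrative only (the name `chunkForBatches` is not part of the template below): Firestore batched writes are capped at 500 operations, so work is grouped into chunks of at most that size before each commit.

```typescript
// Illustrative helper: split items into chunks no larger than
// Firestore's 500-operations-per-batch limit. Pure and testable.
function chunkForBatches<T>(items: T[], batchSize = 500): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}
```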
## Instructions
When the user requests a migration:
### 1. Analyze the Change
Gather information about:
- Source collection name
- Current document structure
- Target document structure
- Number of documents affected (estimate)
- Any dependencies or relationships
### 2. Assess Risk Level
| Risk | Criteria | Approach |
|---|---|---|
| Low | Adding optional field | Direct migration |
| Medium | Restructuring data | Migration with validation |
| High | Removing/renaming fields | Dual-write period recommended |
| Critical | Changing primary keys | Manual review required |
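For the high-risk rows, a "dual-write period" means the application keeps writing both the old and the new field until every reader has been migrated. A minimal sketch of that idea as a pure helper (the names here are hypothetical, not taken from the template):

```typescript
// Hypothetical dual-write helper: given a write payload, mirror the
// value between the old and new field names so readers of either
// field keep working during the transition.
function withDualWrite(
  payload: Record<string, unknown>,
  oldField: string,
  newField: string,
): Record<string, unknown> {
  const value = payload[newField] ?? payload[oldField];
  if (value === undefined) return payload; // neither field present
  return { ...payload, [oldField]: value, [newField]: value };
}
```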
### 3. Generate Migration Script

**File Location:** `scripts/migrations/YYYY-MM-DD-description.ts`

**Script Template:**

```typescript
/**
 * Migration: [Description]
 * Created: [Date]
 * Author: [Name]
 *
 * Purpose:
 * [Detailed description of what this migration does]
 *
 * Affected Collection: [collection-name]
 * Estimated Documents: [number]
 *
 * Rollback Strategy:
 * [How to undo this migration if needed]
 */
import { db } from "~/lib/firebase";
import { FieldValue } from "firebase-admin/firestore";
import { createLogger } from "~/lib/logger";

const logger = createLogger({ module: "migration-[name]" });

// Configuration
const COLLECTION_NAME = "collection-name";
const BATCH_SIZE = 500;
const DRY_RUN_DEFAULT = true;

interface MigrationOptions {
  dryRun?: boolean;
  startAfter?: string; // Document ID to resume from
  limit?: number; // Max documents to process (for testing)
}

interface MigrationResult {
  processed: number;
  updated: number;
  skipped: number;
  errors: number;
  dryRun: boolean;
  lastDocId?: string;
}

export async function migrate(
  options: MigrationOptions = {},
): Promise<MigrationResult> {
  const { dryRun = DRY_RUN_DEFAULT, startAfter, limit } = options;

  logger.info({ dryRun, startAfter, limit }, "Starting migration");

  const result: MigrationResult = {
    processed: 0,
    updated: 0,
    skipped: 0,
    errors: 0,
    dryRun,
  };

  try {
    let query = db
      .collection(COLLECTION_NAME)
      .orderBy("__name__")
      .limit(BATCH_SIZE);

    if (startAfter) {
      const startDoc = await db
        .collection(COLLECTION_NAME)
        .doc(startAfter)
        .get();
      if (startDoc.exists) {
        query = query.startAfter(startDoc);
      }
    }

    let hasMore = true;
    const totalLimit = limit ?? Infinity;

    while (hasMore && result.processed < totalLimit) {
      const snapshot = await query.get();

      if (snapshot.empty) {
        hasMore = false;
        break;
      }

      const batch = db.batch();
      let batchCount = 0;

      for (const doc of snapshot.docs) {
        if (result.processed >= totalLimit) break;

        result.processed++;
        result.lastDocId = doc.id;

        const data = doc.data();

        // Skip condition: Check if already migrated
        if (shouldSkip(data)) {
          result.skipped++;
          logger.debug({ docId: doc.id }, "Skipping already migrated document");
          continue;
        }

        try {
          const updates = computeUpdates(data);

          if (!dryRun) {
            batch.update(doc.ref, {
              ...updates,
              updatedAt: FieldValue.serverTimestamp(),
            });
            batchCount++;
          }

          result.updated++;
          logger.debug({ docId: doc.id, updates }, "Document will be updated");
        } catch (error) {
          result.errors++;
          logger.error({ docId: doc.id, error }, "Error processing document");
        }
      }

      // Commit batch
      if (!dryRun && batchCount > 0) {
        await batch.commit();
        logger.info(
          { batchCount, totalProcessed: result.processed },
          "Batch committed",
        );
      }

      // Prepare next batch
      const lastDoc = snapshot.docs[snapshot.docs.length - 1];
      query = db
        .collection(COLLECTION_NAME)
        .orderBy("__name__")
        .startAfter(lastDoc)
        .limit(BATCH_SIZE);
    }

    logger.info(result, "Migration completed");
    return result;
  } catch (error) {
    logger.error({ error, result }, "Migration failed");
    throw error;
  }
}

/**
 * Determine if a document should be skipped (already migrated)
 */
function shouldSkip(data: FirebaseFirestore.DocumentData): boolean {
  // TODO: Implement skip logic based on migration requirements
  // Example: return data.newField !== undefined;
  return false;
}

/**
 * Compute the updates to apply to a document
 */
function computeUpdates(
  data: FirebaseFirestore.DocumentData,
): Record<string, unknown> {
  // TODO: Implement update logic based on migration requirements
  // Example:
  // return {
  //   newField: computeNewFieldValue(data),
  //   oldField: FieldValue.delete()
  // };
  return {};
}

// CLI execution
if (require.main === module) {
  const args = process.argv.slice(2);
  const dryRun = !args.includes("--apply");
  const startAfter = args
    .find((a) => a.startsWith("--start-after="))
    ?.split("=")[1];
  const limit = args.find((a) => a.startsWith("--limit="))?.split("=")[1];

  console.log(`
╔══════════════════════════════════════════════════════════════╗
║                     FIRESTORE MIGRATION                      ║
╠══════════════════════════════════════════════════════════════╣
║ Mode: ${dryRun ? "DRY RUN (no changes will be made)" : "APPLY (changes will be committed)"}
║ Collection: ${COLLECTION_NAME}
${startAfter ? `║ Starting after: ${startAfter}\n` : ""}${limit ? `║ Limit: ${limit}\n` : ""}╚══════════════════════════════════════════════════════════════╝
`);

  migrate({
    dryRun,
    startAfter,
    limit: limit ? parseInt(limit, 10) : undefined,
  })
    .then((result) => {
      console.log("\n📊 Migration Results:");
      console.log(`  Processed: ${result.processed}`);
      console.log(`  Updated: ${result.updated}`);
      console.log(`  Skipped: ${result.skipped}`);
      console.log(`  Errors: ${result.errors}`);
      if (result.lastDocId) {
        console.log(`  Last Doc ID: ${result.lastDocId}`);
      }
      if (result.dryRun) {
        console.log("\n⚠️ This was a DRY RUN. Use --apply to commit changes.");
      }
      process.exit(result.errors > 0 ? 1 : 0);
    })
    .catch((error) => {
      console.error("\n❌ Migration failed:", error);
      process.exit(1);
    });
}
```
### 4. Usage Instructions

Include these in the migration file:

## How to Run

1. **Preview changes (dry run):**

   ```bash
   npx ts-node scripts/migrations/YYYY-MM-DD-description.ts
   ```

2. **Apply changes:**

   ```bash
   npx ts-node scripts/migrations/YYYY-MM-DD-description.ts --apply
   ```

3. **Resume from a specific document:**

   ```bash
   npx ts-node scripts/migrations/YYYY-MM-DD-description.ts --apply --start-after=docId123
   ```

4. **Test with a limited number of documents:**

   ```bash
   npx ts-node scripts/migrations/YYYY-MM-DD-description.ts --limit=10
   ```
### 5. Update Type Definitions
After migration, update relevant type files:
```typescript
// Before
export type ResourceBase = {
name: string;
};
// After
export type ResourceBase = {
name: string;
newField: string; // Added in migration YYYY-MM-DD
};
```

## Common Migration Patterns
### Adding a New Field

```typescript
function shouldSkip(data: FirebaseFirestore.DocumentData): boolean {
  return data.newField !== undefined;
}

function computeUpdates(data: FirebaseFirestore.DocumentData) {
  return {
    newField: "defaultValue", // or computed from existing data
  };
}
```
### Renaming a Field

```typescript
function shouldSkip(data: FirebaseFirestore.DocumentData): boolean {
  return data.newFieldName !== undefined && data.oldFieldName === undefined;
}

function computeUpdates(data: FirebaseFirestore.DocumentData) {
  return {
    newFieldName: data.oldFieldName,
    oldFieldName: FieldValue.delete(),
  };
}
```
### Restructuring Nested Data

```typescript
function computeUpdates(data: FirebaseFirestore.DocumentData) {
  // Flatten nested structure
  return {
    "settings.theme": data.preferences?.theme ?? "light",
    "settings.language": data.preferences?.language ?? "pl",
    preferences: FieldValue.delete(),
  };
}
```
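The pattern above leans on Firestore's dot notation, where an update key such as `"settings.theme"` writes a nested field. The mapping itself can be factored into a pure function for testing; this is a sketch, and the `flattenPreferences` name is made up:

```typescript
// Illustrative pure version of the flattening logic, minus the
// FieldValue.delete() tombstone, so it can be unit-tested.
function flattenPreferences(data: {
  preferences?: { theme?: string; language?: string };
}): Record<string, string> {
  return {
    "settings.theme": data.preferences?.theme ?? "light",
    "settings.language": data.preferences?.language ?? "pl",
  };
}
```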
### Converting Data Types

```typescript
function computeUpdates(data: FirebaseFirestore.DocumentData) {
  // Convert string array to object map
  const tagsMap =
    (data.tags as string[])?.reduce(
      (acc, tag) => ({ ...acc, [tag]: true }),
      {},
    ) ?? {};
  return {
    tagsMap,
    tags: FieldValue.delete(),
  };
}
```
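The same reduce, extracted as a standalone function so the conversion can be verified without a Firestore connection (the `tagsToMap` name is illustrative):

```typescript
// Convert a string array like ["a", "b"] into the map form
// { a: true, b: true }, which supports membership queries such as
// where("tagsMap.a", "==", true).
function tagsToMap(tags: string[] | undefined): Record<string, true> {
  return (tags ?? []).reduce<Record<string, true>>(
    (acc, tag) => ({ ...acc, [tag]: true }),
    {},
  );
}
```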
## Safety Checklist

Before running a migration with `--apply`:
- Dry run completed successfully
- Sample of changes reviewed manually
- Database backup created (or point-in-time recovery enabled)
- Type definitions ready to update
- Rollback script prepared (for high-risk migrations)
- Team notified of migration window
- Monitoring in place for errors
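For the backup item, one option is a managed export before applying. This sketch assumes the project uses the gcloud CLI and has a Cloud Storage bucket; the bucket name is a placeholder:

```shell
# Export the whole database (or pass --collection-ids=<name> to scope it)
# to a dated prefix in a backup bucket before running with --apply.
gcloud firestore export gs://YOUR_BACKUP_BUCKET/pre-migration-$(date +%F)
```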
## Rollback Considerations

For reversible migrations, include a rollback function:

```typescript
export async function rollback(options: MigrationOptions = {}) {
  // Reverse the migration logic
  function computeRollbackUpdates(data: FirebaseFirestore.DocumentData) {
    return {
      oldFieldName: data.newFieldName,
      newFieldName: FieldValue.delete(),
    };
  }
  // ... rest of migration logic with rollback updates
}
```
## Questions to Ask

When details of the migration are unclear, ask:
- What is the current structure of affected documents?
- How many documents need to be migrated?
- Is there a deadline or maintenance window?
- What happens to the application during migration?
- Do we need dual-write support during transition?
- What's the rollback strategy if something goes wrong?
## Source

Template source: [JanSzewczyk/claude-plugins — db-migration/SKILL.md](https://github.com/JanSzewczyk/claude-plugins/blob/main/plugins/firebase-auth/skills/db-migration/SKILL.md)

## Overview
This skill generates safe, idempotent Firestore migration scripts to evolve your data schema. It covers adding or renaming fields, transforming data formats, removing deprecated fields, backfilling computed values, and splitting or merging collections to minimize risk and downtime.
## How This Skill Works

The process starts with analyzing the proposed change (source/target collections, current vs. target document structures, impact). It then assesses risk and produces a migration script placed at `scripts/migrations/YYYY-MM-DD-description.ts`, using a template that supports idempotence, resumable execution, dry-run previews, batched processing, and logging for traceability.
## When to Use It
- You need to add a new optional field to all documents in a collection.
- You need to rename or restructure a field across a collection (e.g., categoryId to categoryIds).
- You must transform data formats or restructure nested fields (e.g., date strings to timestamps).
- You want to backfill computed values or populate derived fields.
- You need to split or merge collections or reorganize document relationships.
## Quick Start

1. Analyze the change: source/target structures, affected documents, risks.
2. Create a migration file at `scripts/migrations/YYYY-MM-DD-description.ts` using the template.
3. Run a dry run, review the results, then apply the migration in batches.
## Best Practices
- Always start with a dry-run to preview changes before applying.
- Design idempotent operations so running the script twice has no adverse effects.
- Process documents in batches to avoid Firestore timeouts.
- Make the migration resumable by tracking the last processed document.
- Provide a clear rollback strategy and thorough logging for debugging.
## Example Use Cases
- Add 'tags' field to all budget documents
- Rename 'categoryId' to 'categoryIds' across expenses collection
- Migrate user preferences to new structure
- Remove deprecated 'oldField' from all documents
- Backfill computed values across users to support new analytics