feat: implement 4 project type flows with unique AI experiences
- New multi-step CreateProjectFlow replaces the 2-step modal with TypeSelector and 4 setup components (Fresh Idea, Chat Import, Code Import, Migrate)
- overview/page.tsx routes to a unique main component per creationMode
- FreshIdeaMain: wraps AtlasChat with a post-discovery decision banner (Generate PRD vs Plan MVP Test)
- ChatImportMain: 3-stage flow (intake → extracting → review) with editable insight buckets (decisions, ideas, questions, architecture, users)
- CodeImportMain: 4-stage flow (input → cloning → mapping → surfaces) with architecture map and surface selection
- MigrateMain: 5-stage flow with audit, review, planning, and a migration plan doc with checkbox-tracked tasks and a non-destructive warning banner
- New API routes: analyze-chats, analyze-repo, analysis-status, generate-migration-plan (all using Gemini)
- ProjectShell: accepts a creationMode prop and filters/renames tabs per type (code-import hides PRD; migration hides PRD/Grow/Insights and renames the Atlas tab)
- Right panel adapts content based on creationMode

Made-with: Cursor
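The per-mode routing the message describes can be sketched as follows. This is an illustration only: the type name, map, and helper are assumptions, not the actual exports from overview/page.tsx.

```typescript
// Hypothetical sketch of the creationMode -> main component routing
// described in the commit message. Names here are illustrative.
type CreationMode = 'fresh-idea' | 'chat-import' | 'code-import' | 'migrate';

const MAIN_BY_MODE: Record<CreationMode, string> = {
  'fresh-idea': 'FreshIdeaMain',
  'chat-import': 'ChatImportMain',
  'code-import': 'CodeImportMain',
  'migrate': 'MigrateMain',
};

// A page component would render the component this helper names.
function mainComponentFor(mode: CreationMode): string {
  return MAIN_BY_MODE[mode];
}
```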
app/api/projects/[projectId]/analysis-status/route.ts (new file, 37 lines)
@@ -0,0 +1,37 @@
import { NextResponse } from 'next/server';
import { getServerSession } from 'next-auth';
import { authOptions } from '@/lib/auth/authOptions';
import { query } from '@/lib/db-postgres';

export async function GET(
  _req: Request,
  { params }: { params: Promise<{ projectId: string }> }
) {
  try {
    const { projectId } = await params;
    const session = await getServerSession(authOptions);
    if (!session?.user?.email) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
    }

    const rows = await query<{ data: Record<string, unknown> }>(
      `SELECT p.data FROM fs_projects p
       JOIN fs_users u ON u.id = p.user_id
       WHERE p.id = $1::text AND u.data->>'email' = $2::text LIMIT 1`,
      [projectId, session.user.email]
    );

    if (rows.length === 0) {
      return NextResponse.json({ error: 'Project not found' }, { status: 404 });
    }

    const data = rows[0].data ?? {};
    const stage = (data.analysisStage as string) ?? 'cloning';
    const analysisResult = stage === 'done' ? data.analysisResult : undefined;

    return NextResponse.json({ stage, analysisResult });
  } catch (err) {
    console.error('[analysis-status]', err);
    return NextResponse.json({ error: 'Internal error' }, { status: 500 });
  }
}
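A client would typically call this endpoint repeatedly until the stage becomes terminal ('done', or 'error' as set by analyze-repo). A minimal polling sketch, with the fetch function injected so the loop is testable; the helper name and parameters are assumptions, not code from this commit:

```typescript
// Minimal polling sketch for GET /api/projects/:id/analysis-status.
// fetchStatus is injected so the loop can be exercised without a network.
type AnalysisStatus = { stage: string; analysisResult?: unknown };

async function pollAnalysisStatus(
  fetchStatus: () => Promise<AnalysisStatus>,
  delayMs = 1500,
  maxAttempts = 40
): Promise<AnalysisStatus> {
  for (let i = 0; i < maxAttempts; i++) {
    const status = await fetchStatus();
    // 'done' and 'error' are the terminal stages used by the routes below.
    if (status.stage === 'done' || status.stage === 'error') return status;
    await new Promise((r) => setTimeout(r, delayMs));
  }
  throw new Error('Analysis timed out');
}
```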
app/api/projects/[projectId]/analyze-chats/route.ts (new file, 126 lines)
@@ -0,0 +1,126 @@
import { NextResponse } from 'next/server';
import { getServerSession } from 'next-auth';
import { authOptions } from '@/lib/auth/authOptions';
import { query } from '@/lib/db-postgres';

export const maxDuration = 60;

const GEMINI_API_KEY = process.env.GOOGLE_API_KEY || '';
const GEMINI_MODEL = process.env.GEMINI_MODEL || 'gemini-2.0-flash-exp';
const GEMINI_BASE_URL = 'https://generativelanguage.googleapis.com/v1beta/models';

async function callGemini(prompt: string): Promise<string> {
  const res = await fetch(`${GEMINI_BASE_URL}/${GEMINI_MODEL}:generateContent?key=${GEMINI_API_KEY}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      contents: [{ parts: [{ text: prompt }] }],
      generationConfig: { temperature: 0.2, maxOutputTokens: 4096 },
    }),
  });
  const data = await res.json();
  const text = data?.candidates?.[0]?.content?.parts?.[0]?.text ?? '';
  return text;
}

function parseJsonBlock(raw: string): unknown {
  const trimmed = raw.trim();
  const cleaned = trimmed.startsWith('```')
    ? trimmed.replace(/^```(?:json)?/i, '').replace(/```$/, '').trim()
    : trimmed;
  return JSON.parse(cleaned);
}

export async function POST(
  req: Request,
  { params }: { params: Promise<{ projectId: string }> }
) {
  try {
    const { projectId } = await params;
    const session = await getServerSession(authOptions);
    if (!session?.user?.email) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
    }

    const body = await req.json() as { chatText?: string };
    const chatText = body.chatText?.trim() || '';

    if (!chatText) {
      return NextResponse.json({ error: 'chatText is required' }, { status: 400 });
    }

    // Verify project ownership
    const rows = await query<{ data: Record<string, unknown> }>(
      `SELECT p.data FROM fs_projects p
       JOIN fs_users u ON u.id = p.user_id
       WHERE p.id = $1::text AND u.data->>'email' = $2::text LIMIT 1`,
      [projectId, session.user.email]
    );
    if (rows.length === 0) {
      return NextResponse.json({ error: 'Project not found' }, { status: 404 });
    }

    const extractionPrompt = `You are a product analyst. A founder has pasted AI chat conversation history below.

Extract and categorise the following from those conversations. Return ONLY valid JSON — no markdown, no explanation.

JSON schema:
{
  "decisions": ["string — concrete decisions already made"],
  "ideas": ["string — product ideas and features mentioned"],
  "openQuestions": ["string — unresolved questions that still need answers"],
  "architecture": ["string — technical architecture notes, stack choices, infra decisions"],
  "targetUsers": ["string — user segments, personas, or target audiences mentioned"]
}

Each array can be empty if nothing was found for that category. Extract real content — be specific and concise. Max 10 items per bucket.

--- CHAT HISTORY START ---
${chatText.slice(0, 12000)}
--- CHAT HISTORY END ---

Return only the JSON object:`;

    const raw = await callGemini(extractionPrompt);

    let analysisResult: {
      decisions: string[];
      ideas: string[];
      openQuestions: string[];
      architecture: string[];
      targetUsers: string[];
    };

    try {
      analysisResult = parseJsonBlock(raw) as typeof analysisResult;
    } catch {
      // Fallback: return empty buckets with a note
      analysisResult = {
        decisions: [],
        ideas: [],
        openQuestions: ["Could not parse extracted insights — try pasting more structured conversation"],
        architecture: [],
        targetUsers: [],
      };
    }

    // Save analysis result to project data
    const current = rows[0].data ?? {};
    const updated = {
      ...current,
      analysisResult,
      creationStage: 'review',
      updatedAt: new Date().toISOString(),
    };

    await query(
      `UPDATE fs_projects SET data = $2::jsonb WHERE id = $1::text`,
      [projectId, JSON.stringify(updated)]
    );

    return NextResponse.json({ analysisResult });
  } catch (err) {
    console.error('[analyze-chats]', err);
    return NextResponse.json({ error: 'Internal error' }, { status: 500 });
  }
}
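The fence-stripping in parseJsonBlock is the part most worth sanity-checking, since Gemini frequently wraps its JSON in a markdown code fence despite the "ONLY valid JSON" instruction. A standalone copy for illustration (the fence string is built with `repeat` only so this example can itself live inside a fenced block):

```typescript
// Standalone copy of the parseJsonBlock helper above: strips an optional
// ```json fence before parsing. FENCE is three backticks.
const FENCE = '`'.repeat(3);

function parseJsonBlock(raw: string): unknown {
  const trimmed = raw.trim();
  const cleaned = trimmed.startsWith(FENCE)
    ? trimmed
        .replace(new RegExp('^' + FENCE + '(?:json)?', 'i'), '')
        .replace(new RegExp(FENCE + '$'), '')
        .trim()
    : trimmed;
  return JSON.parse(cleaned);
}
```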
app/api/projects/[projectId]/analyze-repo/route.ts (new file, 216 lines)
@@ -0,0 +1,216 @@
import { NextResponse } from 'next/server';
import { getServerSession } from 'next-auth';
import { authOptions } from '@/lib/auth/authOptions';
import { query } from '@/lib/db-postgres';
import { execFileSync } from 'child_process';
import { existsSync, readdirSync, readFileSync, rmSync } from 'fs';
import { join } from 'path';

export const maxDuration = 120;

const GEMINI_API_KEY = process.env.GOOGLE_API_KEY || '';
const GEMINI_MODEL = process.env.GEMINI_MODEL || 'gemini-2.0-flash-exp';
const GEMINI_BASE_URL = 'https://generativelanguage.googleapis.com/v1beta/models';

async function callGemini(prompt: string): Promise<string> {
  const res = await fetch(`${GEMINI_BASE_URL}/${GEMINI_MODEL}:generateContent?key=${GEMINI_API_KEY}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      contents: [{ parts: [{ text: prompt }] }],
      generationConfig: { temperature: 0.2, maxOutputTokens: 6000 },
    }),
  });
  const data = await res.json();
  return data?.candidates?.[0]?.content?.parts?.[0]?.text ?? '';
}

function parseJsonBlock(raw: string): unknown {
  const trimmed = raw.trim();
  const cleaned = trimmed.startsWith('```')
    ? trimmed.replace(/^```(?:json)?/i, '').replace(/```$/, '').trim()
    : trimmed;
  return JSON.parse(cleaned);
}

// Read a file safely, returning an empty string on failure
function safeRead(path: string, maxBytes = 8000): string {
  try {
    if (!existsSync(path)) return '';
    const content = readFileSync(path, 'utf8');
    return content.slice(0, maxBytes);
  } catch {
    return '';
  }
}

// Walk directory and collect a root-relative file listing, limited to avoid huge outputs
function walkDir(dir: string, depth = 0, maxDepth = 4, acc: string[] = [], root = dir): string[] {
  if (depth > maxDepth) return acc;
  try {
    const entries = readdirSync(dir, { withFileTypes: true });
    for (const e of entries) {
      if (e.name.startsWith('.') || e.name === 'node_modules' || e.name === '__pycache__' || e.name === '.git') continue;
      const full = join(dir, e.name);
      // Relative to the clone root, not the current subdirectory
      const rel = full.slice(root.length + 1);
      if (e.isDirectory()) {
        acc.push(rel + '/');
        walkDir(full, depth + 1, maxDepth, acc, root);
      } else {
        acc.push(rel);
      }
    }
  } catch { /* skip unreadable directories */ }
  return acc;
}

async function updateStage(projectId: string, currentData: Record<string, unknown>, stage: string) {
  const updated = { ...currentData, analysisStage: stage, updatedAt: new Date().toISOString() };
  await query(
    `UPDATE fs_projects SET data = $2::jsonb WHERE id = $1::text`,
    [projectId, JSON.stringify(updated)]
  );
  return updated;
}

export async function POST(
  req: Request,
  { params }: { params: Promise<{ projectId: string }> }
) {
  try {
    const { projectId } = await params;
    const session = await getServerSession(authOptions);
    if (!session?.user?.email) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
    }

    const body = await req.json() as { repoUrl?: string };
    const repoUrl = body.repoUrl?.trim() || '';

    if (!repoUrl.startsWith('http')) {
      return NextResponse.json({ error: 'Invalid repository URL' }, { status: 400 });
    }

    // Verify ownership
    const rows = await query<{ data: Record<string, unknown> }>(
      `SELECT p.data FROM fs_projects p
       JOIN fs_users u ON u.id = p.user_id
       WHERE p.id = $1::text AND u.data->>'email' = $2::text LIMIT 1`,
      [projectId, session.user.email]
    );
    if (rows.length === 0) {
      return NextResponse.json({ error: 'Project not found' }, { status: 404 });
    }

    let currentData = rows[0].data ?? {};
    currentData = await updateStage(projectId, currentData, 'cloning');

    // Clone repo into a temp dir (fire and forget — status is polled separately)
    const tmpDir = `/tmp/vibn-${projectId}`;

    // Run async so the request returns quickly and the client can poll
    setImmediate(async () => {
      try {
        // Clean up any existing clone
        if (existsSync(tmpDir)) {
          rmSync(tmpDir, { recursive: true, force: true });
        }

        // execFileSync passes repoUrl as an argument, so it is never
        // interpreted by a shell (avoids command injection via the URL)
        execFileSync('git', ['clone', '--depth=1', repoUrl, tmpDir], {
          timeout: 60_000,
          stdio: 'ignore',
        });

        let data = { ...currentData };
        data = await updateStage(projectId, data, 'reading');

        // Read key files
        const manifest: Record<string, string> = {};
        const keyFiles = [
          'package.json', 'package-lock.json', 'yarn.lock', 'pnpm-lock.yaml',
          'requirements.txt', 'Pipfile', 'pyproject.toml',
          'Dockerfile', 'docker-compose.yml', 'docker-compose.yaml',
          'README.md', '.env.example', '.env.sample',
          'next.config.js', 'next.config.ts', 'next.config.mjs',
          'vite.config.ts', 'vite.config.js',
          'tsconfig.json',
          'prisma/schema.prisma', 'schema.prisma',
        ];
        for (const f of keyFiles) {
          const content = safeRead(join(tmpDir, f));
          if (content) manifest[f] = content;
        }

        const fileListing = walkDir(tmpDir).slice(0, 300).join('\n');

        data = await updateStage(projectId, data, 'analyzing');

        const analysisPrompt = `You are a senior full-stack architect. Analyse this repository and return a structured architecture map.

File listing (top-level):
${fileListing}

Key file contents:
${Object.entries(manifest).map(([k, v]) => `\n### ${k}\n${v}`).join('')}

Return ONLY valid JSON with this structure:
{
  "summary": "1-2 sentence project summary",
  "rows": [
    { "category": "Tech Stack", "item": "Next.js 15", "status": "found", "detail": "next.config.ts present" },
    { "category": "Database", "item": "PostgreSQL", "status": "found", "detail": "prisma/schema.prisma detected" },
    { "category": "Auth", "item": "Authentication", "status": "missing", "detail": "No auth library detected" }
  ],
  "suggestedSurfaces": ["marketing", "admin"]
}

Categories to cover: Tech Stack, Infrastructure, Database, API Surface, Frontend, Auth, Third-party, Missing / Gaps
Status values: "found", "partial", "missing"
suggestedSurfaces should only include items from: ["marketing", "web-app", "admin", "api"]
Suggest surfaces that are MISSING or incomplete in the current codebase.

Return only the JSON:`;

        const raw = await callGemini(analysisPrompt);
        let analysisResult;
        try {
          analysisResult = parseJsonBlock(raw);
        } catch {
          analysisResult = {
            summary: 'Could not fully parse the repository structure.',
            rows: [{ category: 'Tech Stack', item: 'Repository detected', status: 'found', detail: fileListing.split('\n').slice(0, 5).join(', ') }],
            suggestedSurfaces: ['marketing'],
          };
        }

        // Save result and mark done
        const finalData = {
          ...data,
          analysisStage: 'done',
          analysisResult,
          creationStage: 'mapping',
          sourceData: { ...(data.sourceData as object || {}), repoUrl },
          updatedAt: new Date().toISOString(),
        };
        await query(
          `UPDATE fs_projects SET data = $2::jsonb WHERE id = $1::text`,
          [projectId, JSON.stringify(finalData)]
        );
      } catch (err) {
        console.error('[analyze-repo] background error', err);
        await query(
          `UPDATE fs_projects SET data = $2::jsonb WHERE id = $1::text`,
          [projectId, JSON.stringify({ ...currentData, analysisStage: 'error', analysisError: String(err) })]
        );
      } finally {
        // Clean up the clone
        try { if (existsSync(tmpDir)) rmSync(tmpDir, { recursive: true, force: true }); } catch { /* ok */ }
      }
    });

    return NextResponse.json({ started: true });
  } catch (err) {
    console.error('[analyze-repo]', err);
    return NextResponse.json({ error: 'Internal error' }, { status: 500 });
  }
}
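Both callGemini helpers rely on the same optional-chaining walk through the v1beta generateContent response shape. A small self-contained illustration of the happy path and the empty-string fallback:

```typescript
// Illustrates the extraction path callGemini uses on Gemini responses:
// candidates[0].content.parts[0].text, with '' as the fallback.
type GeminiResponse = {
  candidates?: { content?: { parts?: { text?: string }[] } }[];
};

function extractText(data: GeminiResponse): string {
  return data?.candidates?.[0]?.content?.parts?.[0]?.text ?? '';
}
```

If the API returns a block reason or an empty candidate list, every link in the chain short-circuits and the caller sees an empty string, which is why the routes above pair this with a JSON-parse fallback.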
app/api/projects/[projectId]/generate-migration-plan/route.ts (new file, 139 lines)
@@ -0,0 +1,139 @@
import { NextResponse } from 'next/server';
import { getServerSession } from 'next-auth';
import { authOptions } from '@/lib/auth/authOptions';
import { query } from '@/lib/db-postgres';

export const maxDuration = 120;

const GEMINI_API_KEY = process.env.GOOGLE_API_KEY || '';
const GEMINI_MODEL = process.env.GEMINI_MODEL || 'gemini-2.0-flash-exp';
const GEMINI_BASE_URL = 'https://generativelanguage.googleapis.com/v1beta/models';

async function callGemini(prompt: string): Promise<string> {
  const res = await fetch(`${GEMINI_BASE_URL}/${GEMINI_MODEL}:generateContent?key=${GEMINI_API_KEY}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      contents: [{ parts: [{ text: prompt }] }],
      generationConfig: { temperature: 0.3, maxOutputTokens: 8000 },
    }),
  });
  const data = await res.json();
  return data?.candidates?.[0]?.content?.parts?.[0]?.text ?? '';
}

export async function POST(
  req: Request,
  { params }: { params: Promise<{ projectId: string }> }
) {
  try {
    const { projectId } = await params;
    const session = await getServerSession(authOptions);
    if (!session?.user?.email) {
      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
    }

    const body = await req.json() as {
      analysisResult?: Record<string, unknown>;
      sourceData?: { repoUrl?: string; liveUrl?: string; hosting?: string };
    };

    // Verify ownership
    const rows = await query<{ data: Record<string, unknown> }>(
      `SELECT p.data FROM fs_projects p
       JOIN fs_users u ON u.id = p.user_id
       WHERE p.id = $1::text AND u.data->>'email' = $2::text LIMIT 1`,
      [projectId, session.user.email]
    );
    if (rows.length === 0) {
      return NextResponse.json({ error: 'Project not found' }, { status: 404 });
    }

    const current = rows[0].data ?? {};
    const projectName = (current.productName as string) || (current.name as string) || 'the product';
    const { analysisResult, sourceData } = body;

    const prompt = `You are a senior DevOps and platform migration architect. Generate a comprehensive, phased migration plan in Markdown for migrating an existing product into a new infrastructure (VIBN — a self-hosted PaaS).

Product: ${projectName}
Repo: ${sourceData?.repoUrl || 'Not provided'}
Live URL: ${sourceData?.liveUrl || 'Not provided'}
Current hosting: ${sourceData?.hosting || 'Unknown'}

Architecture audit summary:
${analysisResult?.summary || 'No audit data provided.'}

Detected components:
${JSON.stringify(analysisResult?.rows || [], null, 2).slice(0, 3000)}

Generate a complete migration plan with exactly these 4 phases:

# ${projectName} — Migration Plan

## Overview
Brief 2-3 sentence description of the migration approach and guiding principle (non-destructive duplication).

## Phase 1: Mirror
Set up parallel infrastructure on VIBN without touching production.
- [ ] Clone repository to VIBN Gitea
- [ ] Configure Coolify application
- [ ] Set up identical database schema
- [ ] Configure environment variables
- [ ] Verify build passes

## Phase 2: Validate
Run both systems in parallel and compare outputs.
- [ ] Route 5% of traffic to new infrastructure (or test internally)
- [ ] Compare API responses between old and new
- [ ] Run full end-to-end test suite
- [ ] Validate data sync between databases
- [ ] Sign off on performance benchmarks

## Phase 3: Cutover
Redirect production traffic to the new infrastructure.
- [ ] Update DNS records to point to VIBN load balancer
- [ ] Monitor error rates and latency for 24h
- [ ] Validate all integrations (auth, payments, third-party APIs)
- [ ] Keep old infrastructure on standby for 7 days

## Phase 4: Decommission
Remove old infrastructure after the successful validation period.
- [ ] Confirm all data has been migrated
- [ ] Archive old repository access
- [ ] Terminate old hosting resources
- [ ] Update all internal documentation

## Risk Register
| Risk | Likelihood | Impact | Mitigation |
|------|-----------|--------|------------|
| Database migration failure | Medium | High | Full backup before any migration step |
| DNS propagation delay | Low | Medium | Use low TTL before cutover |
| Third-party integration breakage | Medium | High | Test all webhooks and OAuth in Phase 2 |

## Rollback Plan
At any phase, revert by pointing DNS back to the original infrastructure. Data written during the parallel run must be synced back manually. Old infrastructure MUST remain live until Phase 4 completes.

---

Write a thorough, specific plan. Use real details from the audit where available. Every checklist item should be actionable. Return only the Markdown document.`;

    const migrationPlan = await callGemini(prompt);

    // Save to project
    const updated = {
      ...current,
      migrationPlan,
      creationStage: 'plan',
      updatedAt: new Date().toISOString(),
    };
    await query(
      `UPDATE fs_projects SET data = $2::jsonb WHERE id = $1::text`,
      [projectId, JSON.stringify(updated)]
    );

    return NextResponse.json({ migrationPlan });
  } catch (err) {
    console.error('[generate-migration-plan]', err);
    return NextResponse.json({ error: 'Internal error' }, { status: 500 });
  }
}
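The generated plan's "- [ ]" items are what the commit message calls checkbox-tracked tasks in MigrateMain. A hedged sketch of how progress could be derived from such a Markdown document; the parser is an illustration, not the actual UI code:

```typescript
// Counts "- [ ]" / "- [x]" checklist items in a Markdown migration plan,
// the kind of checkbox tracking the commit message describes.
function checklistProgress(markdown: string): { done: number; total: number } {
  // Match a list item with a checkbox at the start of each line.
  const items = markdown.match(/^\s*- \[( |x)\]/gim) ?? [];
  const done = items.filter((i) => i.toLowerCase().includes('[x]')).length;
  return { done, total: items.length };
}
```

Toggling a checkbox would then amount to rewriting the matched marker on that line and re-saving the plan document.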