Replaces the single 800-line tools.ts and its switch dispatcher with a
Theia-inspired registry pattern — each tool domain is its own file, and
dispatch is a plain Map.get() call with no central routing function.
New structure in src/tools/:
registry.ts — ToolDefinition (with handler), registerTool(), executeTool(), ALL_TOOLS
context.ts — ToolContext, MemoryUpdate interfaces
security.ts — PROTECTED_* constants + assertGiteaWritable/assertCoolifyDeployable
utils.ts — safeResolve(), EXCLUDED set
file.ts — read_file, write_file, replace_in_file, list_directory, find_files, search_code
shell.ts — execute_command
git.ts — git_commit_and_push
coolify.ts — coolify_*, list_all_apps, get_app_status, deploy_app
gitea.ts — gitea_*, list_repos, list_all_issues, read_repo_file
agent.ts — spawn_agent, get_job_status
memory.ts — save_memory
index.ts — barrel with side-effect imports + re-exports
Adding a new tool now requires only a new file + registerTool() call.
No switch statement, no shared array to edit. External API unchanged.
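The registry described above can be sketched as follows — a minimal version in which the names (ToolDefinition, registerTool, executeTool, ALL_TOOLS, ToolContext) come from the file list, but the exact field types are assumptions:

```typescript
// Minimal sketch of src/tools/registry.ts — field types are assumptions.
type ToolHandler = (args: Record<string, unknown>, ctx: ToolContext) => Promise<string>;

interface ToolContext {
  workspaceRoot: string; // placeholder; real ToolContext lives in context.ts
}

interface ToolDefinition {
  name: string;
  description: string;
  parameters: Record<string, unknown>; // JSON Schema for the LLM
  handler: ToolHandler;
}

const registry = new Map<string, ToolDefinition>();

// Kept in sync by registerTool(); what the LLM client advertises.
export const ALL_TOOLS: ToolDefinition[] = [];

export function registerTool(def: ToolDefinition): void {
  registry.set(def.name, def);
  ALL_TOOLS.push(def);
}

// Dispatch is a plain Map.get() — no switch statement, no central router.
export async function executeTool(
  name: string,
  args: Record<string, unknown>,
  ctx: ToolContext
): Promise<string> {
  const def = registry.get(name);
  if (!def) throw new Error(`Unknown tool: ${name}`);
  return def.handler(args, ctx);
}
```

Each domain file (file.ts, shell.ts, git.ts, …) then calls registerTool() at module load, and index.ts imports them for their side effects.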
Made-with: Cursor
- Empty-message fix: skip pushing the assistant message to history when both
content and tool_calls are absent (happens when GLM-5 exhausts its token
budget mid-reasoning). Also filter preexisting empty assistant messages from
the returned history.
- System prompt now correctly injects knowledgeContext from opts into the
Tier-B system message (was missing in the loop's buildMessages).
- GITEA_API_TOKEN updated externally in Coolify (old token was invalid).
Made-with: Cursor
The VM's metadata server doesn't grant cloud-platform scope by default.
Read GOOGLE_APPLICATION_CREDENTIALS_JSON env var (service account key JSON)
and pass it directly to GoogleAuth. Falls back to metadata server if unset.
This restores GLM-5 access via Vertex AI.
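The credential resolution can be sketched as a pure function over the environment — the env var name and fallback behavior are from the commit; the function name and options shape are assumptions (google-auth-library's GoogleAuth does accept `credentials` and `scopes` constructor options):

```typescript
// Sketch of the credential resolution; function name is hypothetical.
interface AuthOptions {
  scopes: string[];
  credentials?: Record<string, unknown>;
}

function resolveAuthOptions(env: Record<string, string | undefined>): AuthOptions {
  const opts: AuthOptions = {
    // The metadata server doesn't grant this scope by default, so request it.
    scopes: ["https://www.googleapis.com/auth/cloud-platform"],
  };
  const raw = env.GOOGLE_APPLICATION_CREDENTIALS_JSON;
  if (raw) {
    opts.credentials = JSON.parse(raw); // service account key JSON, passed inline
  }
  // When credentials is unset, GoogleAuth falls back to ADC / the metadata server.
  return opts;
}
```

Usage would then be `new GoogleAuth(resolveAuthOptions(process.env))` with google-auth-library.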
Made-with: Cursor
gcloud is not available inside the Docker container. Use google-auth-library
instead, which reads credentials from the GCP metadata server (works on any
GCP VM) or GOOGLE_APPLICATION_CREDENTIALS env var. Also rebuilds dist/.
Made-with: Cursor
src/llm.ts was never committed — this caused the Docker build to fail
with "Cannot find module './llm'". Also commit updated agent-runner.ts,
agents.ts, and .env.example that reference the new LLM client.
Made-with: Cursor
- Dockerfile now runs tsc during build so committed dist/ is never stale
- ChatResult interface was missing history[] and memoryUpdates[] fields
- Re-add missing MemoryUpdate import in orchestrator.ts
- Rebuild dist/ with all new fields included
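The amended interface might look like this — history and memoryUpdates are the fields named in the commit; the remaining fields and the MemoryUpdate shape (key, type, value, per the save_memory tool) are assumptions:

```typescript
// Hypothetical shape of the amended ChatResult; non-listed fields are assumed.
interface MemoryUpdate {
  key: string;
  type: string;
  value: string;
}

interface ChatMessage {
  role: string;
  content?: string | null;
}

interface ChatResult {
  reply: string;                 // assumed preexisting field
  history: ChatMessage[];        // added: full conversation for persistence
  memoryUpdates: MemoryUpdate[]; // added: accumulated save_memory calls
}
```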
Made-with: Cursor
- ToolContext gets memoryUpdates[] — accumulated by save_memory calls
- orchestratorChat accepts preloadedHistory and knowledgeContext opts
- History trimmed to last 40 messages per turn (cost control)
- Knowledge items injected into system prompt as ## Project Memory
- ChatResult returns history[] and memoryUpdates[] for frontend persistence
- server.ts accepts history/knowledge_context from POST body
- save_memory tool: lets AI persist facts (key, type, value) to long-term memory
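The trimming and injection bullets above can be sketched as follows — the 40-message cap and the "## Project Memory" heading are from the commit; the helper names and the knowledge-item shape are assumptions:

```typescript
// Hypothetical helpers; cap and heading match the commit message.
interface KnowledgeItem {
  key: string;
  type: string;
  value: string;
}

const MAX_HISTORY = 40;

// Keep only the last 40 messages per turn to bound token cost.
function trimHistory<T>(history: T[]): T[] {
  return history.slice(-MAX_HISTORY);
}

// Append knowledge items to the system prompt under "## Project Memory".
function injectKnowledge(systemPrompt: string, items: KnowledgeItem[]): string {
  if (items.length === 0) return systemPrompt;
  const lines = items.map((i) => `- [${i.type}] ${i.key}: ${i.value}`);
  return `${systemPrompt}\n\n## Project Memory\n${lines.join("\n")}`;
}
```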
Made-with: Cursor