Treyspace
An AI workspace shipped to 26 beta users, serving 5,000+ AI interactions and processing 13M tokens. I built the full stack: the GraphRAG engine, the collaborative backend, and the tool-execution layer.
Outcomes
- 26 beta users
- 5,000+ AI interactions
- 13M tokens processed
Stack
- TypeScript
- Node.js
- Redis
- GCP
- PostgreSQL
What I Learned
- Optimistic concurrency: Distributed locks and version vectors prevented stale graph writes during high-burst collaboration (a simplified sketch follows this list).
- Deterministic execution: Prompt adapters around tool calls reduced model confusion and improved reliability (a rough sketch of the adapter idea also follows this list).
- Operational guardrails: Token and latency observability were required before scaling concurrent sessions.
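The version-check idea can be shown with a minimal sketch. This is a deliberate simplification: a single per-node version counter and an in-memory map stand in for the version vectors and Redis-backed locks described above, and GraphNode and applyWrite are illustrative names, not the Treyspace API.
type GraphNode = { id: string; version: number; data: Record<string, unknown> };
const store = new Map<string, GraphNode>();
// Compare-and-set: reject the write if the caller read a stale version.
export function applyWrite(update: GraphNode): { ok: boolean; node: GraphNode } {
  const current = store.get(update.id);
  if (current && current.version !== update.version) {
    return { ok: false, node: current }; // stale write: caller must re-read and merge
  }
  const committed = { ...update, version: update.version + 1 };
  store.set(update.id, committed);
  return { ok: true, node: committed };
}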
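The prompt-adapter idea can also be sketched, under the assumption that an adapter renders each tool's contract into deterministic instructions before the model sees it; ToolSpec and adaptToolPrompt are hypothetical names, not the actual adapter.
type ToolSpec = { name: string; description: string; argNames: string[] };
// Render a tool's contract into fixed, unambiguous instructions for the model.
export function adaptToolPrompt(tool: ToolSpec): string {
  return [
    `Tool: ${tool.name}`,
    `Purpose: ${tool.description}`,
    `Call it with a JSON object containing exactly these keys: ${tool.argNames.join(", ")}.`,
    `Respond with JSON only, no prose.`,
  ].join("\n");
}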
Implementation Notes
- Canvas events stream into a graph ingestion service.
- Graph context is embedded and indexed for retrieval (the ingestion-and-retrieval path is sketched after this list).
- Tool runner executes selected actions with strict output contracts.
- Session gateway persists state snapshots and merges collaborator edits (merge step sketched after this list).
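A minimal sketch of the ingestion-and-retrieval path, under heavy assumptions: the CanvasEvent fields, the in-memory index, and the toy embed function are illustrative stand-ins for the real embedding service and vector index.
type CanvasEvent = { nodeId: string; text: string; updatedAt: number };
type IndexEntry = { nodeId: string; vector: number[]; text: string };
const index: IndexEntry[] = [];
// Placeholder embedding; a real pipeline would call an embeddings API.
async function embed(text: string): Promise<number[]> {
  const v = new Array<number>(8).fill(0);
  for (let i = 0; i < text.length; i++) v[i % 8] += text.charCodeAt(i) / 1000;
  return v;
}
// Ingest a canvas event: embed the node text and upsert it into the index.
export async function ingest(event: CanvasEvent): Promise<void> {
  const entry = { nodeId: event.nodeId, vector: await embed(event.text), text: event.text };
  const pos = index.findIndex((e) => e.nodeId === event.nodeId);
  if (pos >= 0) index[pos] = entry;
  else index.push(entry);
}
// Retrieve the top-k entries by dot-product similarity to the query embedding.
export async function retrieve(query: string, k = 5): Promise<IndexEntry[]> {
  const q = await embed(query);
  const score = (v: number[]) => v.reduce((s, x, i) => s + x * q[i], 0);
  return [...index].sort((a, b) => score(b.vector) - score(a.vector)).slice(0, k);
}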
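The merge step can be sketched as last-writer-wins per node. The Edit and Snapshot shapes are assumptions, and a production gateway would persist snapshots to PostgreSQL or Redis and use richer conflict resolution than a timestamp comparison.
type Edit = { nodeId: string; value: unknown; updatedAt: number };
type Snapshot = { sessionId: string; nodes: Map<string, Edit> };
// Merge a batch of collaborator edits into a snapshot, keeping the newest edit per node.
export function mergeEdits(snapshot: Snapshot, edits: Edit[]): Snapshot {
  const nodes = new Map(snapshot.nodes);
  for (const edit of edits) {
    const existing = nodes.get(edit.nodeId);
    if (!existing || edit.updatedAt > existing.updatedAt) nodes.set(edit.nodeId, edit);
  }
  return { sessionId: snapshot.sessionId, nodes };
}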
Code Snippet
import type { ZodTypeAny } from "zod";

type ToolCall = { name: string; args: Record<string, unknown> };
// Tools register a Zod schema for their args and a handler (Zod assumed from schema.parse).
const toolRegistry: Record<string, ZodTypeAny> = {};
const handlers: Record<string, (args: unknown) => Promise<unknown>> = {};

export async function runTool(call: ToolCall) {
  const schema = toolRegistry[call.name];
  if (!schema) throw new Error(`Unknown tool: ${call.name}`);
  const args = schema.parse(call.args); // reject malformed args before execution
  const result = await handlers[call.name](args);
  return { name: call.name, result };
}
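A usage sketch continuing the snippet above; the searchGraph tool, its schema, and the stub handler are hypothetical.
import { z } from "zod";
// Hypothetical tool registration; "searchGraph" and its schema are illustrative.
toolRegistry["searchGraph"] = z.object({ query: z.string() });
handlers["searchGraph"] = async (args) => ({ matches: [], args });
// runTool validates the arguments against the schema before dispatching to the handler.
runTool({ name: "searchGraph", args: { query: "project roadmap" } }).then(console.log);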