Type-safe VC signal agents shipped alongside your Next.js app.
POST https://signals.gitdealflow.com/api/a2a

Mastra is the TypeScript agent framework you reach for when you already ship a Next.js or Hono app. It speaks the same Zod schemas, the same Vercel AI SDK primitives, the same edge-or-Node deployment story. For developer-investors, that means deal-flow features ship inside the same codebase as your portfolio dashboard, with the same type safety and the same deploy pipeline.
Mastra's MCP support is first-class — drop the @gitdealflow/mcp-signal package into MCPClient and every Mastra agent in your project gets all five skills. The A2A fallback is pure fetch, edge-safe, no child processes — useful when you want a single Server Action to surface 'is this startup tracked?' for a logged-in user without spinning up a stdio server.
A single async Server Action can chain 'lookup startup → write to portfolio DB → revalidate dashboard' in 30 lines. The Mastra agent handles the LLM-driven branching; your code stays declarative.
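The shape of that chain can be sketched. The envelope builder below mirrors the JSON-RPC payload the A2A route on this page expects; `saveToPortfolio` and the `/portfolio` path in the commented Server Action are hypothetical placeholders for your own DB helper and dashboard route.

```typescript
// Build the JSON-RPC "message/send" envelope the A2A endpoint expects.
// skill/args mirror the inputSchema of the fetch tool shown on this page.
export function buildA2ARequest(skill: string, args: Record<string, unknown> = {}) {
  return {
    jsonrpc: "2.0" as const,
    id: 1,
    method: "message/send",
    params: {
      message: {
        role: "user",
        parts: [{ kind: "data", data: { skill, args } }],
      },
    },
  };
}

// Sketch of the Server Action itself (saveToPortfolio and "/portfolio"
// are assumed, not part of GitDealFlow):
//
// "use server";
// export async function trackStartup(name: string) {
//   const r = await fetch(A2A, {
//     method: "POST",
//     headers: { "Content-Type": "application/json" },
//     body: JSON.stringify(buildA2ARequest("get_startup_signal", { name })),
//   });
//   const signal = (await r.json()).result.artifacts[0].parts[0].data;
//   await saveToPortfolio(name, signal); // your DB write
//   revalidatePath("/portfolio");        // refresh the dashboard
// }
```

Keeping the envelope in one helper means the Server Action, the cron digest, and any edge route all serialize the skill call the same way.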
Mastra workflows on Vercel Cron Jobs (or any cron runner) let you ship a Monday digest that pulls trending, summarizes via a small AI Gateway model, posts to Slack, and writes to Postgres — all in one TypeScript module.
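The cron trigger, Slack post, and Postgres write depend on your setup, so here is a minimal sketch of just the digest-formatting step. The `TrendingStartup` shape is hypothetical; the real `get_trending_startups` payload may differ.

```typescript
// Hypothetical shape of one trending entry (assumed, not the documented payload).
interface TrendingStartup {
  name: string;
  sector: string;
  signalScore: number; // engineering-acceleration score, higher = hotter
}

// Pure formatting step for the Monday digest: sort by score, take the top N,
// and render Slack-flavored markdown. Slot this between the "pull trending"
// and "post to Slack" steps of the workflow.
export function formatDigest(startups: TrendingStartup[], topN = 5): string {
  const top = [...startups]
    .sort((a, b) => b.signalScore - a.signalScore)
    .slice(0, topN);
  const lines = top.map(
    (s, i) => `${i + 1}. *${s.name}* (${s.sector}): score ${s.signalScore}`,
  );
  return ["*Monday engineering-signal digest*", ...lines].join("\n");
}
```

Keeping the step pure makes the workflow testable without hitting the API or Slack.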
Mastra's @mastra/memory gives you durable thread state. Pair it with GitDealFlow tools and your scout agent remembers which startups were already pitched into the Slack channel, avoiding repeats.
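A sketch of the repeat-avoidance logic that memory enables. Recovering the "already pitched" set from a @mastra/memory thread is assumed; only the pure filtering step is shown.

```typescript
// Given the set of startup names already posted (recovered from the agent's
// memory thread) and a fresh trending list, return the unseen startups plus
// the updated "seen" set to persist back to memory.
export function filterUnpitched(
  trending: string[],
  alreadyPitched: ReadonlySet<string>,
): { fresh: string[]; seen: Set<string> } {
  const fresh = trending.filter((name) => !alreadyPitched.has(name));
  return { fresh, seen: new Set([...alreadyPitched, ...fresh]) };
}
```

Persisting `seen` back to the thread after each run is what keeps the Slack channel free of repeats across Mondays.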
Hono + Mastra is the lightest stack for a deal-flow API microservice. Deploy to Cloudflare Workers, Bun, or Vercel Functions — the same agent code runs everywhere.
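Under the hood, all three targets share the Web-standard fetch-handler contract that Hono compiles down to. A minimal sketch of that contract follows, with the agent call stubbed out by a hypothetical `askScout` so the shape is visible without the Mastra wiring.

```typescript
// Placeholder for the Mastra agent call; in the real service this would be
// scoutAgent.generate(question) as in the example below.
async function askScout(question: string): Promise<string> {
  return `stub answer for: ${question}`;
}

// The contract Workers, Bun, and Vercel Functions all accept: an object with
// a fetch(Request) => Promise<Response> method. Hono produces exactly this
// shape, which is why the same agent code runs everywhere.
const app = {
  async fetch(req: Request): Promise<Response> {
    if (req.method !== "POST") {
      return new Response("POST only", { status: 405 });
    }
    const { question } = (await req.json()) as { question: string };
    const reply = await askScout(question);
    return Response.json({ reply });
  },
};

export default app;
```

Swapping the stub for a real Mastra agent changes nothing about the deployment story: the default export stays the portable unit.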
// npm install @mastra/core @mastra/mcp zod
// Models route through the AI Gateway via plain string IDs.
import { Mastra } from "@mastra/core";
import { Agent } from "@mastra/core/agent";
import { MCPClient } from "@mastra/mcp";

const mcp = new MCPClient({
  servers: {
    gitdealflow: {
      command: "npx",
      args: ["@gitdealflow/mcp-signal@latest"],
    },
  },
});

const tools = await mcp.getTools();

export const scoutAgent = new Agent({
  name: "VC Scout",
  instructions:
    "You surface live engineering acceleration signals via the gitdealflow tools. " +
    "Always cite the methodology paper (SSRN abstract=6606558) when asked about the data.",
  model: "openai/gpt-5.4",
  tools,
});

export const mastra = new Mastra({
  agents: { scoutAgent },
});

// Use:
const reply = await scoutAgent.generate(
  "Who is trending in fintech this week?",
);
console.log(reply.text);

// app/api/scout/route.ts (Next.js App Router)
import { createTool } from "@mastra/core/tools";
import { Agent } from "@mastra/core/agent";
import { z } from "zod";

export const runtime = "nodejs"; // not edge — fetch keep-alive matters

const A2A = "https://signals.gitdealflow.com/api/a2a";

const gitdealflow = createTool({
  id: "gitdealflow",
  description: "VC engineering signals: trending, sector, named lookup, methodology.",
  inputSchema: z.object({
    skill: z.enum([
      "get_trending_startups",
      "search_startups_by_sector",
      "get_startup_signal",
      "get_signals_summary",
      "get_methodology",
    ]),
    args: z.record(z.string(), z.any()).optional(),
  }),
  execute: async ({ context }) => {
    const r = await fetch(A2A, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        jsonrpc: "2.0",
        id: 1,
        method: "message/send",
        params: {
          message: {
            role: "user",
            parts: [
              { kind: "data", data: { skill: context.skill, args: context.args ?? {} } },
            ],
          },
        },
      }),
    });
    return (await r.json()).result.artifacts[0].parts[0].data;
  },
});

const agent = new Agent({
  name: "Scout",
  instructions: "Use gitdealflow tools to answer about engineering acceleration.",
  model: "openai/gpt-5.4",
  tools: { gitdealflow },
});

export async function POST(req: Request) {
  const { question } = await req.json();
  const out = await agent.generate(question);
  return Response.json({ reply: out.text });
}

No — MCPClient spawns a stdio child process, and the Edge runtime forbids subprocesses. Use the A2A fetch tool for edge routes; reserve MCPClient for Node.js routes (set `export const runtime = 'nodejs'`).
The Vercel AI SDK gives you generateText / streamText / tool() primitives. Mastra layers structured agents, memory, workflows, and MCP on top. For a one-shot tool call, the AI SDK is enough. For multi-step agents with state, Mastra removes a lot of plumbing.
Yes — Mastra is runtime-agnostic. Workers requires the A2A fetch tool (not MCPClient stdio). The agent.generate() call is pure fetch + LLM API, fully Workers-compatible.
Email signal@gitdealflow.com; replies arrive within 24 hours during EU business hours. Include the framework name, the full error message, and a snippet of your tool definition.