Stateful VC analyst agents with persistent memory of every startup they've scouted.
POST https://signals.gitdealflow.com/api/a2a

Letta (formerly MemGPT) is the rare mainstream agent framework where memory is a first-class primitive. For venture work, that maps perfectly: deal flow is fundamentally a longitudinal exercise. You watched this team six months ago, the signal was warm, you passed; now the signal is breakout and you want to remember why you hesitated. Stateless agents lose that thread on every restart.
Wire GitDealFlow into a Letta agent and the agent's archival memory becomes a personal scouting database. Every startup it looks up gets remembered. Every methodology citation gets indexed. When you ask 'have I seen Roboflow before?', the agent can answer with the exact prior context, plus the live commit-velocity delta since you last looked. That's a VC analyst that compounds.
Every startup the agent surfaces is written to archival memory automatically. Restart the server, the watchlist survives. Ask 'who have I been tracking?' and the agent recalls.
Letta agents can edit their own persona block. Start with 'I am a venture analyst' and after 50 conversations the agent has refined its own thesis prompt — 'I focus on dev-tools breakouts under 10 contributors.'
Letta's RAG over conversation history means follow-ups don't burn tool calls. 'What did Modular's velocity look like last week?' hits memory, not the API. Faster, cheaper, and the context stays on the record.
Run Letta server, expose REST endpoints for the team, every analyst hits the same shared agent with the same accumulated knowledge. The crew's deal flow IQ goes up over time.
# pip install letta-client letta requests
# Run `letta server` first, then:
from letta_client import Letta
import requests
A2A = "https://signals.gitdealflow.com/api/a2a"
# Define the tool as a plain Python function. Letta uses the docstring
# and signature to auto-generate the JSON schema.
def gitdealflow_query(skill: str, args: dict = None) -> dict:
    """Live VC engineering signals from GitDealFlow.

    Args:
        skill (str): One of get_trending_startups, search_startups_by_sector,
            get_startup_signal, get_signals_summary, get_methodology.
        args (dict): Skill-specific arguments (e.g. {"sector": "ai-ml"}).

    Returns:
        dict: JSON-RPC response with .result.artifacts[0].parts[0].data
            containing the structured payload.
    """
    body = {
        "jsonrpc": "2.0", "id": 1,
        "method": "message/send",
        "params": {"message": {"role": "user", "parts": [
            {"kind": "data", "data": {"skill": skill, "args": args or {}}},
        ]}},
    }
    return requests.post(A2A, json=body, timeout=15).json()
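The tool above returns the full JSON-RPC envelope; the docstring notes that the payload lives at `.result.artifacts[0].parts[0].data`. A small helper can unwrap that path defensively. The helper name `unwrap_a2a` and the stubbed response below are illustrative, not part of the GitDealFlow API:

```python
def unwrap_a2a(envelope: dict) -> dict:
    """Return the structured data part of an A2A JSON-RPC response.

    Hypothetical helper: walks .result.artifacts[0].parts[0].data and
    surfaces JSON-RPC errors instead of raising KeyError.
    """
    artifacts = envelope.get("result", {}).get("artifacts", [])
    if not artifacts:
        return {"error": envelope.get("error", "no artifacts in response")}
    return artifacts[0]["parts"][0]["data"]

# Example against a stubbed (not live) response:
resp = {"result": {"artifacts": [{"parts": [
    {"kind": "data", "data": {"startup": "roboflow", "signal": "breakout"}},
]}]}}
print(unwrap_a2a(resp)["signal"])  # breakout
```

Keeping the unwrapping out of `gitdealflow_query` itself lets the LLM see the raw envelope if it needs to reason about errors.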
client = Letta(base_url="http://localhost:8283")

# Upload tool, then create a stateful agent that uses it
tool = client.tools.upsert_from_function(func=gitdealflow_query)
agent = client.agents.create(
    name="vc_scout",
    memory_blocks=[
        {"label": "persona", "value": "I am a VC analyst that tracks engineering acceleration across 985+ startups. I remember every startup I've seen and refine my thesis over time."},
        {"label": "human", "value": "The user is a developer-investor writing angel checks."},
    ],
    tool_ids=[tool.id],
    model="openai/gpt-5.4",
    embedding="openai/text-embedding-3-small",
)
response = client.agents.messages.create(
    agent_id=agent.id,
    messages=[{"role": "user", "content": "What's accelerating in fintech this week?"}],
)
print(response.messages[-1].content)

# Same agent, weeks later. The agent remembers prior context.
from letta_client import Letta
client = Letta(base_url="http://localhost:8283")
# Reuse the agent_id from the prior session.
agent_id = "agent-..." # persisted in your DB
# Letta auto-recalls relevant archival entries. The agent will:
# 1. Surface that Roboflow was tracked 8 weeks ago at warm signal.
# 2. Re-query GitDealFlow for the *current* signal.
# 3. Compute the delta and reason about it in conversation.
response = client.agents.messages.create(
    agent_id=agent_id,
    messages=[{"role": "user", "content": "Has anything I tracked changed materially?"}],
)
print(response.messages[-1].content)
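The delta step the comments describe (remembered snapshot vs. fresh query) can be sketched by hand. The field name `commit_velocity`, the 50% materiality threshold, and the sample payloads are illustrative assumptions, not the documented GitDealFlow schema:

```python
def velocity_delta(prior: dict, current: dict) -> dict:
    """Compare a remembered signal snapshot against a fresh one.

    Sketch only: "commit_velocity" and the 50% threshold are assumed,
    not part of the GitDealFlow payload contract.
    """
    old = prior.get("commit_velocity", 0.0)
    new = current.get("commit_velocity", 0.0)
    change = new - old
    return {
        "startup": current.get("startup", prior.get("startup")),
        "delta": change,
        "material": old > 0 and abs(change) / old >= 0.5,
    }

prior = {"startup": "roboflow", "commit_velocity": 12.0}    # from archival memory
current = {"startup": "roboflow", "commit_velocity": 30.0}  # from a live A2A query
print(velocity_delta(prior, current))  # delta 18.0, material True
```

In practice the agent does this reasoning in-conversation; a deterministic helper like this is useful when you want the materiality rule to be auditable rather than left to the LLM.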
# Inspect the agent's accumulated archival memory.
archival = client.agents.archival_memory.list(agent_id=agent_id, limit=20)
for entry in archival:
    print(entry.text[:160])

As of late 2025, Letta exposed an MCP-compatible interface for tool definitions, and the @gitdealflow/mcp-signal package can be wrapped in a Letta tool. The simplest path remains a plain Python function tool that hits the A2A endpoint: Letta's tool auto-generation reads the docstring and signature to build the schema for the LLM.
Memory. LangChain agents are stateless by default; you have to bolt on a vector store and engineer the recall logic yourself. Letta agents have core memory + archival memory + recall memory built in, plus self-editing primitives. For a deal-flow agent that compounds knowledge over months, Letta is the path of least resistance.
Yes — Letta ships a REST API server (`letta server`) that supports multi-user agent management, persistent storage (SQLite or Postgres), and authentication. Multiple analysts can share one VC-scout agent or fork into per-user agents that share archival memory.
Email signal@gitdealflow.com; replies arrive within 24 hours, EU business time. Include the framework name, the error message, and a snippet of your tool definition in the body.