Eleven criteria for evaluating an investor-side startup discovery tool, with the question to ask each vendor and how VC Deal Flow Signal handles it. Use it to filter your shortlist before any demo. The list is opinionated — it weights data transparency, reproducibility, and pricing honesty more heavily than coverage breadth or a pretty UI.
Already convinced? VC Deal Flow Signal’s pricing is published openly at /pricing — six tiers from free to €4,970/yr Sharp Tier (active-fund tier, application-gated) and a €1,997 one-time Sector Sweep. Founding-member rates on the recurring plans. Use code PH50OFF for 50% off your first 3 months on Dashboard or Insider.
If a tool can’t tell you where the data came from, it can’t tell you when it’s wrong. Closed data sources are unauditable: reproducing a finding requires a paid licence, and you are left trusting the vendor’s freshness claims.
Question to ask the vendor
“What public APIs or feeds does this tool pull from, and can I see a methodology page that lists every endpoint and refresh cadence?”
How VC Deal Flow Signal handles it
Public GitHub REST API only — every endpoint listed at /methodology, no scraping, no private data. The full methodology is published as an SSRN preprint (abstract id 6606558) and mirrored on Zenodo with a DOI.
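As a concrete illustration of what “public GitHub REST API only” means in practice, here is a minimal sketch against GitHub’s documented commit-activity statistics endpoint. The helper names are ours, not the vendor’s code, and the full endpoint list lives on the /methodology page:

```python
import json
import urllib.request

GITHUB_API = "https://api.github.com"

def fetch_weekly_commits(owner: str, repo: str) -> list[dict]:
    """Fetch the last year of weekly commit buckets via GitHub's documented
    public endpoint GET /repos/{owner}/{repo}/stats/commit_activity."""
    url = f"{GITHUB_API}/repos/{owner}/{repo}/stats/commit_activity"
    req = urllib.request.Request(
        url, headers={"Accept": "application/vnd.github+json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def weekly_totals(buckets: list[dict]) -> list[int]:
    """Reduce each weekly bucket to its total commit count (oldest first)."""
    return [b["total"] for b in buckets]
```

Anyone with the methodology page can re-run the same pull; an optional personal access token only raises the rate limit, it doesn’t unlock extra data.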
Most fund-raise predictors are useful only if they fire weeks before the round closes. A tool that surfaces signals after the press release is a research tool, not a deal-flow tool.
Question to ask the vendor
“What is the typical lead time between a positive signal firing and the public fundraise announcement, with median and percentile data?”
How VC Deal Flow Signal handles it
The 14-day commit-velocity signal fires three to six weeks before the typical seed-stage fundraise announcement. The lag is published in the SSRN preprint with median + interquartile-range numbers.
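To make the signal concrete, here is one plausible way to compute a 14-day commit-velocity measure from weekly totals — a sketch for intuition, not the preprint’s exact formula:

```python
def velocity_ratio(weekly_totals: list[int],
                   recent_weeks: int = 2,
                   baseline_weeks: int = 12) -> float:
    """Compare the two most recent whole weeks of commits (the 14-day
    window) against the trailing baseline. A ratio well above 1.0 means
    the team is shipping faster than its own recent norm."""
    if len(weekly_totals) < recent_weeks + baseline_weeks:
        raise ValueError("not enough history for a baseline")
    recent = sum(weekly_totals[-recent_weeks:]) / recent_weeks
    baseline = (
        sum(weekly_totals[-(recent_weeks + baseline_weeks):-recent_weeks])
        / baseline_weeks
    )
    return float("inf") if baseline == 0 else recent / baseline
```

A startup averaging 5 commits/week that jumps to 10 in each of the last two weeks scores 2.0; the published signal fires on sustained jumps like this, weeks before any press release.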
A 14-day trial is not a free tier. A free tier that strips the actual signal (rankings, refresh cadence, sector breakdown) is not a free tier either. Real free tiers exist because the marginal cost of an extra subscriber is near zero and the discovery value is high.
Question to ask the vendor
“Is there a free tier that surfaces ranked startups every week, or is the only free option a marketing-collateral demo?”
How VC Deal Flow Signal handles it
Free Signal Digest delivers five ranked startups every Monday — same data the paid tiers use, narrowed to the top five. The free MCP server (five read-only tools) bundled with every tier is permanently ungated.
Investors who use Claude / Cursor / ChatGPT for research want to query their deal-flow tool from inside the assistant, not switch tabs. MCP (Model Context Protocol) is the emerging standard for this.
Question to ask the vendor
“Does the tool expose an MCP server, ChatGPT GPT, or Custom Connector that lets me query its data from inside an AI assistant?”
How VC Deal Flow Signal handles it
MCP server published on npm (@gitdealflow/mcp-signal) and live at signals.gitdealflow.com/api/mcp/rpc. ChatGPT GPT live with four read-only Actions. Mistral Le Chat Custom Connector live. All free, all permanent.
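For a five-minute test, wiring the npm package into a local MCP client typically looks like the fragment below. The `npx` invocation assumes the package exposes a standard stdio server entry point, which is the common convention for npm-published MCP servers but is an assumption here; check the package README for the exact command:

```json
{
  "mcpServers": {
    "gitdealflow": {
      "command": "npx",
      "args": ["-y", "@gitdealflow/mcp-signal"]
    }
  }
}
```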
A signal you can’t reproduce is a signal you can’t audit. If the vendor goes out of business, raises prices, or pivots, you have nothing — you can’t replicate the metric from public data.
Question to ask the vendor
“Can I take the published methodology and replicate at least one rank on the leaderboard from the same public data the vendor uses?”
How VC Deal Flow Signal handles it
Methodology paper, dataset, and computation code are published openly. Anyone can take the SSRN preprint plus public GitHub data and replicate the 14-day commit-velocity rank for any startup on the leaderboard.
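The replication harness can stay this small. The helper names below are hypothetical and the scoring rule stands in for the preprint’s published formula; the point is that a rank check needs nothing beyond public weekly commit counts:

```python
def fourteen_day_commits(weekly_totals: list[int]) -> int:
    """Commits in the two most recent whole weeks (the 14-day window)."""
    return sum(weekly_totals[-2:])

def rank_startups(series: dict[str, list[int]]) -> list[str]:
    """Rank startups by 14-day commit volume, highest first, so a single
    leaderboard position can be checked against the published one."""
    return sorted(
        series, key=lambda s: fourteen_day_commits(series[s]), reverse=True
    )
```

Replicating even one rank this way is the audit: if your number disagrees with the leaderboard, either the methodology page is wrong or your pull is, and both are checkable.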
Tools that hide pricing behind ‘contact sales’ assume you’re institutional. Most investors evaluating tools at the seed-or-syndicate level want to compare prices in 30 seconds, not after a 45-minute discovery call.
Question to ask the vendor
“Are list prices published on the website with no ‘contact sales’ gating for the standard tiers?”
How VC Deal Flow Signal handles it
Five of the six tiers (free, €7 one-time, €9.97/mo, €97/mo, €1,997 one-time) are published at /pricing with Stripe checkout deep links; the application-gated Sharp Tier’s €4,970/yr price is listed there too. No contact-sales gating below €15,000/yr.
Funds that wire deal-flow signals into their own pipelines need machine-readable access. UI-only tools force manual copy-paste, which is fine until the volume crosses a few dozen entries per week.
Question to ask the vendor
“Is there a JSON API and bulk CSV export, with documented rate limits, available at the standard tier or only as an enterprise add-on?”
How VC Deal Flow Signal handles it
Free JSON endpoints (signals.json, weekly summary, methodology) at standard rate limits. Insider Circle (€97/mo) adds bulk CSV pulls and webhook delivery on threshold triggers. No enterprise contract required for higher-volume API access.
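For pipelines below the Insider Circle tier, the free JSON endpoints can be flattened to CSV locally. The record fields below are assumptions about the payload shape, not the documented schema:

```python
import csv
import io

def signals_to_csv(records: list[dict], fields: list[str]) -> str:
    """Flatten JSON signal records into CSV, keeping only the requested
    columns and silently dropping any extra keys."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()
```

A weekly cron job fetching signals.json and appending rows like this covers most small-fund pipelines without touching the paid bulk-export feature.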
Subscription tools that lock you into annual contracts with no refund terms transfer all the risk to the buyer. The vendor either has confidence in the output or they don’t — refund policy reveals which.
Question to ask the vendor
“Is there a money-back window, and can I cancel month-to-month from a self-service portal?”
How VC Deal Flow Signal handles it
30-day Signal-or-It’s-Free guarantee on every paid tier — reply to any email with REFUND in the body, full refund, no questions. Month-to-month subscriptions managed through Stripe customer portal. No cancellation fee.
Tools optimised for US Series A coverage are often blind to European seed and Asian late-stage. Coverage breadth matters more than depth in any one slice for funds with global mandates.
Question to ask the vendor
“How many countries and how many sectors are tracked, and where is the coverage thinnest?”
How VC Deal Flow Signal handles it
109 startups currently tracked across 19 sectors with active GitHub orgs in 31 countries. Thinnest coverage is climate-tech in Asia and bio-infrastructure in Latin America; both surface in the dataset but with smaller sample sizes than AI-infrastructure or fintech.
A deal-flow tool that disappears in 18 months wastes the months you spent building it into your workflow. Open methodology, public dataset, and a reproducibility story are the cheapest insurance against vendor risk.
Question to ask the vendor
“If the vendor goes out of business tomorrow, what survives? The dataset? The methodology? Or am I starting from zero?”
How VC Deal Flow Signal handles it
Dataset CC-BY-4.0 licensed, mirrored on Zenodo (DOI 10.5281/zenodo.19650920) and Kaggle. Methodology open at SSRN. Even if the product itself shuts down, the data and the rules survive — funds can replicate the metric from public GitHub data using the published code.
Investors who write code, contribute to OSS, or maintain personal infrastructure want signals expressed in primitives they understand — commits, contributors, dependencies, language mix. Tools built primarily for non-technical analysts often gloss these over.
Question to ask the vendor
“Does the tool expose raw engineering primitives (commit velocity, contributor count, dependent count, language mix) or only synthesised opinion scores?”
How VC Deal Flow Signal handles it
Both. Raw primitives are published per-startup at /signals/<startup>/raw, and synthesised signal types (hiring burst, infrastructure buildout, shipping sprint, platform migration) are published alongside them. Pick whichever fits your evaluation style.
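To make the raw-versus-synthesised distinction concrete, here is a toy version of a “hiring burst”-style signal built from a single primitive. The threshold and window are invented for illustration; the real signal definitions live in the published methodology:

```python
def hiring_burst(active_contributors_now: int,
                 active_contributors_90d_ago: int,
                 growth_threshold: float = 1.5) -> bool:
    """Flag a burst when the active contributor count grew by more than
    `growth_threshold`x over the window. Illustrative only."""
    if active_contributors_90d_ago <= 0:
        return False
    return (
        active_contributors_now / active_contributors_90d_ago
        > growth_threshold
    )
```

Exposing the primitive alongside the synthesised flag means a technical investor can tighten or loosen the threshold to match their own thesis rather than inheriting the vendor’s.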
Start free with the weekly Signal Digest, or buy the one-time First Look Pass (a single-sector deep dive) at €7 to test the data on a sector you already know. Both are linked from the pricing page.
A small fund evaluating VC deal-flow tools should weight three criteria heaviest: free tier honesty (does the free tier surface real ranked output, or is it a marketing demo), methodology transparency (can you audit the signal calculation against public data), and pricing transparency (are list prices published, or is everything contact-sales). These three filter out roughly two-thirds of the deal-flow tooling market and concentrate the shortlist on tools that are confident enough in their output to publish it openly. The remaining eight criteria — recency, AI integration, reproducibility, API access, guarantee, coverage, vendor stability, developer-investor fit — refine the choice between two or three remaining candidates.
Ask the vendor for: (1) a list of every public API or data feed they pull from, (2) the median lead time between a positive signal firing and the public fundraise announcement with percentile data, (3) a methodology document specific enough that a competent analyst could reproduce one rank on the leaderboard from the same public data, (4) the published list prices for all tiers without a discovery call, (5) the cancellation and refund terms in writing, and (6) what happens to the dataset and methodology if the company shuts down. A vendor that hesitates on any of these six is signalling weakness on a criterion that matters; a vendor that answers all six confidently is shortlist material regardless of price.
Published accuracy and open methodology are correlated criteria, not independent ones. A tool with high published-signal accuracy but a closed methodology offers no way to verify the accuracy claim — you take the vendor at their word, which is the same epistemic position as not having the tool at all. A tool with open methodology and slightly lower accuracy is auditable: an analyst can replicate the metric, see where it works and where it fails, and decide whether the failure modes matter for their thesis. For funds that care about reproducibility — including any fund whose LP base might one day audit the deal-flow process — open methodology is a hard prerequisite, and accuracy becomes the secondary criterion among open-methodology tools.
A practical benchmark for a small fund is one to three percent of an annual sourcing budget. For a fund deploying €5M per year with a sourcing-and-diligence budget of €100K (analyst time plus tooling), that puts deal-flow tooling spend in the €1K-3K per year range. Several tools in this category come in under that ceiling at the standard tier — for example, a €9.97 per month subscription and a €97 per month subscription stack to roughly €1,300 per year. Spending more than three percent of sourcing budget on tooling is a signal that the fund is buying tools as a substitute for analyst time rather than as a multiplier — an inversion that usually shows up later as low conversion from sourced deals to closed investments.
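The arithmetic behind the one-to-three-percent benchmark, spelled out:

```python
def tooling_band(sourcing_budget_eur: float) -> tuple[float, float]:
    """1-3% of the annual sourcing-and-diligence budget."""
    return 0.01 * sourcing_budget_eur, 0.03 * sourcing_budget_eur

# Example stack from the text: a €9.97/mo plus a €97/mo subscription.
annual_stack = 12 * (9.97 + 97.0)  # ≈ €1,284/yr, inside the €1K-3K band
```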
The cheapest test is a one-time deep-dive on a single sector you already know well. If a vendor offers a one-time sector report for under €20, buy that, not the subscription. The signal quality you observe on a sector you already understand at depth is the cleanest possible read on the tool — you can immediately tell whether the rankings track your own knowledge or contradict it for legible reasons. VC Deal Flow Signal’s First Look Pass at €7 is structured exactly for this test: full sector deep dive, €7 credited toward the recurring Dashboard subscription if you upgrade within 14 days. A vendor that offers no low-friction sector test has removed the cheapest way for a buyer to verify its claims; treat that as a negative signal, whatever the stated reason.
Pricing transparency matters because the alternative is a 45-minute discovery call before the buyer can compare options. Discovery calls are expensive in three ways: the buyer’s time, the vendor’s revealed information about the buyer’s budget (which usually anchors the eventual quote upward), and the loss of comparison velocity (most buyers compare two or three tools, and the one that requires the longest sales cycle gets dropped). Tools with published pricing give up some pricing headroom in exchange for higher conversion at the small-fund tier. For tools targeting funds below the €100M AUM line, published pricing usually correlates with higher overall revenue — the friction reduction more than offsets the lost upsell on the few buyers who would have paid more.
Funding-round signals are lagging — the round has either closed or leaked by the time the signal exists. Engineering signals are leading — they fire weeks before the round closes and are based on observable activity (commits, contributors, infrastructure work) that necessarily precedes a fundraise. For sourcing top-of-funnel deals, engineering signals dominate. For tracking competitor moves and benchmarking portfolio companies post-investment, funding-round signals are useful but most funds already have free or near-free access to that data via Crunchbase alerts and PitchBook digest emails. The premium-paid layer should be the leading signal, not the lagging one.
MCP is the open standard for letting AI assistants like Claude, Cursor, and Windsurf query third-party tools. For investor tooling, MCP turns a deal-flow database from a tab in your browser into a callable function inside whatever assistant you already use for research. Asking Claude ‘which AI-infrastructure startups are showing engineering acceleration this week’ and getting a ranked list back, with citations, is qualitatively different from logging into a dashboard. Tools that expose an MCP server are bridging into the agentic-research workflow that most investors will use within 18 months; tools that don’t are betting that the dashboard-as-UI assumption survives, which is increasingly contested. The cost to evaluate is near zero — every MCP server is free to install in Claude or Cursor — so it’s an easy criterion to test in five minutes.
Buyers guide last updated 2026-05-05. Criteria reflect typical small-to-mid fund evaluation patterns; weighting may differ for institutional / late-stage funds with dedicated procurement processes.