Common Ground: How Marvin Tracks the Analyst's Mental Model

9 min read · Alex Hoffmann, Co-Founder and CEO

An analyst who has covered Apple for eight years does not need an AI to tell them that the company sells iPhones. They know the revenue mix by segment, the gross margin trajectory, the services attach rates, the regulatory overhang in each major geography, and which suppliers matter for each product line. They know who the CFO was before the current one and why they left.

That knowledge was earned over years of reading filings, listening to calls, and updating their view as the story evolved. It is the analyst's most valuable asset on the name. An AI tool that re-explains Apple from scratch every time a new 10-Q drops is generating work, not saving it.

That is the design problem we had to solve in Marvin. The analyst already has a mental model, and they update it by hand as new information comes in. If our AI cannot engage with that model, the tool is either redundant or noise.

Our answer is the Common Ground, which is how Marvin stays in sync with what an analyst already knows.

What analysts actually do during coverage

Coverage is not a series of one-off research tasks. It is a running state that gets updated whenever something new arrives.

A typical analyst covers 40 to 60 names. Each one has a storyline in their head: what the business does, how it makes money, what management has guided to, where the risks sit, and where their view diverges from consensus. When a new press release or 8-K hits the tape, the analyst does not start from scratch. They compare the new content to what they already believed, flag the differences, and decide whether the differences are material enough to revisit the thesis.

Most of the new content is not material. A scheduled dividend announcement confirms what was already guided. A press release on a regional office move does not move the model. The work is triage: separating the lines that reinforce the existing view, the lines that contradict it, and the handful that introduce something the analyst did not know.

That triage is the bottleneck. An analyst covering 50 names reads through dozens of filings, presentations, trade press items, and transcripts in a typical week. Read everything in full and the rest of the coverage list falls behind. Read only the summaries and the one sentence that matters gets lost.

Why generic AI fails this job

Drop a 10-Q into a generic chat and ask for a summary. It will produce a competent summary of the 10-Q. It will not tell you what is different about this 10-Q compared to the last one, or compared to the guidance given on the last earnings call, or compared to the analyst's own model.

The generic chat has no persistent context on the company. Every document arrives as if it were the first. You get the business description re-explained and things you already knew repeated, while the three lines that actually moved the view go unflagged. Those lines only look material when read against what came before, and the chat has no idea what came before.

We have written elsewhere about why generic AI misses analyst priorities: equity research requires prioritization, validation, and persistent context that general-purpose models are not built to maintain. The Common Ground is how we handle the persistent context.

The Common Ground framework

The Common Ground is the set of facts our AI Marvin knows about a company at a given point in time. Basic identifiers like the CEO's name, the fiscal year end, and the reporting segments sit alongside specifics like the guidance range given for segment operating margin on last quarter's call, the named competitors in the risk factors, and the working capital assumptions in the last cash flow walk. It is a structured model of the company, not a vector store of document chunks.

Three properties make it useful in research:

  1. It is cumulative. Each new primary source adds to it. Nothing is thrown away.
  2. It is dated. Every fact is tagged with the source document and the date it was asserted. A CFO who stepped down in 2024 is recorded as CFO until that date, not deleted.
  3. It is contested. When two sources disagree, both are recorded. The Common Ground holds the contradiction explicitly rather than silently picking one.
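
The three properties above can be sketched as a small data structure. This is a minimal illustration in Python, not Marvin's actual schema; all field and class names are invented for the example:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass(frozen=True)
class Assertion:
    """One statement from one primary source, e.g. 'CFO is Jane Doe'."""
    value: str
    source: str        # document the assertion came from
    asserted_on: date  # date the source asserted it

@dataclass
class Fact:
    """A single topic in the Common Ground, e.g. 'CFO'.

    Cumulative: every assertion is appended, nothing is deleted.
    Dated: each entry carries its source and date.
    Contested: disagreeing assertions coexist side by side.
    """
    topic: str
    history: list[Assertion] = field(default_factory=list)

    def assert_value(self, value: str, source: str, on: date) -> None:
        self.history.append(Assertion(value, source, on))

    def current(self) -> Assertion:
        """Most recent assertion wins for display; history is kept."""
        return max(self.history, key=lambda a: a.asserted_on)

    def is_contested(self) -> bool:
        """True when sources disagree as of the most recent date."""
        latest = self.current().asserted_on
        values = {a.value for a in self.history if a.asserted_on == latest}
        return len(values) > 1
```

With a structure like this, a CFO who stepped down in 2024 remains in `history` with the dates during which the assertion held, rather than being overwritten.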

The same fact tends to show up in many representations. The full-year capex guidance might be delivered as a single sentence in the CFO's prepared remarks on the earnings call, restated as a bullet on page 4 of the earnings release, expanded on in the Q&A when an analyst probes the ramp, and later footnoted in the 10-Q. Each of those is a separate primary source with its own wording, emphasis, and context. Marvin recognizes all four as instances of the same underlying fact, links them together, and records which source said what. The Common Ground is organized around facts, not documents, which is why the extraction step has to look past the packaging.
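To make the fact-centric organization concrete, here is a toy sketch of how four differently worded statements of the same capex guidance could collapse onto one canonical fact while preserving every source. In production the canonicalization step would be an extraction model; here the topic and period are assumed to be already identified, and all names are illustrative:

```python
# Collapse multiple representations of one fact onto a single key,
# keeping each source's own wording.

def canonical_key(topic: str, period: str) -> tuple[str, str]:
    """Toy canonicalization: normalize the already-extracted topic."""
    return (topic.lower().strip(), period)

facts: dict[tuple[str, str], list[dict]] = {}

def record(topic: str, period: str, wording: str, source: str) -> None:
    facts.setdefault(canonical_key(topic, period), []).append(
        {"wording": wording, "source": source}
    )

record("Capex guidance", "FY2025",
       "we expect full-year capex of roughly $4.2 billion",
       "earnings call, prepared remarks")
record("Capex guidance", "FY2025",
       "Full-year capex: ~$4.2B",
       "earnings release, p.4")
record("capex guidance", "FY2025",
       "the ramp implies about $4.2B for the year",
       "earnings call, Q&A")
record("Capex guidance", "FY2025",
       "capital expenditures of $4.2 billion",
       "10-Q footnote")
```

All four records end up under one key, so a question about capex guidance retrieves the fact once, with four linked sources behind it.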

What happens when a new document arrives

Every time Marvin ingests a new primary source, whether that is a 10-K, a press release, an earnings call transcript, or a regulatory filing, it runs the same process against the Common Ground.

Step one: extract the facts

[Diagram: New Primary Source → Assertions Identified → Structured Facts. Step 1: facts are extracted from the primary source.]

Marvin pulls the assertions out of the new document. A line in the MD&A stating that segment revenue grew 12% is a fact. A forward-looking statement that management expects full-year capex of $4.2B is a fact. A mention that the CFO will retire at year-end is a fact.

Step two: compare against the Common Ground

Each extracted fact is matched against what Marvin already knows. For every fact, the result falls into one of three categories:

  • Confirmation. The new fact agrees with the existing Common Ground. Nothing changes in the model, but the confirmation itself is logged. This is useful for guidance tracking: when a quarterly result comes in within the range management guided to two quarters ago, the alignment is recorded.
  • Contradiction. The new fact disagrees with something in the existing Common Ground. This is the flag an analyst cares about. A prior statement that the company would not pursue acquisitions in Europe, followed by an announcement of a European acquisition, is a contradiction.
  • New information. The fact addresses something the Common Ground had no prior position on. A new named competitor, a new segment disclosure, a new customer concentration risk.
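
The three-way comparison can be sketched in a few lines. This is a deliberately simplified illustration: real matching would compare structured values such as ranges, units, and dates, not strings, and the example topics are invented:

```python
from enum import Enum

class Match(Enum):
    CONFIRMATION = "confirmation"
    CONTRADICTION = "contradiction"
    NEW = "new"

def classify(topic: str, value: str, common_ground: dict[str, str]) -> Match:
    """Compare one extracted fact against the Common Ground."""
    if topic not in common_ground:
        return Match.NEW                  # no prior position on this topic
    if common_ground[topic] == value:
        return Match.CONFIRMATION         # agrees with what was known
    return Match.CONTRADICTION            # disagrees with what was known
```

For example, against a Common Ground holding "no acquisitions planned" under a Europe M&A topic, an announced European acquisition classifies as a contradiction, a restated "no acquisitions planned" as a confirmation, and a first-ever customer concentration disclosure as new information.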

Step three: update the Common Ground

Confirmations strengthen existing facts. Contradictions become contested facts, with both versions preserved and dated. New information is added.

Step four: surface what matters

The contradictions and the genuinely new information are what the analyst needs to see. The confirmations can stay quiet unless asked about.
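
Steps three and four together amount to: apply every classified fact to the store, but only put contradictions and new information in front of the analyst. A minimal sketch, with an invented store shape (topic mapped to a value history):

```python
def apply_and_triage(common_ground: dict[str, list[str]],
                     extracted: list[tuple[str, str]]) -> list[str]:
    """Update the store with each (topic, value) fact and return only
    the lines the analyst needs to see. Confirmations are recorded
    but stay quiet."""
    flagged = []
    for topic, value in extracted:
        history = common_ground.setdefault(topic, [])
        if not history:
            history.append(value)
            flagged.append(f"NEW: {topic}: {value}")
        elif value == history[-1]:
            history.append(value)   # confirmation: logged, not surfaced
        else:
            history.append(value)   # contradiction: both versions preserved
            flagged.append(
                f"CONTRADICTION: {topic}: {value} "
                f"(previously: {history[-2]})"
            )
    return flagged
```

A scheduled dividend announcement that matches the prior disclosure updates the store silently; a changed dividend or a first-time CFO retirement notice comes back in the flagged list.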

Why this changes the output

When an analyst asks Marvin about a new filing, the response is framed against the Common Ground rather than against a blank page. The analyst does not get a summary of the 10-Q. They get a summary of what is new or different in the 10-Q relative to everything previously filed, guided, or disclosed. The lines that changed the picture sit at the top. The confirmation that margin guidance was reiterated is one sentence, not three paragraphs. The restated business description is absent, because the analyst does not need it.

The same framing carries through when the analyst asks a more specific question. "How has management talked about China exposure over the last two years?" is a query against the Common Ground, not against a pile of documents. The answer traces each claim back to the source and date it was asserted, which means the analyst can see the trajectory rather than the most recent statement in isolation.
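
Because every assertion carries its source and date, a trajectory question reduces to sorting and formatting the fact's history. A sketch with invented example data:

```python
from datetime import date

# Hypothetical history for a "China exposure" topic; each entry is
# (date asserted, source, claim).
history = [
    (date(2023, 5, 2),  "earnings call Q1'23", "China demand remains strong"),
    (date(2023, 11, 2), "earnings call Q3'23", "seeing softness in China"),
    (date(2024, 5, 1),  "earnings call Q1'24", "China stabilizing at a lower level"),
]

def trajectory(entries):
    """Return each claim with its source and date, oldest first, so the
    answer shows the path of the narrative rather than only the most
    recent statement."""
    return [f"{d.isoformat()} ({src}): {claim}"
            for d, src, claim in sorted(entries)]
```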

What the Common Ground is not

The Common Ground is not the analyst's mental model. We do not try to reconstruct an analyst's private view of a stock, their valuation framework, their read on management credibility, or the conversations they have had with the company's IR team. Those stay with the analyst.

What the Common Ground does cover is everything the company has publicly asserted about itself, continuously and at a level of detail that manual tracking cannot match. It runs alongside the analyst's private model. Our job is to read every word of every new document, compare it against the full history, and surface the pieces that deserve attention. The analyst's judgement sits on top.

What this means in practice

For analysts running larger coverage lists, the triage stage compresses. The dozens of primary documents arriving each week reduce to a handful of flagged contradictions and genuinely new items. The judgement calls stay where they belong, and the mechanical work of tracking what everyone has said against what they said before gets automated.

Ramping up on a new name gets easier too. Instead of reading the last eight quarters of filings from scratch to reconstruct the picture, the analyst can start from the current state of the Common Ground and drill into the specific threads that matter for their thesis. The cold-start problem, which we also wrote about in structured AI for equity research, gets smaller.

The analyst is already doing most of the work. The tool's job is the part that does not scale.

by Alex Hoffmann

Alex is the co-founder and CEO of Marvin Labs. Prior to that, he spent five years in credit structuring and investments at Credit Suisse. He also spent six years as co-founder and CTO at TNX Logistics, which exited via a trade sale. In addition, Alex spent three years in special-situation investments at SIG-i Capital.
