
I Audited 100 VDPs to See Which Buyer Questions Go Unanswered

jackcarlson
Nov 19, 2025
I audited 100 VDPs, not to see what information they contain, but to see what questions they actually answer.

My thesis: buyers aren’t leaving dealer sites due to a lack of data; they’re leaving because their questions aren’t resolved, and they head to Google, Reddit, and ChatGPT to fill the gaps.

The setup (brief)

I started by analyzing open-source AI prompt data and Google search query data to understand the real questions shoppers ask while researching vehicles: the same questions they pose to search engines, forums, and AI tools.

Then I audited 100 live VDPs to see whether those questions are:
  • Answered clearly on the VDP
  • Answered in a way a human or AI can easily interpret
  • Or left unanswered, forcing the buyer to leave the site

What the data showed

1. Basics are solved

This part is working extremely well:
  • Year / trim / package clarity: ~96% covered (assuming these are accurate; verifying them was out of scope for my analysis)
  • Basic purchase confidence (price, CTA): ~96% covered
  • Mileage visibility: ~95% covered
Inventory systems are doing their job. Cars are published cleanly and consistently.

2. The biggest failure: “Why this car?”

This is the most consistent weakness in the entire dataset:
  • Comparisons & alternatives: ~67% of VDPs fail to help buyers understand how this car compares to similar options
  • 0% of VDPs fully covered this category
Dealers explain what a car is.
They almost never explain why a buyer should choose it over something else.

3. Price is visible, value is not

Even when pricing is present:
  • ~24% of VDPs fail to justify or contextualize the price

4. Ownership confidence is cautiously handled

Ownership confidence (durability, expectations, risk reduction): ~15% gap rate
Mileage is usually shown.
But guidance around ownership expectations (service history, durability context, reassurance) is inconsistent.

Buyers are actively seeking this elsewhere.

A pattern that showed up everywhere

Across nearly every low-performing VDP, I saw the same thing:
  • Pages lead with basic information buyers already assume
  • Long, unprioritized feature dumps
  • Copy-pasted, unformatted ChatGPT-style markdown blocks
  • Massive bullet lists (FM radio, hill-hold assist, door trim, etc.)
  • Heavy boilerplate and disclaimers
This is either information buyers take for granted, or information that’s so in the weeds it doesn’t help them decide.

It’s not how humans read, and it’s not how decisions are made.

The result: lots of text, very little clarity.

In contrast, the strongest VDPs delivered:
  • Clear answers
  • Less noise
  • More meaning per word
Listing data is not the same as answering questions.

Relevance to today

Modern buyers don’t just read VDPs, they ask AI and search engines to interpret them.

AI does like structured data, but what it really rewards is pages that clearly answer common user questions.

When VDPs don’t do that, the dealer loses control of:
  • The narrative
  • The comparison
  • The buyer’s trust
And the buyer finishes the decision somewhere else.

How scoring works (high level)

Each VDP is evaluated on answer quality, not content presence:
  • Does the page directly answer common buyer questions?
  • Is the information readable and skimmable?
  • Does it provide insight and context, not just specs?
  • Does it reduce the need for the buyer to leave and research elsewhere?
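The post doesn't publish the scoring mechanics, but the four criteria above suggest a simple rubric. Here's a minimal sketch of how such an answer-quality score could be computed; the criterion names, the 0–2 scale, and the equal weighting are my assumptions, not the author's actual tooling:

```python
# Hypothetical rubric: each criterion scored 0 (absent), 1 (partial), 2 (clear).
CRITERIA = [
    "answers_common_questions",   # directly answers common buyer questions
    "readable_and_skimmable",     # information is readable and skimmable
    "insight_not_just_specs",     # provides insight and context, not just specs
    "reduces_need_to_leave",      # reduces need to research elsewhere
]

def answer_quality(scores: dict) -> float:
    """Average the 0-2 criterion scores into a 0-100 answer-quality score."""
    total = sum(scores[c] for c in CRITERIA)
    return round(100 * total / (2 * len(CRITERIA)), 1)

def gaps(scores: dict) -> list:
    """Criteria scored 0, i.e. buyer questions the page leaves unanswered."""
    return [c for c in CRITERIA if scores[c] == 0]

# Example: a typical "digital brochure" VDP -- specs present, no comparisons.
vdp = {
    "answers_common_questions": 1,
    "readable_and_skimmable": 2,
    "insight_not_just_specs": 0,
    "reduces_need_to_leave": 0,
}
print(answer_quality(vdp))  # 37.5
print(gaps(vdp))            # ['insight_not_just_specs', 'reduces_need_to_leave']
```

The point of scoring per criterion rather than pass/fail is that it surfaces exactly which decision-stage question each page fails on, which matches the category-level gap rates reported above.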

The takeaway

Most VDPs are digital brochures, not digital salespeople.

They handle inventory well.
They struggle with the consultative part of the sale: comparisons, value framing, and ownership confidence, which is exactly where buyers feel the most friction.

If any dealers want me to:
  • Run this audit on their own VDP
  • See where buyers are likely leaving their site
  • Get clear, actionable recommendations on how to improve answer quality
I’m happy to share what I’ve learned.

Full meta analysis: VDP Meta Analysis
Sample for an individual VDP: Audit Report: 2014 Chevrolet Cruze 1LT Alliance OH | Lavery Automotive Sales and Service 1G1PC5SBXE7465037

Thanks for reading! I hope you find some value in this. Open to any questions.
Regards,
Jack
 
A couple of quick questions ...

1. How do you know why they leave?

Do you have:
  • Session recordings
  • Exit surveys
  • Clickstream data
  • Correlation between unanswered questions and exits
  • A/B test results
If not, then wouldn’t this just be a guess:
“Buyers aren’t leaving due to lack of data — they’re leaving because their questions aren’t resolved”

Why this matters

Buyers might leave because:
  • They always cross-check on third-party sites (Cars.com, Reddit, KBB)
  • They don’t trust dealer-controlled content by default
  • They want social proof, not explanations
  • They’re comparison shopping
You could be correct, or it could just be a guess.

2. “What are the real questions shoppers ask?”

Do you have a list of questions?

3. Is this actually proof of anything?

What it does prove

  • VDPs are optimized for data completeness, not decision-making
  • Feature dumps dominate
  • Comparison and context are rare

What it does not prove

  • That unanswered questions are the primary reason for exits
  • That fixing this alone would improve conversion
  • That buyers would trust dealer-provided comparisons
  • That AI “rewards” this in rankings or lead quality
 
@DjSec

Good questions. Let me clarify what this research is and isn’t.

1. “How do you know why buyers leave?”
I’m not claiming direct causality. I don’t have session replays, exit surveys, or A/B tests tied to these pages. If someone wants to call my thesis a hypothesis rather than a proven causal statement, that’s fair.

What I do have is a very consistent pattern across ~100 live VDPs:
  • Dealers are excellent at publishing inventory data
  • Dealers are weak at answering decision-stage questions
  • Those same questions show up repeatedly in Google searches, forums, and AI prompts
So the claim isn’t “this is the only reason buyers leave.” It’s that there’s a clear gap between what buyers ask and what VDPs answer. That gap almost certainly contributes to leakage, even if buyers also cross-check third-party sites by default.

2. “Buyers might leave for other reasons.”
Absolutely. Buyers comparison shop, seek social proof, and don’t automatically trust dealer content.

I believe this reinforces the point. If buyers are already doing those things, and VDPs don’t even try to address comparisons, value, or ownership context, dealers are effectively opting out of that part of the decision.

3. “What are the real questions shoppers ask?”
Yes, there’s a defined set, which I can share if you're curious. It comes from open-source AI prompt data and Google query trees (People Also Ask via tools like AlsoAsked).
Common examples:
  • Is this car worth buying?
  • What are common problems with this model or year?
  • Is this a good deal compared to similar cars?
  • What should I watch out for before buying?
  • What are better alternatives?
The audit checks whether those questions are answered clearly on the VDP or not.
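As a purely illustrative sketch of what "checks whether those questions are answered" could look like at its crudest, here is a keyword-signal pass over a page's copy. The category names and signal phrases are my own invention, and a real audit would need far more nuance than substring matching; this only shows the shape of the check:

```python
import re

# Hypothetical signal phrases per question category -- a crude proxy for
# whether a VDP's copy even attempts to address that buyer question.
QUESTION_SIGNALS = {
    "worth_buying": ["worth", "value", "deal"],
    "common_problems": ["known issue", "common problem", "recall"],
    "alternatives": ["compared to", "versus", "alternative", "similar"],
}

def categories_addressed(page_text: str) -> dict:
    """Flag each category True if any of its signal phrases appear."""
    text = page_text.lower()
    return {
        cat: any(re.search(re.escape(kw), text) for kw in kws)
        for cat, kws in QUESTION_SIGNALS.items()
    }

copy = "Priced below market -- a great value compared to similar sedans."
print(categories_addressed(copy))
# worth_buying and alternatives flagged; common_problems not addressed
```

Even this toy version reproduces the headline finding: boilerplate-heavy pages trip the "worth buying" signals while leaving problems and alternatives entirely unaddressed.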

4. “Is this actually proof of anything?”
It does show that VDPs are optimized for data completeness, not decision-making. Feature dumps and boilerplate dominate, while comparison, value framing, and ownership context are rare.

It does not claim this alone drives conversion, guarantees trust, or replaces third-party sites.

My thesis is narrow: buyers aren’t leaving because dealers lack data; they’re leaving because VDPs rarely resolve the questions buyers are already asking elsewhere.

It's an observable gap, not a closed-loop attribution claim. There is certainly a lot at play here!