"First off, a quick clarification: I'm actually building @DealerInt, not Horizon! We aren't an AI SEO tool; we're building the core SaaS infrastructure that dealerships run on.
I'm in this thread because, as an infrastructure builder, I see the root cause of the 'Catch-22' you're describing: the underlying legacy tech is fundamentally broken.
You’re 100% right that global discovery comes first. But discovery doesn't require a paid centralized hub. It requires Infrastructure Sovereignty.
The Hybrid Stack Approach:
Discovery (The Signal): We leverage standard, free rails, like Google Merchant Center feeds and server-side rendered (SSR) Schema.org structured data, to put the dealer on the map. This is how Perplexity, OpenAI, and Anthropic already 'know' a car arrived yesterday without the dealer paying an 'Aggregator Tax.'
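For concreteness, here is a minimal sketch of what that SSR structured data could look like, rendered as Schema.org JSON-LD so crawlers and AI agents can read inventory without executing client-side JavaScript. The helper name and every field value below are invented placeholders, not our actual implementation:

```python
import json

# Illustrative sketch: a Schema.org Vehicle listing serialized as JSON-LD.
# All values (vehicle, VIN, price) are placeholders.
def vehicle_jsonld(name, vin, price, currency="USD", in_stock=True):
    return {
        "@context": "https://schema.org",
        "@type": "Vehicle",
        "name": name,
        "vehicleIdentificationNumber": vin,
        "offers": {
            "@type": "Offer",
            "price": str(price),
            "priceCurrency": currency,
            # Schema.org availability lets an agent see stock status
            # straight from the page markup.
            "availability": "https://schema.org/InStock"
            if in_stock else "https://schema.org/OutOfStock",
        },
    }

# Server-side, this JSON would be embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(vehicle_jsonld("2022 Honda Civic EX", "1HGFC2F59NH000000", 24500), indent=2))
```

The point is that a fast, clean page serving this markup is discoverable by any search API on day one, with no aggregator in the middle.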
Execution (The Handshake): The MCP layer is the 'verification' step. Once discovery points the agent to the site, the agent makes a tool call to verify price and live status directly against the dealer's database in real time.
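A rough sketch of that handshake, written as plain Python rather than an actual MCP SDK. The tool name, input schema, and inventory store here are all hypothetical, invented to show the shape of the exchange:

```python
# Illustrative only: mimics an MCP-style tool call in which an agent
# verifies price and availability against the dealer's live database.
# Tool name, schema, and data are hypothetical.
TOOL_SPEC = {
    "name": "verify_listing",
    "description": "Return live price and status for a VIN.",
    "inputSchema": {
        "type": "object",
        "properties": {"vin": {"type": "string"}},
        "required": ["vin"],
    },
}

# Stand-in for the dealer's database.
INVENTORY = {
    "1HGFC2F59NH000000": {"price": 24500, "status": "available"},
}

def verify_listing(vin: str) -> dict:
    """Handle the tool call: look up the VIN and return its current state."""
    row = INVENTORY.get(vin)
    if row is None:
        return {"vin": vin, "status": "not_found"}
    return {"vin": vin, "price": row["price"], "status": row["status"]}

# Discovery got the agent to the door; the tool call confirms the
# listing is real before the agent quotes it to a buyer.
print(verify_listing("1HGFC2F59NH000000"))
```

The design choice worth noting: discovery data can be hours stale, but the tool call is authoritative, so the agent never has to trust a cached price.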
The 'Discovery Tax' only exists today because 92% of local dealer domains are too technically bloated to be crawled effectively by anything other than a massive aggregator. According to the recent PageSpeed Study, nearly 70% of these sites are in the 'Poor' tier.
That’s why we believe the first step isn't just an llms.txt file; it’s building a clean, fast pipe that those Search APIs actually want to crawl.
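To show what I mean by "not just an llms.txt file": following the proposed llms.txt convention (an H1 title, a blockquote summary, then sections of links), a dealer's file might look like the sketch below. The dealership name and URLs are placeholders:

```markdown
# Example Dealership

> Live inventory for Example Dealership. Prices and availability are
> verified in real time; the links below point to machine-readable pages.

## Inventory

- [New vehicles](https://example.com/inventory/new.md): current new stock
- [Used vehicles](https://example.com/inventory/used.md): current used stock
```

The file itself is trivial; the hard part is making sure the pages it points at are clean and fast enough to be worth crawling.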
I don't have a marketing white paper for you, but as we continue building out this architecture, I'm planning to share some of our Technical Benchmarks, comparing the crawl latency of legacy stacks to our clean-pipe environment, right here on the forum. I think 'Building in Public' is the best way to prove that the problem isn't the AI agents; it's the legacy infrastructure."