Stop Making AI Optimizations That Don't Actually Work

Serving Markdown to crawlers risks cloaking penalties. Adding llms.txt won't boost visibility. Separate fact from hype in AI optimization.


The last few months have been confusing. What should we do with our websites to prepare them for AI? What should we avoid? Finding the right answers is harder than ever because respected authorities keep contradicting one another.

To figure out what actually improves visibility across AI search and tools, we need to go deeper. We need to understand how AI models work, what server logs reveal about crawler behavior, and how to track AI visibility. I covered that last topic in the previous episode; it's complex because tracking is neither easy nor cheap, and it isn't suitable for every website.

So let's talk about some popular "optimizations" that are doing more harm than good.

Serving Markdown Instead of Standard Pages

Probably the most controversial suggestion I've heard recently is serving Markdown to AI crawlers instead of traditional web pages. It sounds interesting at first. But the problem runs deeper than most people realize.

Crawlers don't just extract text. They evaluate websites as a whole. That's how they distinguish authority and pick up on signals that plain Markdown simply cannot provide.

One of the biggest risks here is cloaking. When you serve different content based on the user agent, Google can remove your website from traditional search entirely. That's a risk no one should take lightly.

On the other hand, Markdown can save a significant number of tokens, with reductions often reaching up to 80%. Cloudflare recently announced Markdown for Agents, but wider adoption by other companies is still missing.
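The safer pattern here is worth spelling out: Cloudflare's approach lets the client opt in to Markdown through HTTP content negotiation (the Accept header) rather than the server guessing from the user agent. A minimal sketch of that decision logic follows; the function name and header handling are my own illustrative assumptions, not any vendor's actual implementation.

```python
# Sketch: content negotiation by Accept header instead of user-agent
# sniffing. Because the client explicitly asks for Markdown, everyone
# requesting the default URL still gets the same HTML, which is what
# distinguishes this from cloaking.

def negotiate_format(accept_header: str) -> str:
    """Return 'markdown' only when the client explicitly asks for it."""
    # Accept headers look like "text/html,text/markdown;q=0.9";
    # split on commas, then drop any ";q=..." quality parameters.
    accepted = [part.split(";")[0].strip().lower()
                for part in accept_header.split(",")]
    if "text/markdown" in accepted:
        return "markdown"
    return "html"  # default: serve the normal page to everyone else
```

A crawler that never sends `Accept: text/markdown` simply gets the standard HTML, so there is no divergence for Google to penalize.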

For now, the risks outweigh the benefits.

llms.txt and llms-full.txt

These two files are not an official standard yet. Based on my monitoring of server logs from multiple websites across different languages, sizes, and industries, I can tell you that none of the current big players are using these files.

The story with llms.txt is simple. You can create it in a few minutes and leave it on your server until AI crawlers start to use it. It's a nice "set it and forget it" addition (ideally, update it whenever your content changes).
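For illustration, the proposed llms.txt format (per the llmstxt.org proposal) is a plain Markdown file: an H1 site title, a short blockquote summary, and H2 sections listing key links with one-line descriptions. The site name and URLs below are made up:

```markdown
# Example Company

> Example Company sells widgets and publishes a technical blog
> about widget engineering.

## Docs

- [Getting started](https://example.com/docs/start.md): Installation and setup
- [API reference](https://example.com/docs/api.md): Endpoints and parameters

## Optional

- [Blog archive](https://example.com/blog): Older posts
```

The whole file for a typical site fits in a few dozen lines, which is why creating it costs almost nothing even while crawler support remains hypothetical.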

The story with llms-full.txt is much worse.

For small websites with a dozen pages, creating and updating this file is an easy task, and AI crawlers could plausibly use it because the file would stay well below 1M tokens. But on larger websites, it becomes impractical: generating that much content at once is an extremely heavy task for servers to process.

There's also a second problem: the token input limits of current AI models. For most larger websites, this file would be far over 1M tokens; in many cases it could exceed 100M or even 1B tokens. That makes it completely unusable. You wouldn't even be able to open it in current tools and browsers.
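A quick back-of-envelope calculation shows why the numbers blow up. The figures below (page counts, words per page, and a rough 1.3 tokens-per-word ratio) are illustrative assumptions, not measurements from any real site:

```python
# Rough estimate of llms-full.txt size in tokens. All inputs here are
# assumed example values; the ~1.3 tokens-per-word ratio is a common
# rule of thumb for English text, not an exact tokenizer count.

def estimate_tokens(pages: int, words_per_page: int,
                    tokens_per_word: float = 1.3) -> int:
    return int(pages * words_per_page * tokens_per_word)

# A dozen-page site stays far below a 1M-token context window.
small_site = estimate_tokens(12, 800)          # 12,480 tokens

# A 100k-page site is two orders of magnitude past that limit.
large_site = estimate_tokens(100_000, 1_000)   # 130,000,000 tokens
```

Even a mid-sized site with a few thousand pages lands in the millions of tokens, which no current model can ingest in one request.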

Client-Side Schema Is Invisible to AI

I've noticed multiple services that use AI to generate Schema implementation for your website or e-shop. All you need to do is add one piece of JavaScript from the service's domain.

While this approach can work in traditional search because Google is able to render JavaScript, Google still doesn't recommend implementing Schema this way.

For AI, it's even worse. The majority of AI crawlers still don't render JavaScript at all. That means they won't see those JSON-LD schemas. You're paying for an expensive service that is invisible to the very systems you're trying to optimize for.
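The difference is easy to demonstrate. A crawler that doesn't execute JavaScript only sees JSON-LD that is present in the raw HTML response; schema injected later by a script tag never materializes for it. The sketch below simulates that view using only Python's standard-library HTML parser; it is a hypothetical check, not any crawler's actual implementation, and the page snippets are made up:

```python
# Sketch: what a non-JS-rendering crawler "sees" in the raw HTML.
from html.parser import HTMLParser

class JsonLdFinder(HTMLParser):
    """Flags any <script type="application/ld+json"> in the raw markup."""
    def __init__(self):
        super().__init__()
        self.found = False

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self.found = True

def has_server_side_schema(raw_html: str) -> bool:
    finder = JsonLdFinder()
    finder.feed(raw_html)
    return finder.found

# Schema baked into the HTML: visible without rendering JavaScript.
static_page = ('<html><head><script type="application/ld+json">'
               '{"@type":"Article"}</script></head></html>')

# Schema injected by a third-party script: invisible until JS runs.
js_page = ('<html><head><script src='
           '"https://schema-service.example/inject.js"></script></head></html>')
```

Running `has_server_side_schema` on the two snippets returns `True` for the static page and `False` for the JS-injected one, which is exactly the gap those paid services leave open.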

What Should You Actually Do?

So what works and what doesn't? The simplest advice I can give to all website owners and managers is this: follow SEO standards. Technical SEO, in particular, is even more important for AI crawlers than it is for traditional search.

On LinkedIn, X, and other social platforms, you can see millions of self-proclaimed "AI Experts" who have no clue how things actually work. So as I said last time, don't give in to FOMO. Focus on fundamentals. They still matter the most.

Martin Stepanek

Technical SEO & Web Performance Consultant

With 10+ years building and optimizing websites, I've learned that technical excellence drives business success. I help companies maximize their website's potential through strategic technical SEO and performance improvements that create better experiences for users and stronger results for businesses.
