Cloudflare just released a tool that scans your website and tells you how ready it is for AI agents. It is called Is It Agent Ready and it evaluates emerging standards like robots.txt Content Signals, Markdown negotiation, MCP, OAuth, Agent Skills and more.
My blog scored 50 out of 100. With vanilla PHP, MariaDB, shared hosting and zero frameworks. No protected APIs, no OAuth, no MCP Server Card. And that score is exactly what it should be.
What the tool evaluates
The test checks 10 standards. Some make sense for any website. Others are exclusive to SaaS platforms with public APIs. The mistake would be trying to implement all 10 without understanding which ones apply to your case.
The 10 standards it evaluates are: Content Signals in robots.txt, Markdown negotiation for agents, API Catalog discovery, OAuth/OIDC metadata, OAuth Protected Resource, MCP Server Card, Agent Skills discovery, WebMCP, Link headers RFC 8288, and llms.txt.
What I implemented (and why)
Out of the 10 standards, I implemented 4. These are the ones that make sense for a personal blog that wants to be cited by AI models.
First, llms.txt. A file at the domain root that tells AI models who I am, what I write about and which pages matter. I have had it for months and it is the foundation of my entire Generative Engine Optimization strategy. I built a free generator so anyone can create their own.
Second, Content Signals in robots.txt. One line that tells AI crawlers my content usage preferences. Mine says: do not train models with my content, yes show it in search results, yes use it as input for AI responses. It is the difference between "train on my work for free" and "cite me when you answer questions about my topics."
Third, Markdown for Agents. When an AI agent requests a page with the Accept: text/markdown header, my server returns a clean Markdown version instead of HTML. Agents process Markdown more efficiently than HTML. The implementation was a PHP file that converts post content automatically and an .htaccess rule that detects the header.
Fourth, Agent Skills discovery. A JSON file at /.well-known/agent-skills/index.json that describes the tools available on my site: the llms.txt generator, the GEO Tarot, the sitemap and llms.txt itself. Agents can discover what the site offers without navigating every page.
What I did not implement (and why)
The other 6 standards do not apply to a blog. Implementing them would be like putting airbags on a bicycle.
OAuth/OIDC discovery and OAuth Protected Resource are for sites with protected APIs that require authentication. My blog has no login and no private APIs. There is nothing to protect with tokens.
API Catalog (RFC 9727) is for publishing a catalog of public APIs with endpoints, documentation and health status. My blog has interactive tools but does not expose REST APIs.
MCP Server Card is a draft standard for agents to discover MCP servers. It is not even finalized. Implementing something that could change tomorrow is voluntary technical debt.
WebMCP is an experimental Chrome feature for exposing site actions to agents through the browser. Almost nobody implements it yet.
Link headers (RFC 8288) are HTTP headers for resource discovery. Nice to have but zero impact on a blog's AI visibility.
How to implement the 4 that matter
All four changes can be implemented in less than an hour. You do not need frameworks or external dependencies.
For llms.txt, create a Markdown file at your domain root with your name, description, topics and key pages. My free generator walks you through it step by step.
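Following the llms.txt convention (an H1 with your name, a blockquote summary, then sections of links), a minimal file might look like this. The name, description and URLs below are placeholders, not my actual file:

```markdown
# Jane Doe's Blog

> Personal blog about vanilla PHP, web performance and Generative Engine Optimization.

## Key pages

- [About](https://example.com/about): who I am and what I write about
- [GEO guide](https://example.com/geo-guide): my most-referenced article
```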
For Content Signals, add one line to your robots.txt:
Content-Signal: ai-train=no, search=yes, ai-input=yes
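In context, the signal sits inside a user-agent group in robots.txt. This is a sketch with a placeholder sitemap URL; adjust the crawl directives and signal values to your own preferences:

```
User-agent: *
Content-Signal: ai-train=no, search=yes, ai-input=yes
Allow: /

Sitemap: https://example.com/sitemap.xml
```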
For Markdown for Agents, you need a PHP script that detects the Accept: text/markdown header and returns your content converted, plus an .htaccess rule:
RewriteEngine On
# Only rewrite when the client explicitly asks for Markdown
RewriteCond %{HTTP_ACCEPT} text/markdown
# Guard against rewriting the negotiator script itself in a loop
RewriteCond %{REQUEST_URI} !markdown-negotiator\.php
RewriteRule ^(.*)$ markdown-negotiator.php [L,QSA]
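The negotiator script itself can be small. The sketch below is not my production file: the html_to_markdown() helper is an assumption (a converter like the league/html-to-markdown package would play that role), and the commented usage shows the general shape only:

```php
<?php
// markdown-negotiator.php -- sketch of Accept-header negotiation.
// html_to_markdown() is an assumed helper, not a PHP built-in.

function wants_markdown(string $accept): bool {
    // True when the client's Accept header mentions text/markdown.
    return stripos($accept, 'text/markdown') !== false;
}

// Hypothetical usage in the front controller:
// if (wants_markdown($_SERVER['HTTP_ACCEPT'] ?? '')) {
//     header('Content-Type: text/markdown; charset=utf-8');
//     echo html_to_markdown($postHtml); // convert the post body
//     exit;
// }
```

A substring match is deliberately loose here; a stricter implementation would parse the Accept header's media types and q-values before deciding.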
For Agent Skills, create a JSON file listing your site resources and serve it at /.well-known/agent-skills/index.json.
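The schema here is still settling, so treat the following as an illustrative sketch of the shape such a file can take, with placeholder names, descriptions and URLs rather than my actual entries:

```json
{
  "skills": [
    {
      "name": "llms-txt-generator",
      "description": "Free tool that generates an llms.txt file step by step",
      "url": "https://example.com/llms-txt-generator"
    },
    {
      "name": "sitemap",
      "description": "XML sitemap listing every post",
      "url": "https://example.com/sitemap.xml"
    }
  ]
}
```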
Why 50/100 is the right score
Not all standards apply to all sites. A personal blog does not need OAuth. An online store does not need MCP. The perfect score is not 100. It is the maximum that makes sense for your case.
50/100 for a blog running vanilla PHP on shared hosting means I am implementing everything an AI agent needs to discover, understand and cite my content. Nothing more, nothing less.
Most blogs do not even have llms.txt. Most do not know Content Signals exist. Most do not return Markdown when an agent asks for it. If your blog has those four things, you are more prepared for AI agents than 95% of sites on the internet.
You can test your own site at isitagentready.com. No account needed. Just your URL.
The next SEO battlefield is not rankings. It is citations. And citations start with structure.
I also wrote about this topic: Claude Code Tutorial: How a Designer Built a Website from Scratch (2026).