openclaw.ai ✓ verified
AI Visibility: ✓ check completed — level L2
AI Capability: ✓ check completed — level L1
L1 Basic Accessibility — 6/6

— AI crawling allowed: Major AI crawlers (GPTBot, ClaudeBot, PerplexityBot, Google-Extended) are permitted to access your site.
— Page content directly readable: Main content is visible in the HTML source, not only rendered after JavaScript executes.
— Clear title and description: The page has a clear title and meta description, helping AI quickly identify the topic.
— Reasonable response time: The page responds quickly enough to avoid AI crawl failures or timeouts.
— HTTPS secured: The site uses a valid HTTPS certificate.
— Content is not gated: Core content isn't blocked by login walls, membership gates, or paywalls.

L2 Content Comprehensibility — 5/6

— Has structured data: Uses Schema.org / JSON-LD to help AI understand page content more accurately.
— Has social sharing info: Open Graph tags provide supplementary title and summary information.
— Clear canonical address: A canonical URL tells search engines and AI which version of the URL is authoritative.
— Clear heading hierarchy: The page has a clear H1 and uses H2/H3 headings to organize content logically.
— Language declared correctly: The HTML lang attribute is set, helping AI identify the page language.
— Substantial content: The page has meaningful text content, not just a few sentences of boilerplate.

L3 Discoverability — 3/6

— Provides a sitemap: An accessible XML sitemap helps AI and search engines discover your pages.
— Sitemap stays updated: The sitemap includes recent pages and isn't neglected over time.
— Clear internal linking: Key content pages are easily reachable from the homepage and main pages.
— Clean, readable URLs: Page URLs clearly reflect the content topic, rather than being cryptic parameter strings.
— Provides llms.txt: A /llms.txt file proactively tells LLMs which content is most worth paying attention to.
— Consistent canonical setup: The canonical tag points to the current page's standard address, avoiding duplicate-page confusion.

L4 Trust & Authority — 4/6

— Organization info is clear: Structured data includes basic info like company/organization name, website, and logo.
— About and contact info visible: Both users and AI can easily find your contact or about page.
— Content source is clear: Pages attribute content to an author, team, or organization.
— Publication dates are clear: Pages include publish or update dates, helping assess content freshness.
— Legal info is complete: The site has essential pages like privacy policy and terms of service.
— Proper security configuration: Basic security response headers are set, reflecting site maintenance quality.

L5 AI-Optimized — 0/6

— Has FAQ / HowTo / Q&A structure: Page content is structured for AI to directly extract answers.
— Has breadcrumb structure: Helps AI understand the page's position and hierarchy within the site.
— Has review information: Products, services, or content include structured Review/Rating data.
— Supports multiple languages: Multilingual pages have clear corresponding relationships, such as hreflang tags.
— Richer structured data: Uses multiple effective Schema.org types, not just one.
— Clear content block structure: Pages contain FAQs, tables, lists, definitions, etc., making it easy for AI to extract and summarize.

AI Readiness Report


Executive Summary

OpenClaw.ai has a solid technical foundation for AI discoverability but lacks advanced optimization, limiting its visibility to AI systems. Its API infrastructure is robust for programmatic use, yet it fails to provide key discovery and integration files that would enable seamless agent interaction. The site is functional but not fully optimized for the modern AI ecosystem.

AI Visibility — L2

The site is fundamentally crawlable and its content is clear, but it lacks structured data (Schema.org) and a sitemap, which hinders AI's ability to deeply understand and confidently recommend its pages. Missing trust signals like organization details and publish dates further reduce its authority score with AI systems.

AI Capability — L1

While the site offers a well-structured API with support for write operations and real-time features, it fails to advertise these capabilities to AI agents through standard discovery files like robots.txt, sitemaps, or an agent card. This makes it difficult for autonomous agents to find and integrate with the service programmatically.

A Visibility score of 2/5 means AI systems can find the site but are unlikely to feature it prominently in recommendations. A Capability score of 1/5 indicates AI agents would struggle to discover and autonomously use the site's APIs, missing opportunities for automation and integration.

Top Issues

CRITICAL Missing Basic Structured Data capability · L1 · developer

Why: AI systems rely on structured data to accurately understand the meaning and entities on a page. Without it, your content is ambiguous and harder for AI to process.

Impact: AI assistants like ChatGPT are less likely to cite your site or understand your product correctly, reducing referral traffic and brand authority.

Fix: Add JSON-LD structured data blocks to key pages (e.g., homepage, product page). Use types like WebSite, Organization, and SoftwareApplication. For example, add a <script type="application/ld+json"> tag in the <head> of your homepage with basic site and company info.
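As a sketch, a minimal JSON-LD block for the homepage might look like the following. The organization name, URLs, and logo path are placeholders to replace with real values:

```html
<!-- Placed in the <head> of the homepage. All values below are
     illustrative placeholders, not the site's actual data. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "name": "OpenClaw",
      "url": "https://openclaw.ai/",
      "logo": "https://openclaw.ai/logo.png"
    },
    {
      "@type": "WebSite",
      "name": "OpenClaw",
      "url": "https://openclaw.ai/"
    }
  ]
}
</script>
```

Using a single "@graph" array keeps related entities (site, company, product) in one block instead of scattering multiple script tags.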

HIGH Lacks Schema.org for Content Clarity visibility · L2 · developer

Why: Schema.org markup provides explicit context about your page's content, helping AI systems parse and interpret it with higher accuracy.

Impact: AI summaries and answers derived from your site will be less precise, potentially misrepresenting your services and reducing user trust in AI-provided information about you.

Fix: Implement JSON-LD structured data on all major content pages. Identify the primary content type for each page (e.g., Article for blog posts, SoftwareApplication for product pages) and add the corresponding Schema.org script block.
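For a blog post, an Article block could look like this sketch; the headline, dates, and author name are illustrative placeholders:

```html
<!-- Example Article markup for a blog post page. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example post title",
  "datePublished": "2024-01-15",
  "dateModified": "2024-02-01",
  "author": { "@type": "Organization", "name": "OpenClaw" }
}
</script>
```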

CRITICAL Missing or Restrictive robots.txt File capability · L2 · devops

Why: A robots.txt file controls which AI crawlers and search engines can access your site. Without it, or if it blocks key AI agents, your content is invisible to those systems.

Impact: Major AI crawlers (e.g., GPTBot, ClaudeBot) may be blocked from indexing your site, preventing your content from being included in AI training data and knowledge bases.

Fix: Ensure a /robots.txt file exists and is accessible. Explicitly allow key AI crawlers with per-agent directives, such as 'User-agent: GPTBot' on one line and 'Allow: /' on the next, and likewise for ClaudeBot. Avoid blanket 'Disallow: /' rules.
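A minimal /robots.txt along these lines might be the following; the Sitemap URL assumes a sitemap is published at /sitemap.xml:

```txt
# /robots.txt — explicitly allow major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

# Everyone else: allow by default
User-agent: *
Allow: /

Sitemap: https://openclaw.ai/sitemap.xml
```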

HIGH No XML Sitemap Published capability · L2 · devops

Why: An XML sitemap is a roadmap for AI crawlers and search engines to discover all important pages on your site efficiently.

Impact: AI systems will struggle to find and index your content beyond the homepage, leading to incomplete representation of your site in AI knowledge bases.

Fix: Generate and publish a valid XML sitemap at a standard location like /sitemap.xml. Ensure it includes all indexable URLs (pages, blog posts, docs). Submit it via your site's robots.txt file using a 'Sitemap:' directive.
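A skeletal /sitemap.xml could look like this sketch; the URLs and lastmod dates are placeholders for the site's real indexable pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://openclaw.ai/</loc>
    <lastmod>2024-06-01</lastmod>
  </url>
  <url>
    <loc>https://openclaw.ai/docs</loc>
    <lastmod>2024-06-01</lastmod>
  </url>
</urlset>
```

Most static-site generators and CMS platforms can emit this file automatically, which also keeps the lastmod dates current.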

HIGH XML Sitemap Not Accessible visibility · L3 · devops

Why: Even if a sitemap exists, it must be publicly accessible and correctly formatted for AI crawlers to use it for discovery.

Impact: Reduces the speed and completeness with which AI systems can discover new or updated content, delaying its inclusion in AI responses.

Fix: Verify that /sitemap.xml is publicly accessible (returns a 200 status code) and contains valid XML. Ensure it's referenced in your robots.txt file and that no robots meta tags or headers block access to it.

Quick Wins

Missing or Restrictive robots.txt File — Ensure a /robots.txt file exists and is accessible. Explicitly allow key AI crawlers with per-agent directives, such as 'User-agent: GPTBot' on one line and 'Allow: /' on the next, and likewise for ClaudeBot. Avoid blanket 'Disallow: /' rules. (devops)
No XML Sitemap Published — Generate and publish a valid XML sitemap at a standard location like /sitemap.xml. Ensure it includes all indexable URLs (pages, blog posts, docs). Submit it via your site's robots.txt file using a 'Sitemap:' directive. (devops)
XML Sitemap Not Accessible — Verify that /sitemap.xml is publicly accessible (returns a 200 status code) and contains valid XML. Ensure it's referenced in your robots.txt file and that no robots meta tags or headers block access to it. (devops)
Missing Organization Schema — Add a JSON-LD Organization block to your homepage. Include required properties: '@type': 'Organization', 'name', 'url', and 'logo'. Ensure the logo URL points to a publicly accessible image. (developer)
Missing LLMs.txt Guidance File — Create a plain text file at /llms.txt. Format it as Markdown: start with '# Site Name', followed by a '> tagline', then sections like '## Docs' or '## API' with bullet-point links to key pages. Do not use robots.txt syntax. See llmstxt.org for spec. (content)
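Following the llmstxt.org format described above, a starter /llms.txt might look like this; the tagline, section names, and links are illustrative:

```markdown
# OpenClaw
> One-line tagline describing what the product does.

## Docs
- [Getting started](https://openclaw.ai/docs/getting-started): setup guide

## API
- [API reference](https://openclaw.ai/docs/api): endpoints and authentication
```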

30-Day Roadmap

Week 1: Quick Wins

— Create and publish a /robots.txt file that explicitly allows key AI crawlers (e.g., GPTBot, ClaudeBot) and avoids blanket 'Disallow: /' rules.

— Generate, publish, and verify a valid XML sitemap at /sitemap.xml, ensuring it includes all indexable URLs and is referenced in robots.txt.

— Create a /llms.txt file formatted as Markdown with site name, tagline, and sections linking to key pages (e.g., Docs, API).

Capability L1 → L2, Visibility L2 → L3

Week 2: Foundation

— Add a JSON-LD Organization structured data block to the homepage with required properties: '@type', 'name', 'url', and 'logo'.

— Add JSON-LD structured data blocks (e.g., WebSite, SoftwareApplication) to the homepage and core product pages.

Capability L2 → L3, Visibility L3 → L4

Weeks 3-4: Advanced

— Implement JSON-LD structured data on all major content pages, identifying and using the correct Schema.org type for each (e.g., Article for blog posts, SoftwareApplication for product pages).

Visibility L4 → L5

By addressing foundational crawlability and structured data, the site can realistically achieve AI Capability Level 3 and AI Visibility Level 5 within 30 days.

Embed badge — L2 AI Visibility, L1 AI Capability

AI Visibility — markdown:

[![ReadyforAI](https://readyforai.dev/badge/openclaw.ai)](https://readyforai.dev/websites/openclaw.ai)

AI Capability — markdown:

[![ReadyforAI](https://readyforai.dev/badge/openclaw.ai?track=capability)](https://readyforai.dev/websites/openclaw.ai)