AI Readiness Report
zhimao.cc
Levels are cumulative: a site must pass L1 before reaching L2, and L2 before L3, and so on.
Executive Summary
The website zhimao.cc has a foundational technical setup that allows AI crawlers to access its basic content, but it lacks the structured data and discoverability features needed for AI systems to effectively understand, recommend, or interact with it. Its primary strengths are a crawlable, fast-loading site with clear contact information, but it fails to implement most modern AI optimization and integration standards. The site is missing significant opportunities to be found and utilized by AI agents and recommendation engines.
AI Visibility — L1
AI crawlers can find and read the site's basic content, but the site lacks the structured metadata, clear site architecture, and trust signals that AI systems use to confidently understand and recommend content. Without features like sitemaps, schema markup, and clear content attribution, the site is unlikely to be surfaced as a high-quality source by AI assistants like ChatGPT or Perplexity.
AI Capability — L1
The site is readable by AI agents but offers no programmatic interfaces for them to use. There is no public API, no structured data for automation, and no documented methods for agents to perform actions or retrieve data systematically. This makes the site a passive information source rather than an active service an AI agent could integrate with or automate.
A score of 1/5 in both categories indicates the site is merely accessible to AI, not optimized for it. Practically, this means the site is missing out on being recommended by AI search tools and cannot be used by AI agents for tasks like data retrieval, booking, or customer service automation.
Top Issues
Thin text content on key pages
Why: AI systems and search engines need substantial, relevant text to understand the page's topic and value. Pages with only boilerplate or minimal text are often ignored.
Impact: AI assistants will not recommend or summarize your site's content, drastically reducing referral traffic and brand visibility in AI-generated answers.
Fix: Audit key pages (homepage, product/service pages, blog) and ensure each has 300-500 words of unique, descriptive text that clearly explains your offerings, value proposition, and key information.
No JSON-LD structured data on the homepage
Why: JSON-LD structured data is the primary way to explicitly tell AI systems about the entities on your page (e.g., your company, products, articles). Without it, AI must guess, often incorrectly.
Impact: AI cannot reliably extract key facts about your business, products, or content, making it invisible for rich results and structured answers. This is a foundational failure for AI readiness.
Fix: Add a JSON-LD script block to the <head> of your homepage. Start with a basic 'Organization' schema (name, url, logo) and a 'WebSite' schema. Extend to other page types (e.g., 'Article' for blog posts).
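A minimal block implementing this fix might look like the following; the organization name and logo path are placeholder assumptions to replace with the site's real values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "name": "Zhimao",
      "url": "https://zhimao.cc/",
      "logo": "https://zhimao.cc/logo.png"
    },
    {
      "@type": "WebSite",
      "name": "Zhimao",
      "url": "https://zhimao.cc/"
    }
  ]
}
</script>
```

Placing both types in a single `@graph` keeps one script block per page rather than several.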
No Schema.org markup on content pages
Why: Schema.org markup provides a clear, structured signal about page content, helping AI disambiguate topics and understand context beyond plain text.
Impact: Reduces the accuracy with which AI systems categorize and recommend your content, leading to missed opportunities in AI search and chat responses.
Fix: Implement JSON-LD structured data on all key content pages. Use types relevant to the content (e.g., 'Article' for blog posts, 'SoftwareApplication' for a product page). Use Google's Rich Results Test or the Schema Markup Validator (validator.schema.org) to validate.
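For a blog post, an 'Article' block would follow the same pattern; every value below (headline, date, URL) is illustrative, not taken from the actual site:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example post title",
  "datePublished": "2024-01-01",
  "author": { "@type": "Organization", "name": "Zhimao" },
  "mainEntityOfPage": "https://zhimao.cc/blog/example-post"
}
</script>
```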
No XML sitemap for crawler discovery
Why: An XML sitemap is a roadmap of your site's important pages. AI crawlers and search engines use it to efficiently discover and index content they might otherwise miss.
Impact: New or deep-linked pages may never be found by AI systems, limiting the scope of content they can access and recommend from your site.
Fix: Generate an XML sitemap (e.g., using a CMS plugin or a sitemap generator tool). Place it at the standard location (e.g., /sitemap.xml). Ensure it's referenced in your robots.txt file and submit it to search consoles.
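A minimal sitemap conforming to the sitemaps.org protocol looks like the following; the URLs and dates are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://zhimao.cc/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://zhimao.cc/about</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```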
No sitemap for programmatic agent access
Why: For AI agents and automated systems, a sitemap is a critical API-like endpoint that programmatically lists available resources, enabling efficient site exploration.
Impact: AI agents cannot systematically discover your site's structure or content, preventing them from performing comprehensive analysis or data retrieval tasks.
Fix: Create and publish a valid XML sitemap at /sitemap.xml. Ensure it follows the sitemaps.org protocol, lists all important URLs, and is accessible without authentication. Update it automatically when content changes.
30-Day Roadmap
Week 1: Quick Wins
— Add a JSON-LD 'Organization' schema block to the homepage <head> (name, url, logo).
— Add basic Open Graph meta tags (og:title, og:description, og:image, og:url, og:type) to the <head> of every page.
— Add a <link rel="canonical" href="[full-page-url]" /> tag to the <head> of every page.
— Add the 'lang' attribute (e.g., 'zh-CN') to the site's opening <html> tag.
Visibility L1 → L2, Capability L1 → L2
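Taken together, the Week 1 items amount to a small addition to each page's <head>. The following sketch shows all four in place; the title, description, image path, and URLs are placeholder assumptions to replace with real per-page values:

```html
<!DOCTYPE html>
<html lang="zh-CN">
<head>
  <meta charset="utf-8">
  <title>Zhimao</title>
  <!-- Open Graph tags (placeholder values) -->
  <meta property="og:title" content="Zhimao">
  <meta property="og:description" content="One-sentence description of this page.">
  <meta property="og:image" content="https://zhimao.cc/og-image.png">
  <meta property="og:url" content="https://zhimao.cc/">
  <meta property="og:type" content="website">
  <!-- Canonical URL: the full, preferred URL of this page -->
  <link rel="canonical" href="https://zhimao.cc/">
  <!-- JSON-LD Organization schema (name, url, logo) -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Zhimao",
    "url": "https://zhimao.cc/",
    "logo": "https://zhimao.cc/logo.png"
  }
  </script>
</head>
<body>…</body>
</html>
```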
Week 2: Foundation
— Generate and publish a valid XML sitemap at /sitemap.xml, following the sitemaps.org protocol.
— Reference the sitemap in robots.txt and submit it to search consoles.
— Audit key pages (homepage, product/service pages, blog) and expand each to 300-500 words of unique, descriptive text.
Visibility L2 → L3, Capability L2 → L3
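The robots.txt reference can be as short as the following, assuming the sitemap is published at the standard location:

```
User-agent: *
Allow: /

Sitemap: https://zhimao.cc/sitemap.xml
```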
Weeks 3-4: Advanced
— Extend structured data: add a 'WebSite' schema to the homepage and implement relevant types (e.g., 'Article', 'SoftwareApplication') on all key content pages using JSON-LD.
— Validate all structured data implementations using Google's Rich Results Test or the Schema Markup Validator (validator.schema.org).
— Ensure the XML sitemap is updated automatically when content changes.
Visibility L3 → L4, Capability L3 → L4
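The "updated automatically" item can be handled by regenerating the sitemap on every publish. The following is a hypothetical sketch in Python; the URL list and the publish hook it would run from are assumptions, not part of any existing setup:

```python
# Hypothetical sketch: rebuild sitemap.xml from a list of published page URLs.
# Wire this into whatever step deploys new content so the sitemap never goes stale.
import xml.etree.ElementTree as ET
from datetime import date

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemaps.org-compliant XML string listing the given URLs."""
    ET.register_namespace("", SITEMAP_NS)  # serialize without a namespace prefix
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for url in urls:
        entry = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc").text = url
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}lastmod").text = date.today().isoformat()
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    # Illustrative URLs only; a real integration would pull these from the CMS.
    print(build_sitemap(["https://zhimao.cc/", "https://zhimao.cc/about"]))
```

Writing the returned string to /sitemap.xml on each deploy keeps the file in sync with the published content.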
By addressing foundational visibility and capability issues, the site can realistically achieve AI Visibility Level 3 and AI Capability Level 3 within 30 days.