GEO (Generative Engine Optimization) measures how well your website is prepared to appear in AI-generated answers from ChatGPT, Gemini, Claude, and Perplexity. We check three dimensions that determine whether AI engines can find, understand, and cite your content.
AI crawler access
We check your robots.txt for all major AI crawlers — GPTBot (OpenAI), ClaudeBot (Anthropic), Google-Extended (Gemini), PerplexityBot, Bytespider, CCBot, and Amazonbot. If you block them, your content won't appear in AI answers.
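Python's standard library can express this kind of check. A minimal sketch, using the user-agent tokens listed above against a sample robots.txt (no network access shown):

```python
from urllib import robotparser

# User-agent tokens of the AI crawlers listed above.
AI_BOTS = ["GPTBot", "ClaudeBot", "Google-Extended", "PerplexityBot",
           "Bytespider", "CCBot", "Amazonbot"]

# Sample robots.txt: one AI crawler blocked, everything else allowed.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for bot in AI_BOTS:
    status = "allowed" if rp.can_fetch(bot, "https://example.com/") else "blocked"
    print(f"{bot}: {status}")
```

In a real check you would fetch https://yourdomain/robots.txt and feed its lines to parse().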
A proposed standard that tells AI models about your site's content and preferences — like robots.txt but for LLMs. We check if your site has one and whether it's properly formatted.
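For illustration, a minimal llms.txt in the shape proposed at llmstxt.org: an H1 with the site name, a blockquote summary, and sections of links (all names and URLs here are hypothetical):

```markdown
# Example Company

> Example Company builds widgets. This site documents our products and APIs.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): Getting started guide
- [API reference](https://example.com/docs/api.md): Endpoints and parameters

## Optional

- [Blog](https://example.com/blog.md): Product announcements
```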
JSON-LD Organization markup tells AI exactly who you are — your brand name, logo, URL, and social profiles. Without it, AI may not recognize your brand as a distinct entity or may confuse it with others.
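As a sketch, minimal Organization markup might look like this (the brand values are hypothetical); it is embedded in the page inside a script tag with type application/ld+json:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "sameAs": [
    "https://x.com/example",
    "https://www.linkedin.com/company/example"
  ]
}
```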
If your brand name differs between your title tag, Open Graph tags, and schema markup, AI models may treat them as separate entities. We check that all three sources match.
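A rough sketch of this consistency check in Python, using naive regex extraction over a hypothetical page snippet (real pages need a proper HTML parser and smarter title-separator handling):

```python
import json
import re

# Hypothetical page snippet containing the three brand-name sources.
html = """
<title>Acme Corp - Home</title>
<meta property="og:site_name" content="Acme Corp">
<script type="application/ld+json">{"@type": "Organization", "name": "Acme Corp"}</script>
"""

# Naive title parsing: take everything before the first separator character.
title = re.search(r"<title>([^<|-]+)", html).group(1).strip()
og = re.search(r'property="og:site_name" content="([^"]+)"', html).group(1)
schema = json.loads(re.search(
    r'<script type="application/ld\+json">(.*?)</script>', html, re.S).group(1))["name"]

consistent = title == og == schema
print(title, og, schema, consistent)
```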
Rich schema types like FAQPage, HowTo, Article, and Product make your content eligible for featured AI answers. We check which types you have and which high-value ones you're missing.
AI engines prefer content that's easy to extract — clear headings, FAQ blocks with question-answer pairs, lists, tables, and sufficient word count. We measure all of these.
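A heuristic sketch of such measurements using Python's built-in HTMLParser (the signals tallied here are illustrative, not our exact scoring logic):

```python
from html.parser import HTMLParser

class ExtractabilityScan(HTMLParser):
    """Rough tally of signals that are easy for AI engines to extract."""
    def __init__(self):
        super().__init__()
        self.counts = {"headings": 0, "list_items": 0, "tables": 0}
        self.words = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts["headings"] += 1
        elif tag == "li":
            self.counts["list_items"] += 1
        elif tag == "table":
            self.counts["tables"] += 1

    def handle_data(self, data):
        self.words += len(data.split())

scan = ExtractabilityScan()
scan.feed("<h1>Guide</h1><ul><li>One</li><li>Two</li></ul><p>Short answer here.</p>")
print(scan.counts, scan.words)
```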
Your GEO score is split into three categories, each measuring a different aspect of AI readiness. The total is out of 100.
Can AI crawlers access and understand your site? This covers robots.txt, llms.txt, sitemap.xml, meta tags (title, description, canonical, language), and whether the page has a robots meta tag blocking indexing.
The biggest single factor is llms.txt (7 points) — most sites don't have one yet, so adding it gives an immediate score boost. AI crawler access is worth up to 9 points across 8 bots.
Can AI clearly identify your brand? This covers Organization schema (with logo and social links), Open Graph completeness, Twitter Cards, brand name consistency across meta tags, and author signals.
Organization schema is the highest-value item (up to 12 points) because it's the primary signal AI uses to disambiguate your brand from similarly named entities.
Is your content structured for AI extraction? This covers JSON-LD structured data types (especially FAQPage, HowTo, Article), heading hierarchy, FAQ content detection, word count, and presence of lists and tables.
Rich schema types (FAQPage, HowTo, Article) are worth up to 7 points. FAQ content with 5+ questions earns up to 5 points. These are the most actionable items — adding a FAQ section with proper schema can boost your score significantly.
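As an illustration, a minimal FAQPage JSON-LD block with one question-answer pair (a real section would include five or more):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is GEO?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "GEO (Generative Engine Optimization) is the practice of optimizing a website to appear in AI-generated answers."
      }
    }
  ]
}
```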
The tool is simple — enter a domain, get a detailed report with scores and prioritized recommendations.
Type any domain — example.com, blog.example.com. The checker fetches the homepage and analyzes robots.txt, llms.txt, sitemap.xml, and the HTML content.
Technical Readiness, Entity Readiness, and Answer Readiness are scored independently. Each section shows exactly what was found and what's missing, so you know where to focus.
Prioritized recommendations tell you exactly what to fix first. High-priority items (like adding Organization schema or unblocking AI crawlers) have the biggest impact on your AI visibility.
You can access the GEO checker programmatically through our REST API or MCP integration.
POST https://api.nslookup.io/v1/geo/check
Content-Type: application/json

{
  "domain": "example.com"
}

# Returns: scores, technical/entity/answer readiness,
# AI crawler status, structured data, recommendations

# In Claude, Cursor, or any MCP-compatible client:
# "Check the GEO score for example.com"
# Returns full AI readiness report in your conversation
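A sketch of the same call from Python using only the standard library (authentication, if any is required, is not shown in the excerpt above):

```python
import json
import urllib.request

payload = json.dumps({"domain": "example.com"}).encode()
req = urllib.request.Request(
    "https://api.nslookup.io/v1/geo/check",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment to send the request (requires network access):
# with urllib.request.urlopen(req) as resp:
#     report = json.load(resp)
#     print(report["recommendations"])  # field name assumed from the docs above
```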
A one-time check shows where you stand today. Monitoring tracks your GEO score over time — so you can see the impact of changes, catch regressions, and compare against competitors. nslookup.io checks your domains daily and stores the history.
Start free GEO monitoring →
If your organization relies on web traffic, lead generation, or brand visibility, GEO directly affects your pipeline. AI-generated answers are replacing traditional search clicks, and the sites that appear in those answers are the ones that follow GEO best practices.
You control robots.txt, server headers, and deployment pipelines. One misconfigured robots.txt rule can block all AI crawlers. GEO checks ensure your infrastructure doesn't silently kill AI visibility after a deploy.
Structured data, JSON-LD schemas, meta tags, and heading hierarchy are code-level decisions. GEO validates that your markup is correct and complete — catching missing Organization schema, broken Open Graph tags, or heading structure issues.
Traditional SEO tools don't check AI readiness. GEO fills the gap — showing whether your content has FAQ blocks, extractable answer paragraphs, and sufficient depth to be cited by AI. The recommendations are directly actionable.
AI crawlers have different access patterns than traditional search bots. GEO shows exactly which AI bots can access your site, whether your robots meta tag blocks indexing, and if llms.txt exposes information you didn't intend to share.
AI engines now drive a growing share of how users discover products, documentation, and services. When ChatGPT or Gemini answers a question about your product, it either cites your site — or your competitor's. The difference often comes down to technical implementation: proper schema markup, correct robots.txt configuration, and content that AI can parse.
GEO is not marketing fluff. It's a technical checklist of signals that AI systems use when deciding which content to cite. Every item in the GEO report maps to a concrete code change — add a JSON-LD block, fix a heading hierarchy, update a meta tag. If you've ever debugged why Google wasn't indexing a page, GEO is the same discipline applied to AI engines.
GEO (Generative Engine Optimization) is the practice of optimizing your website to appear in AI-generated answers from search engines and AI assistants like ChatGPT, Google Gemini, Claude, and Perplexity. It's the AI equivalent of SEO.
SEO optimizes for traditional search engine rankings (the "10 blue links"). GEO optimizes for AI-generated answers where your content is cited or referenced. The technical foundations overlap (meta tags, structured data, crawlability), but GEO adds new dimensions: AI crawler access, llms.txt, entity clarity, and content extractability.
No. Google's regular search crawler (Googlebot) is separate from Google-Extended (used for Gemini AI). You can block Google-Extended while keeping Googlebot allowed. However, blocking Google-Extended means your content won't appear in Gemini-powered AI answers or Google AI Overviews.
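For example, a robots.txt that blocks Google-Extended while keeping Googlebot allowed:

```text
User-agent: Google-Extended
Disallow: /

User-agent: Googlebot
Allow: /
```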
llms.txt is a proposed standard (similar to robots.txt) that provides AI models with a structured description of your website's content, purpose, and preferences. It helps LLMs understand your site without crawling every page. See llmstxt.org for the specification.
After any significant content or technical change. For ongoing monitoring, our daily automated checks will track your score over time and show trends. Most sites see score changes when they modify structured data, update robots.txt, or add/remove content sections.
Currently the checker analyzes the homepage. For page-specific analysis, we recommend checking the domain and using the recommendations to improve your site-wide AI readiness, which benefits all pages.