GEO & SEO
Make your business visible where people actually search—Google, ChatGPT, Perplexity, and Bing.
The search landscape has fractured
When someone asks ChatGPT “best A.I. consultants in the Hudson Valley,” or asks Perplexity “dispensaries near Beacon NY,” the answer comes from a generative engine—not a list of ten blue links. Traditional SEO still matters, but it’s no longer sufficient. Generative Engine Optimization (GEO) is the practice of structuring your digital presence so that A.I. tools cite you, recommend you, and surface you in conversational search results.
Most businesses haven’t adapted. Their websites are invisible to A.I./LLM crawlers, their structured data is missing or malformed, and their content doesn’t answer the questions these engines are trained to prioritize. Businesses that move first on GEO will own conversational search in their category for years—the same way early SEO adopters dominated Google’s first page in the 2010s.
What I do
- Technical SEO audit—Crawl your site for broken links, missing meta, slow Core Web Vitals, and indexing gaps across Google, Bing, and A.I. search engines (ChatGPT, Perplexity, Claude).
- GEO optimization—Structure your content, metadata, and llms.txt files so A.I. engines can parse, cite, and recommend your business in conversational search results.
- Structured data implementation—Add JSON-LD schema markup (Organization, LocalBusiness, Product, FAQ, HowTo) so search engines and A.I. tools understand what you do, not just what you say.
- Content strategy—Identify the questions your customers are asking A.I. tools, then create or restructure content to answer them directly and authoritatively.
- Competitive intelligence—Map your visibility against competitors across Google, Bing, ChatGPT, and Perplexity, and identify the gaps you can own.
- Platform audits—Claim, optimize, and monitor your listings on the platforms your customers actually search (Weedmaps, Leafly, Yelp, Google Business Profile).
Proof of concept
worldwarwatcher.com—I built this site in early March 2026. It’s already ranking #1 on Google and #1 on Bing for its URL. That’s the velocity these tools enable when the technical foundation, content strategy, and structured data are aligned from day one.
What you get
- Full technical audit report with prioritized recommendations
- GEO-optimized site structure and llms.txt implementation
- Structured data markup, deployed and validated
- Content roadmap targeting A.I. search queries in your vertical
- Monthly ranking reports across Google, Bing, ChatGPT, and Perplexity
- Ongoing optimization as A.I. search algorithms evolve
GEO implementation checklist
| Check | Purpose | Status |
|---|---|---|
| llms.txt | Structured index for AI engines | Required |
| robots.txt AI directives | Allow GPTBot, ClaudeBot, PerplexityBot | Required |
| JSON-LD schema | Organization, LocalBusiness, Service | Required |
| skill.md | Machine-readable capability manifest | Recommended |
| agent-permissions.json | API access rules for AI agents | Recommended |
| Markdown mirrors | .md versions of key pages | Recommended |
```shell
# Verify your llms.txt is accessible to AI crawlers
curl -s https://yoursite.com/llms.txt | head -20

# Audit structured data with the structured-data-testing-tool CLI
npx structured-data-testing-tool https://yoursite.com

# Check AI crawler access via robots.txt
curl -s https://yoursite.com/robots.txt | grep -A2 "GPTBot\|ClaudeBot"
```
Results
| Metric | Before GEO | After GEO | Change |
|---|---|---|---|
| Google ranking | Not indexed | #1 for URL | — |
| Bing ranking | Not indexed | #1 for URL | — |
| Perplexity | Not cited | Cited as source | — |
| AI crawler access | Blocked | 21 crawlers allowed | +21 |
| Structured data schemas | 0 | 5 types deployed | +5 |
| llms.txt | Missing | Deployed (1K tokens) | — |
| Agentic-SEO score | 55/100 (D) | 90/100 (A) | +35 pts |
Measured on worldwarwatcher.com and minoanmystery.org, both built with the same GEO methodology.
Frequently asked questions
What is Generative Engine Optimization (GEO)?
GEO is the practice of structuring your website so that A.I. platforms—ChatGPT, Perplexity, Claude, Gemini—can find, understand, and cite your business in conversational search results. It covers structured data, AI-readable content formats like llms.txt, crawler access configuration, and content strategy for the questions people ask AI tools.
How is GEO different from traditional SEO?
Traditional SEO optimizes for Google’s ranking algorithm: keywords, backlinks, page speed, meta tags. GEO optimizes for how A.I. models consume and cite your content: structured data, machine-readable formats, direct answers to natural-language questions, and explicit crawler permissions. You need both—GEO doesn’t replace SEO, it extends it.
Which A.I. platforms does GEO target?
ChatGPT (GPTBot), Perplexity (PerplexityBot), Google’s AI Overviews (Google-Extended), Claude (ClaudeBot), Bing Chat (Bingbot), and Amazon’s product search (Amazonbot). Each has its own crawler and content preferences. A proper GEO implementation covers all of them.
How long does it take to see results?
Technical implementation (structured data, llms.txt, robots.txt, skill.md) takes 1–2 weeks. Ranking improvements in traditional search follow normal SEO timelines (1–3 months). A.I. citation improvements can appear faster—within days of proper crawler access and structured data deployment—because AI models re-index frequently.
Do I still need traditional SEO if I implement GEO?
Yes. 92% of A.I. Overview citations come from pages already in the top 10 of traditional search results. GEO makes your content citable by AI, but traditional SEO gets you into the pool of sources AI models draw from. They’re complementary, not interchangeable.
What is llms.txt and why does my business need one?
llms.txt is a structured text file (similar to robots.txt) that tells A.I. systems what your site is, what it offers, and where to find key content. Over 844,000 sites have adopted it, and sites with llms.txt see 2.3× higher AI citation rates. It’s the single fastest GEO win for most businesses.
Implementation deep-dives
How to create an effective llms.txt
Your llms.txt should live at the root of your domain (/llms.txt) and follow a structured format: a title line, a brief description, then sections with markdown links to your key pages. Include token count annotations so AI systems know the document’s size before fetching it. Keep it under 5,000 tokens—concise enough to fit in a single context window. Link to an extended version (llms-full.txt) for comprehensive content.
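A minimal llms.txt following that format might look like the sketch below (the business name, URLs, and token counts are placeholders, not a real site):

```markdown
# Example Business

> A.I. consulting and GEO/SEO services for small businesses in the Hudson Valley.

## Services
- [GEO & SEO](https://example.com/services/geo-seo.md): Generative Engine Optimization and technical SEO (~800 tokens)
- [Consulting](https://example.com/services/consulting.md): A.I. strategy and implementation engagements (~600 tokens)

## Optional
- [Full content](https://example.com/llms-full.txt): Extended version for large context windows
```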
Configuring robots.txt for AI crawlers
Each AI crawler needs its own User-agent block in robots.txt—grouping them causes parser failures. The essential crawlers to allow: GPTBot, ChatGPT-User, ClaudeBot, Claude-Web, PerplexityBot, Google-Extended, GoogleOther, Amazonbot, cohere-ai, and Bytespider. Block access to API routes and build artifacts (/api/, /_astro/) but allow any public-facing API endpoints you want AI tools to use.
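A robots.txt fragment following that pattern, with a separate User-agent block per crawler (the blocked paths are examples; adjust to your build setup):

```text
# Allow OpenAI's crawler
User-agent: GPTBot
Allow: /
Disallow: /api/
Disallow: /_astro/

# Allow Anthropic's crawler
User-agent: ClaudeBot
Allow: /
Disallow: /api/
Disallow: /_astro/

# Allow Perplexity's crawler
User-agent: PerplexityBot
Allow: /
Disallow: /api/
Disallow: /_astro/

# Repeat for ChatGPT-User, Claude-Web, Google-Extended, GoogleOther,
# Amazonbot, cohere-ai, and Bytespider
```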
JSON-LD structured data for AI engines
AI engines don’t just read your text—they parse your structured data to understand what your business is. At minimum, implement Organization (or LocalBusiness), Service, BreadcrumbList, and FAQPage schemas. Use @id entity linking so schemas reference each other (e.g., your Service schema’s provider links to your Person @id). Validate with Google’s Rich Results Test before deploying.
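A stripped-down sketch of the @id entity linking described above, using placeholder names and URLs, where the Service schema's provider resolves to the Person node in the same @graph:

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Person",
      "@id": "https://example.com/#person",
      "name": "Jane Consultant",
      "url": "https://example.com"
    },
    {
      "@type": "Service",
      "@id": "https://example.com/#geo-service",
      "name": "Generative Engine Optimization",
      "serviceType": "SEO",
      "provider": { "@id": "https://example.com/#person" }
    }
  ]
}
```

Embed this in a `<script type="application/ld+json">` tag and validate it before deploying.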
Creating a skill.md capability manifest
skill.md is a machine-readable document that tells AI agents what your site can do. It uses YAML frontmatter (name, description) and a structured body covering capabilities, required inputs, constraints, rate limits, and documentation links. This is the emerging standard from the agent-skills format (25K+ GitHub stars). Place it at /skill.md in your public directory.
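A skeletal skill.md along those lines (the capability entries and rate limit are illustrative values, not part of any standard):

```markdown
---
name: example-business-site
description: What this site offers and how A.I. agents can interact with it.
---

## Capabilities
- Look up service descriptions, pricing, and availability pages

## Inputs
- Plain-text queries; no authentication required for public pages

## Constraints
- Rate limit: 60 requests/minute (example value)

## Documentation
- https://example.com/llms.txt
```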
The businesses that show up in A.I. search results today will be the default answers tomorrow—and the window to claim that ground is still open.
The proof
Two sites built with these exact techniques, with live rankings you can verify yourself: worldwarwatcher.com and minoanmystery.org.
Ready to start? Get in touch today.
The first call is free. 30 minutes, no pitch—just your stack, your problems, and what I'd do about them. Book a call.