
Why Slow Websites Never Get Recommended by ChatGPT

By 0gravity · AI Performance · ChatGPT · SEO

The way people find information is changing fundamentally. More and more users are asking ChatGPT, Perplexity, or Claude instead of Google. These AI agents search the web, evaluate websites, and give concrete recommendations — often with a single result instead of ten blue links.

This has consequences. If your website is slow, poorly structured, or hard to access, these systems will ignore it. They won't penalize it; they simply won't see it.

For Swiss businesses, the stakes are particularly high. In a market where trust and professionalism are paramount, being invisible to AI assistants means losing a rapidly growing source of qualified leads. This article explains exactly how AI crawlers evaluate website speed, what happens when your site fails those checks, and the concrete steps you can take to ensure your business gets recommended.

How AI crawlers see the web

AI crawlers like GPTBot (OpenAI), ClaudeBot (Anthropic), or PerplexityBot work differently from traditional search engine crawlers. They visit your website, download the HTML, and try to understand the content. What they look for:

  • Load speed: A crawler that doesn’t get a response within 3 seconds moves on. It has millions of other pages to visit.
  • Clean HTML: JavaScript-heavy pages that deliver no content without browser rendering are invisible to most AI crawlers.
  • Structured data: Schema.org markup, clear heading hierarchies, and semantic HTML help crawlers categorize content correctly.

This means: if your website relies on client-side JavaScript to display content, an AI crawler sees a blank page. If your server takes 4 seconds to respond, your page won’t be crawled at all.

How GPTBot differs from Googlebot

Googlebot has decades of engineering behind it. It can render JavaScript, wait for content to load, and even interact with pages. GPTBot and other AI crawlers are far less patient. They operate on a strict time budget because they need to fetch, process, and synthesize information from multiple sources in real time — often within seconds of a user asking a question.

Where Googlebot might give your page 10-15 seconds to fully render, an AI crawler typically allows 2-3 seconds at most. If your server responds slowly or your page requires JavaScript execution to display content, the crawler simply receives an empty or incomplete page and moves on to a competitor’s site that delivers content instantly.

What AI crawlers actually receive from your site

To understand why speed matters so much, consider what an AI crawler actually sees when it visits your website:

  1. The crawler sends an HTTP request to your server
  2. Your server processes the request — this involves database queries, PHP execution, or simply serving a static file
  3. The HTML response is returned — this is all the crawler gets; it typically does not execute JavaScript
  4. The crawler parses the HTML for content, structure, and metadata

If step 2 takes too long (more than 500-1,000 milliseconds), the crawler may time out or deprioritize your page. If step 3 returns minimal HTML because the content is loaded via JavaScript, the crawler sees essentially nothing.

You can test what AI crawlers see by disabling JavaScript in your browser and visiting your own website. If the page is blank or missing critical content, AI systems see the same thing.
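You can approximate this test in code as well. The sketch below uses only Python's standard library to extract the text a non-rendering crawler can read from raw HTML; the two sample pages are invented for illustration:

```python
from html.parser import HTMLParser

class VisibleText(HTMLParser):
    """Collect the text a non-rendering crawler can read from raw HTML."""
    SKIP = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self.chunks, self._skip = [], 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def crawler_view(html: str) -> str:
    parser = VisibleText()
    parser.feed(html)
    return " ".join(parser.chunks)

# A server-rendered page exposes its content in the raw HTML...
ssr = "<html><body><h1>IT Consulting Zurich</h1><p>We advise SMEs.</p></body></html>"
# ...while a client-rendered shell exposes almost nothing.
spa = '<html><body><div id="root"></div><script>renderApp()</script></body></html>'

print(crawler_view(ssr))  # → IT Consulting Zurich We advise SMEs.
print(crawler_view(spa))  # → (empty string)
```

Running this against your own homepage's raw HTML gives a quick proxy for what a JavaScript-blind crawler receives.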

Core Web Vitals as quality signals

Google’s Core Web Vitals — Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint (INP) — are more than a Google ranking factor. They are a general indicator of website quality.

AI systems use similar signals:

LCP under 2.5 seconds means the main content is available quickly. For a crawler, this means the page responds reliably and delivers content without unnecessary delay.

No layout shift means stable, predictable content. Crawlers can extract text reliably without dealing with dynamically loaded content.

Fast server response (TTFB) is the most critical factor. If the server doesn’t respond within a few hundred milliseconds, the page is simply too slow for automated systems.

Research shows that websites with a Lighthouse Performance Score above 90 appear significantly more often in AI-generated answers. This isn’t coincidence — it’s the logical consequence of how these systems index the web.

TTFB: the metric that matters most for AI crawlers

Time to First Byte (TTFB) measures how long it takes for your server to send the first byte of the response after receiving a request. For human visitors, TTFB is one factor among many. For AI crawlers, it is often the deciding factor.

Here is why: AI crawlers process thousands of pages per minute. They allocate a strict time budget per page. If your TTFB is 800 milliseconds, the crawler has already spent most of its budget before it even starts receiving content. A TTFB of 50-100 milliseconds, by contrast, means the crawler gets full content almost instantly.

TTFB benchmarks for AI visibility:

TTFB Range      AI Crawler Impact
Under 100 ms    Excellent — full content captured reliably
100-300 ms      Good — most content captured
300-800 ms      Risky — partial content, lower priority
Over 800 ms     Poor — likely skipped or timed out
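These bands can be expressed as a small lookup function, a sketch that adopts one convention at the boundary values:

```python
def classify_ttfb(ttfb_ms: float) -> str:
    """Map a measured TTFB in milliseconds to the benchmark bands above.

    Boundary values (exactly 100 ms, 300 ms, 800 ms) are assigned to the
    slower band by convention.
    """
    if ttfb_ms < 100:
        return "Excellent"
    if ttfb_ms < 300:
        return "Good"
    if ttfb_ms <= 800:
        return "Risky"
    return "Poor"

print(classify_ttfb(45))    # → Excellent
print(classify_ttfb(1200))  # → Poor
```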

Static hosting on CDNs (like Vercel, Netlify, or Cloudflare Pages) routinely achieves TTFB under 100 milliseconds because the HTML is pre-built and served from edge servers close to the user — or in this case, close to the crawler.

Real examples: slow vs fast sites in AI responses

To illustrate the difference, consider two fictional but representative Swiss businesses in the same industry — both offering IT consulting services in Zurich.

Company A: WordPress on shared hosting

  • TTFB: 1,200ms
  • Lighthouse Performance: 38
  • Client-side rendered content via Elementor
  • No structured data
  • Result: When asked “Which IT consulting firms in Zurich are recommended?”, ChatGPT never mentions Company A. The crawler either times out or receives incomplete HTML.

Company B: Astro on Vercel

  • TTFB: 45ms
  • Lighthouse Performance: 99
  • Server-rendered HTML with full content
  • Schema.org LocalBusiness markup
  • Result: Company B appears consistently in AI-generated recommendations because its content is instantly accessible and clearly structured.

This pattern repeats across industries. We have observed it with restaurants, law firms, medical practices, and e-commerce shops. The businesses that invest in performance and structure are the ones that AI systems can actually read and recommend.

How to test if your site is AI-visible

Before you can fix the problem, you need to understand where you stand. Here is a step-by-step process to test whether AI systems can see and recommend your business:

Step 1: Ask the AI directly

Open ChatGPT, Perplexity, and Claude. Ask questions that your potential customers would ask:

  • “Which [your service] in [your city] do you recommend?”
  • “What is the best [your service] provider in Switzerland?”
  • “Compare [your service] options in [your region]”

If you are not mentioned but your competitors are, you have an AI visibility problem.

Step 2: Test your TTFB

Use WebPageTest.org or your browser’s developer tools (Network tab) to measure your server response time. Anything over 500ms needs attention.
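If you prefer a scriptable check, a rough TTFB measurement can be taken with Python's standard library. This is a sketch, not a replacement for WebPageTest: a single sample is noisy, and the host in the commented usage line is a placeholder.

```python
import time
import http.client

def measure_ttfb(host, path="/", port=None, use_tls=True, timeout=5.0):
    """Time from sending a GET request until the response status line
    and headers have arrived, in milliseconds (an approximation of TTFB)."""
    cls = http.client.HTTPSConnection if use_tls else http.client.HTTPConnection
    conn = cls(host, port, timeout=timeout)
    try:
        start = time.monotonic()
        conn.request("GET", path, headers={"User-Agent": "ttfb-check/0.1"})
        conn.getresponse()  # returns once status line and headers are received
        return (time.monotonic() - start) * 1000
    finally:
        conn.close()

# Hypothetical usage against your own domain:
# print(f"{measure_ttfb('www.example.ch'):.0f} ms")
```

Run it several times and look at the median; the first request often includes DNS and TLS setup that repeat visits avoid.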

Step 3: View your site without JavaScript

In Chrome, open Developer Tools (F12), press Ctrl+Shift+P (Cmd+Shift+P on Mac), type “Disable JavaScript”, and reload your page. What you see is approximately what AI crawlers see.

Step 4: Check your Lighthouse score

Run a Lighthouse audit in Chrome DevTools or at pagespeed.web.dev. Focus on the mobile score — AI crawlers often use mobile user agents. A score below 80 indicates significant issues.

Step 5: Validate your structured data

Use Google’s Rich Results Test to check whether your Schema.org markup is correctly implemented. Missing or malformed structured data means AI systems have to guess what your business does.

Step 6: Check your robots.txt

Some websites accidentally block AI crawlers. Check your robots.txt file (yoursite.com/robots.txt) for rules that block GPTBot, ClaudeBot, or other AI user agents. While you may have reasons to block certain crawlers, blocking them all means complete AI invisibility.
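You can verify this programmatically with Python's built-in robots.txt parser. The sample file below is hypothetical: it blocks GPTBot while allowing every other crawler.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks GPTBot but allows everyone else.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

def can_crawl(robots_txt: str, agent: str, url: str = "/") -> bool:
    """Check whether the given user agent may fetch the given URL."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

print(can_crawl(ROBOTS_TXT, "GPTBot"))     # → False
print(can_crawl(ROBOTS_TXT, "ClaudeBot"))  # → True
```

To audit your live file, fetch yoursite.com/robots.txt and run each AI user agent name through the same check.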

Why WordPress websites are particularly affected

The majority of Swiss SME websites run on WordPress. The problem: a typical WordPress installation triggers 30-50 HTTP requests, takes 2-5 seconds to render the first visible content, and depends on PHP servers that slow down under load.

Add plugins that load additional JavaScript, render-blocking CSS, and fonts served from external CDNs. For a human visitor on a fast connection, this might be acceptable. For a crawler processing thousands of pages per minute, it’s a disqualifier.

Static websites, on the other hand — built with modern frameworks like Astro — deliver pre-rendered HTML in under 100 milliseconds. No PHP, no database query, no JavaScript that needs to load first. The content is immediately available, for humans and machines alike.

The WordPress performance ceiling

Even with aggressive optimization — caching plugins like WP Super Cache, CDN integration, image compression — WordPress websites hit a performance ceiling. The reason is architectural: WordPress generates pages dynamically on every request (unless cached), and the PHP execution layer adds inherent latency.

A well-optimized WordPress site can achieve a TTFB of 200-400ms and a Lighthouse score in the 70-85 range. An Astro.js site achieves 30-80ms TTFB and Lighthouse scores of 95-100 without any special optimization, because the HTML is pre-built at deploy time.

For businesses where AI visibility is important — and in 2026, that is increasingly every business — this architectural difference matters. You can read more about the cost and performance comparison between WordPress and Astro.

Swiss hosting comparison for speed

Where your website is hosted directly impacts how fast AI crawlers can access it. Here is how common Swiss hosting options compare:

Shared hosting (Infomaniak, Hostpoint, cyon)

Swiss shared hosting providers offer solid service, but shared environments mean your server resources are split among hundreds of tenants. Typical TTFB: 300-1,500ms depending on server load and time of day.

Pros: Data in Switzerland, CHF billing, local support
Cons: Inconsistent performance, slow TTFB, limited control

Managed WordPress hosting (Hostpoint Managed, Raidboxes)

Managed WordPress hosts optimize the server environment for WordPress specifically. Typical TTFB: 200-600ms.

Pros: Better than shared hosting, WordPress-specific optimizations
Cons: Still limited by WordPress architecture, higher cost

CDN-based static hosting (Vercel, Netlify, Cloudflare Pages)

Static hosting serves pre-built HTML files from a global network of edge servers. The closest server to the crawler (or visitor) responds, typically within 20-100ms.

Pros: Fastest possible TTFB, global availability, automatic scaling, generous free tiers
Cons: Requires static site generator (like Astro), data stored outside Switzerland

Swiss CDN with static hosting (Cloudflare with Zurich PoP)

Cloudflare has a Point of Presence in Zurich, meaning static files can be served from within Switzerland. Combined with a static site generator like Astro, this delivers Swiss-local performance with global CDN benefits.

For most Swiss SMEs, CDN-based static hosting offers the best combination of speed, reliability, and cost. The data residency concern is valid for sensitive applications, but for a marketing website, the performance advantage is decisive — especially for AI crawler accessibility.

Step-by-step optimization guide

If your website is currently slow and underperforming with AI crawlers, here is a prioritized action plan:

Phase 1: Quick wins (1-2 days)

  1. Compress and convert images — Use Squoosh to convert all images to WebP format. This alone can cut page weight by 50-80%.
  2. Self-host your fonts — Download your Google Fonts and serve them from your own domain. This eliminates 2-4 render-blocking external requests.
  3. Defer non-critical JavaScript — Add defer or async attributes to script tags that are not needed for initial rendering.
  4. Enable server-side caching — If you are on WordPress, install a caching plugin that serves static HTML to repeat visitors and crawlers.
  5. Check robots.txt — Ensure you are not accidentally blocking AI crawler user agents.

Phase 2: Structural improvements (1-2 weeks)

  1. Implement structured data — Add Schema.org markup for your business type (LocalBusiness, ProfessionalService, Restaurant, etc.), services, and contact information.
  2. Fix your heading hierarchy — Ensure every page has exactly one h1, followed by logical h2 and h3 structure that outlines the content.
  3. Add an FAQ section to key pages — AI systems love well-structured question-and-answer content.
  4. Remove unused plugins and scripts — Audit every JavaScript file and CSS stylesheet loaded on your pages. Remove anything that is not essential.
  5. Upgrade your hosting — If your TTFB is consistently above 500ms, switch to a faster hosting solution.
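The heading-hierarchy rule from step 2 can be checked automatically. A minimal sketch using Python's standard library; the sample page markup is invented:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Record heading tags (h1-h6) in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self.headings.append(tag)

def audit_headings(html: str) -> list:
    """Return a list of hierarchy problems (empty list means OK)."""
    parser = HeadingAudit()
    parser.feed(html)
    problems = []
    h1_count = parser.headings.count("h1")
    if h1_count != 1:
        problems.append(f"expected exactly one h1, found {h1_count}")
    for prev, cur in zip(parser.headings, parser.headings[1:]):
        if int(cur[1]) - int(prev[1]) > 1:  # e.g. h2 jumping straight to h4
            problems.append(f"level skip: {prev} -> {cur}")
    return problems

page = "<h1>Services</h1><h2>Consulting</h2><h4>Pricing</h4>"
print(audit_headings(page))  # → ['level skip: h2 -> h4']
```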

Phase 3: Strategic rebuild (4-8 weeks)

If phases 1 and 2 do not bring your Lighthouse score above 90 and your TTFB below 200ms, it may be time for a full redesign. A modern static site built with Astro.js starts with performance as the foundation rather than trying to bolt it on afterward.

This is the approach we take at 0gravity. Every website we build achieves Lighthouse scores of 95-100 and is optimized for both traditional search engines and AI assistants from day one. Learn more about our web design services or check our pricing.

What businesses can do now

If you want your website to be found and recommended by AI agents, you need to work on five levers:

1. Optimize performance

Reduce load time to under 1 second. That means: static HTML, optimized images (WebP/AVIF), minimized CSS, no render-blocking resources. Measure with Google Lighthouse and aim for a score of 95+.

2. Implement structured data

Add Schema.org markup: Organization, LocalBusiness, Service, FAQ, Article. This structured data helps AI systems correctly categorize your business — location, services, pricing, opening hours. For a detailed guide on making your website AI-discoverable, read our article on AI-optimized websites.
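As a concrete sketch, LocalBusiness markup can be generated as JSON-LD. The business details below are fictional placeholders; swap in your own data before embedding the result in a script tag of type application/ld+json in your page head.

```python
import json

# Fictional business details for illustration only.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example IT Consulting",
    "url": "https://www.example.ch",
    "telephone": "+41 44 000 00 00",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "Musterstrasse 1",
        "addressLocality": "Zürich",
        "postalCode": "8001",
        "addressCountry": "CH",
    },
    "openingHours": "Mo-Fr 08:00-18:00",
}

snippet = json.dumps(local_business, ensure_ascii=False, indent=2)
print(snippet)
```

Validate the output with Google's Rich Results Test before deploying, as suggested in step 5 of the testing checklist.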

3. Use semantic HTML

Use correct heading hierarchies (h1 through h6), descriptive alt texts for images, nav elements for navigation, article tags for content. Write for humans, but structure for machines.
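Put together, a minimal sketch of that structure might look like this (the content and file paths are placeholders):

```html
<body>
  <nav aria-label="Main">…</nav>
  <article>
    <h1>IT Consulting in Zurich</h1>
    <h2>Our Services</h2>
    <img src="/img/team.webp" alt="Consulting team in the Zurich office">
    <h2>Frequently Asked Questions</h2>
  </article>
</body>
```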

4. Create content that answers questions

AI assistants respond to questions. If your website content is structured as clear answers to common questions in your industry, AI systems are far more likely to cite you. Think about what potential customers ask:

  • “What does [your service] cost in Switzerland?”
  • “How do I choose a [your service] provider?”
  • “What should I look for when [relevant activity]?”

Create dedicated pages or sections that answer these questions thoroughly. An FAQ section with Schema.org FAQPage markup is particularly effective. Read more about how to get recommended by ChatGPT.

5. Build an llms.txt file

The emerging llms.txt standard provides a dedicated file that tells AI systems about your website’s structure, key pages, and content focus. It is like robots.txt, but designed specifically for language models. Adding an llms.txt file to your site gives AI crawlers a clear roadmap of your most important content.
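A minimal llms.txt might look like this (llms.txt is still an emerging draft convention: a Markdown file served at the site root; the company name and URLs here are placeholders):

```markdown
# Example IT Consulting

> Zurich-based IT consulting for Swiss SMEs: cloud migration, security audits, managed services.

## Key pages

- [Services](https://www.example.ch/services): What we offer and how engagements work
- [Pricing](https://www.example.ch/pricing): Transparent CHF pricing
- [FAQ](https://www.example.ch/faq): Answers to common client questions
```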

The future is hybrid

The distinction between “SEO” and “AI optimization” will disappear. Websites optimized for search engines today will also be preferred by AI agents tomorrow — provided they are fast, cleanly structured, and content-rich. You can read more about how these disciplines relate in our guide to GEO vs LLMO vs SEO.

The critical difference: with Google, you compete for positions 1-10. With ChatGPT, there’s often just one recommendation. If you’re not the fastest, most relevant source, you won’t be mentioned.

Swiss businesses that invest in performance and structure now will secure an advantage that grows exponentially in the coming years. The more people use AI assistants, the more important it becomes to be recognized by these systems as a trustworthy source.

The cost of inaction

Every month you wait, the gap widens. Your competitors who invest in performance and AI optimization now are building a compounding advantage:

  • Their content enters AI training data and knowledge bases
  • Their structured data makes them the default recommendation
  • Their fast, accessible pages get crawled more frequently, building a richer AI profile
  • Meanwhile, your slow, JavaScript-dependent pages remain invisible to these systems

The investment in a fast, AI-ready website is modest compared to the cost of being permanently invisible to a channel that is growing by double digits every quarter. A performance-optimized website built on modern technology costs less than you might think — and the ROI from increased AI visibility alone can pay for the investment within months.

The question isn’t whether AI will change how customers find you. The question is whether your website is ready for it.


Frequently Asked Questions (FAQ)

How quickly can I improve my website’s AI visibility?

Technical improvements like image compression, font self-hosting, and caching can be done within days and have immediate effects on crawl quality. Structural changes like implementing Schema.org markup and fixing heading hierarchies take 1-2 weeks. A full rebuild on a fast framework like Astro typically takes 4-8 weeks. AI systems re-crawl frequently, so improvements are usually reflected within weeks.

Does website speed affect all AI assistants equally?

Yes, broadly speaking. ChatGPT (via GPTBot), Claude (via ClaudeBot), Perplexity (via PerplexityBot), and Google’s AI Overviews all prefer fast-loading pages with clean HTML. The specific crawl behavior varies — some are more tolerant of moderate delays — but the principle is universal: faster pages get crawled more completely and more frequently.

Can I block some AI crawlers but allow others?

Yes, you can use your robots.txt file to selectively allow or block specific AI crawler user agents. However, blocking crawlers means those AI systems will not recommend your business. Unless you have a specific reason (such as content licensing concerns), it is generally better to allow all major AI crawlers access to your marketing website.

My website is on WordPress. Should I switch to Astro?

It depends on your priorities. If AI visibility and performance are critical to your business growth, a migration to Astro.js delivers significantly better results with lower ongoing maintenance. If you have a complex WordPress setup with e-commerce, membership areas, or frequent content updates by non-technical staff, a hybrid approach may be more practical. Contact us for an honest assessment of your situation.

What Lighthouse score do I need for AI crawlers to find my site?

There is no hard threshold, but data suggests that sites scoring above 90 on mobile are crawled more reliably and completely by AI systems. Sites scoring 70-89 may be partially indexed. Sites below 70 are frequently skipped or only superficially crawled. Aim for 95+ to be safe.

Is AI visibility relevant for local Swiss businesses?

Absolutely. Local queries like “best bakery in Basel” or “reliable plumber in Winterthur” are exactly the type of questions people ask AI assistants. For local businesses, the competition in AI recommendations is still relatively low, which means investing now gives you a disproportionate advantage. Combine a fast website with complete Google Business Profile data and consistent NAP (Name, Address, Phone) information across all platforms.

0gravity

Swiss web agency building ultra-fast, AI-optimized websites. We build websites that bring customers — not just look good.

Ready for a faster website?

Free website check: We test your current website and show you what's possible.

Or directly: hello@0gravity.ch

Book a call