AI crawler logs and AI referral traffic answer different questions. HitKeep keeps them separate so SEO teams do not mistake crawler activity for demand.
Browser tracking can see many AI-referred human visits. It cannot reliably see AI crawlers because those requests usually do not run JavaScript.
The short version
| Signal | What it means | How HitKeep gets it |
|---|---|---|
| AI crawler fetch | An AI system requested a page or asset | Server-side AI fetch ingest |
| AI-referred visit | A human arrived from an AI assistant referrer | Browser tracking with hk.js |
| Correlation | A fetched path later received AI-referred visits | AI Visibility correlation report |
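The split in the table can be sketched in code. This is not HitKeep's actual detection logic; it is a minimal illustration, and the bot names and referrer hosts listed are assumptions for the example.

```python
# Illustrative only: classify a request as an AI crawler fetch
# (server-side signal) or an AI-referred human visit (browser-side
# signal). The agent and referrer lists are assumptions, not
# HitKeep's real rules.
AI_CRAWLER_AGENTS = ("GPTBot", "ClaudeBot", "PerplexityBot")
AI_REFERRER_HOSTS = ("chatgpt.com", "perplexity.ai", "gemini.google.com")

def classify(user_agent: str, referrer: str) -> str:
    # Crawler fetches are visible in server logs even though the
    # request never runs JavaScript.
    if any(bot in user_agent for bot in AI_CRAWLER_AGENTS):
        return "ai_crawler_fetch"
    # Referred human visits are visible to browser tracking (hk.js)
    # via the document referrer.
    if any(host in referrer for host in AI_REFERRER_HOSTS):
        return "ai_referred_visit"
    return "other"

print(classify("Mozilla/5.0 (compatible; GPTBot/1.1)", ""))
print(classify("Mozilla/5.0", "https://chatgpt.com/"))
```

The same request can never be both: a crawler fetch has no human referrer, and a referred visit comes from a real browser.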
Why crawler fetches are not traffic
GPTBot fetching /pricing does not mean a buyer visited /pricing. It means an AI system requested the page.
Repeated fetches can show which pages AI systems inspect, where they hit errors, and which sections of a site are visible to them. But fetches should not be counted as sessions, users, or conversions.
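Treated this way, fetch records are an inventory of crawler attention, not traffic. A minimal sketch of that aggregation, with hypothetical (path, status) records standing in for a server-side ingest:

```python
from collections import Counter

# Hypothetical fetch records (path, HTTP status) as a server-side
# ingest might record them; the data is illustrative.
fetches = [
    ("/pricing", 200), ("/pricing", 200),
    ("/docs", 404), ("/docs", 200),
]

# Count fetch interest and crawler-visible errors per path --
# never sessions, users, or conversions.
fetch_counts = Counter(path for path, _ in fetches)
error_counts = Counter(path for path, status in fetches if status >= 400)

for path, n in fetch_counts.items():
    print(path, n, "fetches,", error_counts[path], "errors")
```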
Why AI referrals are incomplete by themselves
AI-referred visits show human demand, but they miss the discovery layer. If ChatGPT sends five visits to a page, you may still not know whether AI crawlers are also fetching related pages, hitting errors, or ignoring important content.
- Server-side fetch records show crawler visibility.
- Browser pageviews show AI-referred human visits.
- Correlation shows where the two overlap.
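The overlap in the third bullet is just a set comparison of paths. A sketch under the assumption that both signals reduce to path lists (HitKeep's correlation report may work differently):

```python
# Illustrative path sets; in practice these would come from the
# server-side fetch ingest and the hk.js pageview stream.
fetched_paths = {"/pricing", "/docs", "/changelog"}
referred_paths = {"/pricing", "/blog/launch"}

overlap = fetched_paths & referred_paths        # crawler visibility AND human demand
fetched_only = fetched_paths - referred_paths   # visibility without demand yet
referred_only = referred_paths - fetched_paths  # demand without an observed fetch

print("overlap:", sorted(overlap))
print("fetched only:", sorted(fetched_only))
print("referred only:", sorted(referred_only))
```

Each bucket prompts a different question: overlap suggests the discovery layer is working, fetched-only pages may be inspected but not cited, and referred-only pages draw demand through content the crawlers were not observed fetching.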
How agencies can report this
A simple three-part frame gives the client a practical next step without claiming deterministic attribution:
- Discovery: which AI systems fetched the site and which paths they requested.
- Demand: which pages received AI-referred human visits.
- Work list: pages with high fetch interest, low referrals, or crawler errors.
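The work-list criteria can be expressed as a filter over per-page stats. The thresholds and field names below are illustrative assumptions, not HitKeep output:

```python
# Hypothetical per-page stats joining the two signals; numbers and
# thresholds are made up for illustration.
pages = {
    "/pricing":  {"fetches": 40, "referrals": 12, "errors": 0},
    "/docs":     {"fetches": 35, "referrals": 1,  "errors": 0},
    "/old-page": {"fetches": 8,  "referrals": 0,  "errors": 8},
}

# Flag pages with crawler errors, or with high fetch interest
# but few AI-referred visits (assumed cutoffs: >= 10 fetches,
# <= 2 referrals).
work_list = sorted(
    path for path, s in pages.items()
    if s["errors"] > 0 or (s["fetches"] >= 10 and s["referrals"] <= 2)
)

print(work_list)
```

Here `/docs` is flagged for fetch interest without referrals and `/old-page` for crawler errors, while `/pricing` already converts discovery into demand.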