Illuminating the Black Box of AI Traffic
Sitechecker | SEO Analytics | B2B SaaS | GA4 AI traffic | Shipped 2025
THE STORY
Making the invisible visible
Artificial intelligence is reshaping search, but traditional analytics tools treat it as a footnote. SEO professionals knew platforms like ChatGPT and Perplexity were sending visitors, but they had no way to measure or act on that data. Competitors either ignored this traffic entirely or buried it in generic referral buckets. We designed a dedicated reporting module to turn this industry-wide blind spot into a competitive advantage for our users.
Product
Web | B2B SaaS
TIMELINE
2025
MY ROLE
As Product Designer, I owned the end-to-end creation of this analytics module from zero. I conducted the initial competitive research, mapped the highly complex data aggregation logic, defined the information architecture across three report tabs, and delivered a production-ready interface integrated directly with Google Analytics.
SKILLS
Product Strategy, Information Architecture, Data Visualization, Technical Logic Mapping, System Design, Stakeholder Alignment
IMPACT
Launched a dedicated report that gives SEO professionals exact visibility into LLM-driven traffic. Over 2,300 users adopted the feature within eight weeks of release, and sales teams actively use it as a deciding factor for subscription upgrades.
THE PROBLEM
The five percent guess
Standard analytics platforms saw AI traffic arriving but stubbornly refused to give it a dedicated channel. I ran a parallel research process to understand exactly how SEO professionals were patching these structural blind spots and where the existing market tools fell short.
Users were building custom dashboards because their native tools failed them
Digging through LinkedIn discussions and SEO Reddit threads revealed a consistent pattern. I used AI tools to scrape and synthesize hundreds of posts into structured insights. Marketers were exhausted by analytical blind spots: they needed to know which specific pages ChatGPT actually recommended and how that volume compared to their standard organic traffic. Native analytics tools technically captured these sessions, but they forced users to write complex regular expressions just to rescue the data from generic referral buckets. To get a readable interface, teams had to connect their properties to external Looker Studio templates or pay for heavy third-party SEO suites. The default ecosystem treated AI traffic as a tracking anomaly rather than a primary acquisition channel.

The friction of fragmented tools
Native analytics platforms completely lack a dedicated AI channel. They force marketers to manually write complex regular expressions just to rescue ChatGPT or Perplexity visits from generic referral buckets. A few third-party templates exist to automate this, but they pull users entirely out of their native workflow into rigid Looker Studio files that break the moment they need custom event tracking. Heavy SEO platforms try to bolt AI metrics onto their existing suites, but they merely inherit the limitations of the underlying Google property. Users faced a miserable choice between building fragile manual workarounds or adopting entirely new tracking software just to answer basic questions about their traffic.
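The manual workaround described above looks roughly like this. A minimal Python sketch, assuming a hand-maintained pattern of AI assistant referral domains; the source list is illustrative, not exhaustive:

```python
import re

# Hypothetical regex of the kind marketers hand-wrote to pull AI referrals
# out of a generic referral bucket. Domain names are illustrative examples.
AI_REFERRAL_PATTERN = re.compile(
    r"(chat\.openai\.com|chatgpt\.com|perplexity\.ai|"
    r"gemini\.google\.com|copilot\.microsoft\.com)",
    re.IGNORECASE,
)

def is_ai_referral(session_source: str) -> bool:
    """Return True if a session source looks like an AI assistant referral."""
    return bool(AI_REFERRAL_PATTERN.search(session_source))

sources = ["chat.openai.com", "google", "perplexity.ai", "bing.com"]
ai_sources = [s for s in sources if is_ai_referral(s)]
# ai_sources → ["chat.openai.com", "perplexity.ai"]
```

The fragility is visible in the sketch itself: every new assistant means another hand-edited alternation, which is exactly the maintenance burden a dedicated channel removes.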

THE HYPOTHESIS
Context beats raw numbers

THE EXECUTION
Structuring Data for Human Decisions
INFORMATION ARCHITECTURE
Three levels of depth
Throwing all available data onto a single screen creates analysis paralysis. Progressive disclosure respects the user's cognitive limits. It gives them a clear starting point and invites them to dig deeper only when they have formed a specific question.
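The three-tab hierarchy can be modeled as increasingly granular views, each answering a narrower question. The tab names, questions, and granularity labels below are hypothetical stand-ins, not the shipped copy:

```python
# Hypothetical model of the three report tabs, ordered broad to deep.
# Labels are illustrative; the shipped interface may name them differently.
REPORT_TABS = [
    {
        "tab": "Overview",
        "question": "How much traffic do AI assistants send overall?",
        "granularity": "totals and trend lines",
    },
    {
        "tab": "Sources",
        "question": "Which assistants send that traffic?",
        "granularity": "per-source breakdown",
    },
    {
        "tab": "Pages",
        "question": "Which exact URLs do assistants recommend?",
        "granularity": "per-URL table with filters",
    },
]

# A casual user stops at the first tab; a power user drills to the last.
deepest_view = REPORT_TABS[-1]["tab"]
```

Ordering the tabs this way means each click deeper corresponds to a more specific question the user has already formed, rather than forcing every question onto one screen.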
DATA VISUALIZATION
Designing for comparison


Scrolling through a long table of URLs usually means losing sight of the global filters applied to that data. Freezing the filter bar preserves spatial context. Users always know exactly what segment they are looking at, eliminating the frustrating loop of scrolling up and down just to verify the date range.
SYSTEM LOGIC
Designing around API constraints

Waiting for a loading spinner every time you apply a filter breaks the analytical flow. Fetching everything upfront ensures that data exploration happens at the speed of thought. Users can slice the data instantly, keeping their focus entirely on finding insights rather than waiting for the system to catch up.
OUTCOMES
Measuring impact
Rapid adoption
Over 2,300 users engaged with the report within the first eight weeks of release.
Sustained retention
Approximately 12 percent of active users return to the report weekly. They have successfully integrated AI traffic analysis into their regular workflow.
Post-launch validation
Session recordings revealed users wanted saved filter combinations for daily checks. We prioritized this exact functionality for the immediate next iteration.
Driving business value
Sales managers adopted the feature as a primary proof point during software demos. Multiple users cited AI traffic visibility as their deciding factor for subscription upgrades.
WHAT I LEARNED
Rethinking Best Practices
Research the Gap Rather Than the Feature: Competitors had access to the exact same Google Analytics data, but they failed to present it usefully. The real product opportunity was not inventing new data but building an interface that actually answered the questions SEO professionals were asking.
Constraints Shape Better Architecture: The strict API limits forced a much smarter data fetching strategy. Designing the interface to rely on cached data rather than continuous server requests resulted in a significantly faster and more responsive user experience.
Progressive Disclosure Scales Complexity: Dividing the report into three distinct tabs of increasing depth allows power users to drill down into specific URL performance while keeping the initial view clean and approachable for casual users.

