
Understanding Filters in Scrunch

Learn how to use filters in Scrunch to narrow your data by platform, persona, stage, topic, geography, and more.


Overview

Filters in Scrunch let you focus on specific subsets of data without losing context.
They apply instantly across charts, tables, and visualizations, so you can move from broad visibility metrics to detailed insights about prompts, personas, or sources.


Available Filters

| Filter | Description | Example Use |
| --- | --- | --- |
| Brand Mentioned in Prompt | Filters prompts that include your brand name (branded) or exclude it (non-branded). | Measure organic visibility in non-branded AI queries where users don’t explicitly mention your brand. |
| Prompt Tags | Groups prompts by custom tags created for campaigns, content themes, or initiatives. | View prompts labeled “Product Launch,” “SEO Keywords,” or other campaign-specific tags. |
| Persona | Focuses data on prompts tied to a specific customer persona. | Compare how results differ between audiences such as “Enterprise Buyer” and “Consumer.” |
| Country / Geo | Limits data to a specific country, based on persona or prompt location. | Compare AI visibility between the U.S. and U.K. or analyze regional trends. |
| Prompt Topic | Filters prompts by Key Topic defined in the Context tab. | Drill into results for strategic themes like “Cloud Security,” “AI Strategy,” or “Financial Services.” |
| AI Platform | Selects which AI system to view results for (e.g., ChatGPT, Perplexity, Google AI Overviews, Gemini, Meta AI, Claude). | Focus on brand presence and citation data specific to ChatGPT, Gemini, or Perplexity. |
| Stage | Segments data by customer journey stage: Awareness, Comparison, Evaluation, Advice, or Other. | Identify how your brand performs across different funnel stages, especially Awareness prompts. |
| Brand Present in Response | Shows only prompts or citations where your brand appears in at least one AI response. | Evaluate your performance on prompts where your brand is already cited or discussed. |
| Competitor Present | Shows only prompts or citations where one or more competitors are mentioned in an AI response. | Benchmark your visibility or sentiment against selected competitors. |

Citations Tab Only

| Filter | Description | Example Use |
| --- | --- | --- |
| Citation Owner | Distinguishes whether a citation belongs to your Brand, a Competitor, or a Third-Party site. | Identify which third-party domains influence AI responses most. |
| Prompt Topic | Filters to prompts grouped by Key Topic defined in the Context tab. | Drill into results for topics like “Cloud Security” or “AI Strategy.” |


Best Practices

  • Start broad, then narrow. Begin with all filters cleared, then layer criteria to focus on specific audiences or topics.

  • Combine filters for insight. Try pairing Non-Branded + Awareness Stage to identify early-journey visibility gaps (see the sketch below).

  • Cross-reference views. The same filters apply across all major tabs, allowing you to connect prompt-level insights to citation-level evidence.

  • Filter citations by platform. Each AI platform favors different citation sources and content types. Filtering by platform helps you zero in on the citations that matter most for the models your audience is actually using.

Note: By default, the Dashboard displays non-branded prompts only. This is intentional: focusing on non-branded prompts provides a clearer picture of how often AI platforms recommend your brand organically, when users don’t specifically ask for it.
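
Conceptually, layering filters works like applying AND conditions across prompt attributes. The sketch below illustrates the Non-Branded + Awareness pairing suggested in the best practices above; it is a hypothetical Python example, and the field names and sample records are invented for illustration rather than taken from Scrunch’s data model or API.

```python
# Hypothetical illustration of layered filtering (not Scrunch's API or data model).
# Each record stands in for one tracked prompt and its attributes.
prompts = [
    {"text": "best cloud security tools", "brand_in_prompt": False, "stage": "Awareness"},
    {"text": "Acme vs. competitor pricing", "brand_in_prompt": True, "stage": "Comparison"},
    {"text": "how to evaluate AI vendors", "brand_in_prompt": False, "stage": "Evaluation"},
]

# "Non-Branded + Awareness Stage": keep prompts that never mention the brand
# and sit at the top of the funnel; these are early-journey visibility gaps.
early_journey = [
    p for p in prompts
    if not p["brand_in_prompt"] and p["stage"] == "Awareness"
]

for p in early_journey:
    print(p["text"])  # prints: best cloud security tools
```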


Organizing Your Prompts Tab

In addition to filters, Scrunch lets you organize prompts in multiple ways to analyze coverage and performance across themes, campaigns, and AI platforms.

| Organization Type | Description | Purpose / Example Use |
| --- | --- | --- |
| Topic | Groups prompts under the high-level focus areas defined in your Context tab. | See how your brand performs across key themes like “AI Strategy” or “Cloud Security.” |
| Tag | Custom labels for campaigns, initiatives, or content categories. | View prompts tagged “Product Launch,” “Q4 Campaign,” or “Competitor Watch.” |
| Stage | Groups prompts by customer journey stage (Awareness, Comparison, Evaluation, Advice, or Other). | Analyze performance by where prompts fit within the customer journey, from early Awareness to specific comparisons. |
| Seed Prompt | The original question (prompt) used to generate multiple prompt variants. | Review all related prompts to understand performance across selected AI platforms. |
| Prompt Variant | The platform-specific version of a seed prompt. | Compare how each AI model responds or cites sources for the same query. |

Pricing Note: Scrunch counts prompts based on active Prompt Variants, not Seed Prompts.
Each seed prompt is multiplied by the number of AI platforms it runs on (for example, 20 seed prompts across 5 platforms = 100 prompt variants counted toward your subscription limit).
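
As a quick check on the arithmetic, the sketch below shows how the variant count scales with the number of platforms; the seed count and platform list are illustrative examples, not plan defaults.

```python
# Illustrative arithmetic for the pricing note above; the numbers are examples only.
seed_prompts = 20
platforms = ["ChatGPT", "Perplexity", "Google AI Overviews", "Gemini", "Meta AI"]

# Each seed prompt runs once per platform, producing one Prompt Variant per platform.
prompt_variants = seed_prompts * len(platforms)

print(prompt_variants)  # 20 seeds x 5 platforms = 100 variants counted toward the limit
```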
