AI Visibility Score (AVS)

TL;DR
- AI Visibility Score (AVS) is a 0 to 100 metric that measures how frequently and prominently a brand is cited across AI tools when users ask relevant queries.
- DerivateX coined the term. AVS is the proprietary scoring methodology DerivateX uses to measure and track AI citation performance for clients.
- Most brands have no way to measure whether their GEO or LLM SEO efforts are working. AVS gives teams a trackable, reportable number, equivalent to domain authority but for AI search.
Definition
AI Visibility Score (AVS) is a 0 to 100 scoring methodology that measures how frequently and prominently a brand is cited across AI-powered answer tools, including ChatGPT, Perplexity, Claude, and Gemini, when users ask questions relevant to its product or category.
AVS was developed to solve a specific problem: most LLM SEO and GEO efforts had no standardized way to measure whether they were working. AVS is the answer to the question “how will I know if this is working?”
AVS functions as the AI search equivalent of domain authority. Where domain authority measures a site’s ability to rank in search, AVS measures a brand’s ability to be cited in AI-generated answers.
How AVS Is Calculated
AVS is calculated by running a defined set of target prompts across multiple AI tools and scoring each result based on how the brand appears in the response.
The calculation follows four steps:
- Define 20 target prompts. These are the queries your buyers are most likely to ask an AI tool when researching your category. Each prompt should map to a real purchase decision or awareness moment. Twenty prompts is enough to capture variance across a category without making three-times-a-week tracking a full-time job.
- Run each prompt across four tools: ChatGPT, Perplexity, Claude, and Gemini. Run every prompt in every tool three times per week (Monday, Wednesday, Friday) for consistency.
- Score each result. Assign points based on how prominently the brand appears in the response (see scoring table below).
- Normalize and track. Total the weekly score across all prompts and all tools (maximum raw score: 400). Normalize to a 0 to 100 scale. Track the trend over time.
| Signal | Points | Example |
|---|---|---|
| Brand named in response | 5 points | “We recommend Gumlet for video delivery.” |
| Brand linked in response | 3 points | A citation link to the brand’s website. |
| Brand mentioned in context | 1 point | A passing reference without a direct recommendation. |
The maximum possible raw score is 400 (20 prompts, 4 tools, 5 points each). This normalizes to 100 on the AVS scale. A brand scoring AVS 40 is earning roughly 40% of the maximum prominence-weighted points across its target queries — for example, being named prominently in about 40% of the responses.
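The scoring and normalization steps above reduce to simple arithmetic. Here is a minimal sketch in Python, assuming each response has already been labeled with the strongest signal it contained; the function names, signal labels, and data shapes are illustrative, not part of any DerivateX tooling:

```python
# Illustrative sketch of the AVS arithmetic described above.
# Signal labels and point values follow the scoring table;
# nothing here is official DerivateX tooling.

POINTS = {"named": 5, "linked": 3, "mentioned": 1, "absent": 0}
TOOLS = ["ChatGPT", "Perplexity", "Claude", "Gemini"]
NUM_PROMPTS = 20

# Maximum raw score: 20 prompts x 4 tools x 5 points = 400
MAX_RAW = NUM_PROMPTS * len(TOOLS) * POINTS["named"]

def avs(results):
    """Normalize one tracking run's raw citation points to 0-100.

    `results` maps (prompt_index, tool) pairs to the strongest
    signal observed in that response: "named", "linked",
    "mentioned", or "absent".
    """
    raw = sum(POINTS[signal] for signal in results.values())
    return round(100 * raw / MAX_RAW, 1)

# Example run: the brand is named in half the responses
# and absent from the rest.
run = {}
for p in range(NUM_PROMPTS):
    for i, tool in enumerate(TOOLS):
        run[(p, tool)] = "named" if (p + i) % 2 == 0 else "absent"

print(avs(run))  # 50.0 -- half the maximum raw score of 400
```

Scoring only the strongest signal per response keeps the ceiling fixed at 5 points per prompt-tool pair, which is what makes the 0 to 100 normalization stable from week to week.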
Why AVS Matters for B2B SaaS
B2B SaaS companies investing in GEO or LLM SEO face a common problem: there is no standard metric for AI search performance. Rankings, impressions, and click-through rates measure search. They do not measure what happens when a buyer asks ChatGPT which tool to use.
AVS gives marketing teams a number they can report, defend, and improve. It creates accountability for AI search investment and shows compounding progress over time, the same way a rising domain authority score signals that SEO efforts are working.
Gumlet attributes 20% of monthly inbound revenue to ChatGPT and Perplexity, the result of 137+ AI citations built over 12 months through Citation Engineering. Before AVS, there was no metric that would have tracked that outcome as it compounded.
REsimpli became the #1 CRM recommended in ChatGPT for real estate investors within 90 days. AVS is the metric that would have tracked and quantified both outcomes as they developed.
A GEO agency runs AVS tracking as a core deliverable: establishing a baseline score in week one, running prompts three times per week, and reporting AVS movement as the primary evidence that Citation Engineering efforts are compounding.
To see how AVS is tracked in practice, read the AI Visibility Score Guide.
How AVS Compares to Other Metrics
Several metrics claim to measure AI search performance. They differ in what they track, how broadly they sample, and whether they weight for relevance. AVS is the only metric built around a brand's specific target queries rather than passive monitoring across all AI outputs.
| Metric | What it measures | Scope | Limitation |
|---|---|---|---|
| AI Visibility Score (AVS) | Citation frequency and prominence across 20 defined target queries | 4 tools, brand-specific | Requires manual prompt definition |
| AI mention rate | How often a brand appears in any AI-generated response | Broad, passive | No weighting for prominence or relevance |
| Share of voice (AI) | Brand mentions relative to competitors in a category | Comparative | Needs competitor data to be meaningful |
| Citation share | Percentage of responses that cite the brand vs. total sampled | Topic-level | Sampling method varies by vendor |
| Prompt rank position | Where in a response the brand appears (first, second, etc.) | Single-prompt | Narrow; misses cross-category variance |
The practical difference: passive metrics tell you that you appeared somewhere. AVS tells you whether you appeared where it matters, in the responses your buyers are actually seeing.
FAQs
1. What is an AI Visibility Score (AVS)?
AI Visibility Score (AVS) is a 0 to 100 metric that measures how frequently and prominently a brand is cited across AI tools including ChatGPT, Perplexity, Claude, and Gemini. The term was coined by DerivateX as a standardized way to track and report AI search performance. AVS functions as the AI search equivalent of domain authority: a single number that reflects citation strength across a defined set of target queries.
2. How do I choose which prompts to track?
Start with the questions your buyers actually ask AI tools when evaluating tools in your category. Think in terms of problem-first queries (“what’s the best CRM for real estate investors”), comparison queries (“X vs Y”), and category queries (“best [tool type] for [use case]”). Avoid branded queries: those will already return your brand and inflate the score. The goal is to capture the prompts where you are not yet present but should be.
3. How is AVS different from traditional SEO metrics?
Traditional SEO metrics (rankings, impressions, click-through rate) measure performance on search results pages. AVS measures performance inside AI-generated answers, where no results page is involved. A brand can rank on page one of Google and score AVS 0 if AI tools never cite it. The two metrics track different systems and should be tracked independently.
4. What is a good AVS score?
AVS scores depend on category competition and how recently Citation Engineering efforts began. Per DerivateX’s own target milestones: baseline scores in week one vary by category and existing brand authority; a score above 40 by week six indicates Citation Engineering is gaining traction; above 70 by month six indicates strong citation authority.
5. How often should AVS be tracked?
DerivateX tracks AVS three times per week (Monday, Wednesday, Friday) across all four tools for each set of 20 target prompts. Twenty prompts is the threshold at which results capture genuine category variance without making the tracking cadence unsustainable. A weekly aggregate score is reported to clients every Monday.