In my previous article, I found that AI visibility tends to favour brands that already have strong domain authority. That means brands that have ranked highly and been trusted by Google all along don't need to sweat as much as newer brands that are still trying to build a web presence.

So how do new businesses compete now that AI has opened another avenue for getting leads? Out of curiosity, I went looking for real businesses that have increased their AI citations dramatically within a few months.

What surprised me was how much common ground there was across three very different businesses in three very different categories. Each one took a different approach, but the underlying logic was consistent. More importantly, many of their tactics can be followed regardless of size or domain authority.

Let's start with the first case.

Case Study #1: AIclicks (SaaS)

There is a lot to take away from the AIclicks case study, starting with the fact that they treated themselves as their first client. It makes sense: demonstrating AI visibility for their own software is the most credible proof that they can deliver the same results for clients.

AIclicks reached an 11.65% share of voice in three months not because of one clever move, but because it executed consistently across all three layers at the same time.

Description:

AIclicks is an AI visibility and GEO platform that helps brands track, analyze, and improve how they appear inside AI-generated answers across ChatGPT, Gemini, Perplexity, and other LLMs. It combines prompt-level tracking, share-of-voice analytics, competitor benchmarking, and execution insights into one system.

The challenge

AIclicks was competing in a crowded space. It needed to build trust and authority quickly in order to rank and gain visibility for its own AI visibility tool.

What moved the needle for AIclicks

1. Data-Driven Content Engineering

AIclicks did not guess what content to write. Instead, it reverse engineered the problem by tracking the title patterns and exact URLs that AI models were already citing.

They analyzed the sources cited for bottom-of-the-funnel queries like “AI SEO tools“ and “AI visibility software“, and how often each source appeared.

AIclicks also tracked high-intent AI prompts using their own tool and published articles based on real LLM data.

They also built not just one but several landing pages targeted at specific AI research intents, such as “Perplexity brand visibility rank tracker“ and “ChatGPT SEO visibility rank tracker“, to match the queries users type directly into AI assistants.

One of the landing pages catering to AI search intent

2. Radical Credibility & E-E-A-T

Be it SEO or AI optimization work, one thing that cannot be ignored is prioritizing trust signals.

What AIclicks did was build a clean website with proper schema markup, replace generic admin authors with real founder profiles, and add social media links. They also built a presence on G2 early, because its structured data is highly citable by LLMs for “best of“ comparisons.

This one is slightly more technical and optional. Knowing that ChatGPT's own domain already carries significant domain authority, AIclicks created a custom GPT called “Best GPT Rank Tracker“ that references AIclicks in its outputs. Note that building custom GPTs requires a paid ChatGPT plan.

3. Source-First Distribution (The Flywheel Effect)

Reddit has gained so much prominence that it is now among the websites most frequently cited by LLMs. As such, AIclicks engaged in over 160 Reddit threads, resulting in over 3,000 LLM mentions traced back to Reddit pages where AIclicks appears. They expanded their brand presence by redistributing their content on Medium and LinkedIn as well.

TLDR: What AIclicks Did:

  • Identified high-intent prompts like “AI SEO tools“ and “ChatGPT SEO“, and published GEO-ready articles

  • Built a clean, LLM-friendly website with proper schema markup and real founder profiles

  • Built a presence on G2

  • Created multiple intent-driven landing pages

  • Engaged in Reddit threads

  • Redistributed blog content on Medium and LinkedIn

  • Created a custom GPT on ChatGPT

  • Earned backlinks from reputable sources

Case Study #2: Sunil Pratap Singh (SEO + GEO Consulting)

The next case study is worth looking into because it is a personality-led brand that had been dormant on the web for years. In my previous article, I discussed how Google AI Overviews favour institutional brands over personality-led brands in the YMYL category, so this case study offers some proof that a personal brand can still work out in the end.

Description:

Sunil Pratap Singh is a strategic search and AI growth partner based in India. His site, sunilpratapsingh.com, serves as both a personal brand hub and a live proof-of-concept for his proprietary Search Signal Framework that he uses with clients to engineer AI and organic visibility deliberately.

The Challenge

The site had existed for years but was effectively dormant. With an Authority Score of just 14, only 207 monthly organic visitors, and no meaningful AI presence, it lacked the foundations most would consider a prerequisite for visibility. Sunil's challenge was to prove that AI visibility could be built from near zero, on weak traditional SEO foundations, using a repeatable methodology. The site needed to become the best example of his own framework.

The Result:

Starting from roughly 2 AI sessions per day and no entity recognition across any LLM, the site grew to 61 AI sessions across four platforms (ChatGPT, Claude, Gemini, Perplexity) in a 90-day window. ChatGPT alone drove 44 of those sessions, and 63 pages now appear in AI-generated responses.

What Moved The Needle?

Sunil followed a strict signal framework by fixing Presence before Trust, and Trust before Relevance. Each phase was documented in real time, with real data published openly.

1. Presence First: Entity Infrastructure

When you search for someone online, you expect Google to return their name, their job, and their website accurately. For that to happen reliably, the information about that person needs to be consistent and structured in a way machines can read, not just humans.

  • He ran a sitewide JSON LD schema audit.

  • Replaced fragmented markup with unified @graph blocks.

  • Deployed a Person schema with sameAs links across 11 platforms on every page of the site, ensuring consistent entity signals wherever AI crawlers looked.

  • Fixed site architecture issues by reducing 27 pages from depth 4 to depth 2, making the site structurally cleaner for both bots and readers.

  • Published llms.txt and llms-full.txt files. Similar in concept to robots.txt (which tells search engine crawlers which pages to access or ignore), llms.txt is a proposed standard that gives AI language models a structured map of a site's most important content.

It's worth noting there is currently no strong evidence that publishing an llms.txt file directly influences how LLMs retrieve content or cite sources across AI engines, though it may become more important in future.
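For reference, llms.txt under the proposed standard is plain markdown: an H1 with the site name, a short blockquote summary, then sections of annotated links. A minimal sketch of the format (the paths and descriptions here are hypothetical, not Sunil's actual file):

```markdown
# Sunil Pratap Singh

> Strategic search and AI growth partner. This file points AI crawlers to the
> site's most important pages.

## Core Pages

- [Search Signal Framework](https://sunilpratapsingh.com/framework): overview of the methodology
- [Case Studies](https://sunilpratapsingh.com/case-studies): live experiments with real data
```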

Essentially, what Sunil did in Phase 1 was build a clear, consistent entity across all platforms, making sure every major site where he had a presence described him the same way. At the same time, he structured his own site so that AI crawlers could read not just his content but his identity: who he is, what he does, and how it all connects.

The outcome was straightforward but meaningful: when you typed Sunil's name into ChatGPT, it began returning his name, title, and website correctly. That is the foundation everything else is built on.
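To make the schema work above concrete, here is a minimal sketch of a unified @graph block that ties a Person entity (with sameAs links) to the website itself. The profile URLs and job title are placeholders, not Sunil's actual markup:

```python
import json

# One @graph block connects the Person and WebSite entities so crawlers
# see a single consistent identity (URLs below are placeholders).
schema = {
    "@context": "https://schema.org",
    "@graph": [
        {
            "@type": "Person",
            "@id": "https://sunilpratapsingh.com/#person",
            "name": "Sunil Pratap Singh",
            "jobTitle": "Strategic Search and AI Growth Partner",
            "url": "https://sunilpratapsingh.com/",
            "sameAs": [
                "https://www.linkedin.com/in/example",
                "https://medium.com/@example",
                "https://www.quora.com/profile/example",
            ],
        },
        {
            "@type": "WebSite",
            "@id": "https://sunilpratapsingh.com/#website",
            "url": "https://sunilpratapsingh.com/",
            "publisher": {"@id": "https://sunilpratapsingh.com/#person"},
        },
    ],
}

# Embedded on every page inside <script type="application/ld+json">...</script>
print(json.dumps(schema, indent=2))
```

The key detail is the shared `@id`: the WebSite's publisher points back at the Person node, so every page reinforces the same entity rather than declaring fragmented, page-specific markup.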

2. Trust Signals: Original Research and Platform Authority

With presence established, he shifted focus to credibility. He published a comparative GEO frameworks analysis on his own site, original research with named methodologies designed explicitly to be citable. He wrote a Medium article on the AI visibility knowledge cutoff problem, structured for citation rather than just traffic.

He then took a similar approach to the one AIclicks used on Reddit, engaging directly in communities where LLMs are known to pull citations: eight substantive Reddit answers on GEO strategy and AI search, each referencing his framework, and three Quora answers positioning him as a practitioner. A SlideShare research deck titled Zero Click Future added an additional high-authority entity layer.

The observable outcome: Claude.ai traffic appeared for the first time. Gemini began sending sessions. ChatGPT citation frequency increased.

3. Architecture as Signal: The Full Site Rebuild

In Phase 3, Sunil made a structural commitment that most practitioners skip: he rebuilt the entire site from WordPress to a custom Next.js architecture.

Next.js is a React-based web framework that generates pages in a way that is fast, clean, and highly crawlable. Unlike traditional WordPress sites, where content is often assembled dynamically on the fly, Next.js can serve pre-built static pages that load almost instantly and are straightforward for both search engine crawlers and AI crawlers to parse. In short, it removes technical friction between a crawler and the content it is trying to read.

Crucially, the site structure itself was mapped directly to the four signals of his framework. Every page was assigned a signal function, not just a topic.

Next.js is a solid technical choice, but the GEO and AI visibility gains Sunil saw came primarily from his schema work, entity consistency, content strategy, and platform distribution, not from switching frameworks. WordPress, done well, can deliver the same signals.

TLDR: What Sunil Pratap Singh Did:

  • Audited and rebuilt sitewide JSON LD schema with unified @graph blocks

  • Deployed Person schema with sameAs links across 11 platforms on every page

  • Published llms.txt and llms-full.txt for structured AI crawler access

  • Fixed site architecture — reduced page depth, resolved orphan pages

  • Published original GEO research designed for LLM citation

  • Engaged Reddit and Quora with framework referenced practitioner answers

  • Built a SlideShare research deck as an additional entity signal

  • Maintained consistent entity descriptions across LinkedIn, Crunchbase, Medium, Quora, Reddit, Pinterest, and more

  • Rebuilt the entire site around a signal architecture using Next.js

  • Published live case studies with real data as ongoing proof of methodology

Learning from Sunil Pratap Singh

Sunil's strategy follows a deliberate, phased framework: fix brand presence first, then improve trust signals, crawlability, and relevance. Another finding here is that traditional SEO strength is not a prerequisite for AI visibility.

An Authority Score of 14 and 207 monthly visitors would disqualify most sites from serious SEO conversations, but it didn't stop AI platforms from beginning to cite him.

Case Study #3: SSOJet (B2B SaaS)

Description:

SSOJet is an enterprise authentication platform that helps B2B SaaS companies add enterprise SSO (Single Sign On) covering SAML and OIDC protocols without having to rebuild or replace their existing authentication system. It works alongside whatever auth setup a company already uses, whether that is Auth0, Firebase, Supabase, AWS Cognito, or a custom built system.

SSOJet never tried to beat Auth0 or WorkOS at being "the best enterprise SSO platform." Instead, they identified a precise problem: companies that already had authentication set up and just needed to layer enterprise SSO on top. Then they became the definitive educational resource for that exact scenario.

The Challenge

SSOJet had built a genuinely differentiated product in a crowded category dominated by well-funded names like Auth0, WorkOS, and Okta. The problem was not the product; it was visibility.

When developers turned to AI assistants to ask how to implement enterprise SSO, SSOJet simply did not come up. Auth0 and WorkOS dominated those conversations entirely, not because they were better solutions for every scenario, but because they had years of educational content that AI models had learned from. SSOJet had great technology and zero AI presence to match it.

What moved the needle for SSOJet?

Rather than trying to out-spend the category leaders on ads or brand marketing, SSOJet worked with GrackerAI to take a completely different route: out-educate them. The strategy was built on the insight that developers do not want to be sold to; they want to learn, then evaluate, then implement. If you become the most useful educational resource in the space, product discovery follows naturally.

The Result: Over six months, SSOJet's overall AI visibility score climbed from 18% to 67%, a 272% increase. ChatGPT brand mentions grew 623% month over month. Perplexity featured SSOJet in 89% of "implement SSO" queries. Enterprise customer signups grew 287%, demo requests from AI-referred traffic jumped 412%, and the average sales cycle dropped 47%, from 34 days to 18, because prospects were arriving already educated.

Developer First Content Strategy

SSOJet's content approach worked in three layers. The first was meeting developers exactly where they were searching. So they posted 35 framework specific implementation guides, each written for a specific combination of framework and existing auth system. Every guide followed a developer first structure with copy-paste-ready code and working GitHub repositories, because AI assistants learn from content that directly answers the specific questions developers are actually typing.

The second layer went deeper. They learned that senior engineers don't just want instructions; they want to understand why something works. So while their competitors published documentation about their own products, SSOJet published 45 articles on the underlying protocols themselves: SAML, OpenID Connect, and SCIM. That neutral, educational positioning is exactly the kind of content AI systems trust and cite.

Third, they produced 20 comparison articles offering fair breakdowns against Auth0, WorkOS, and Okta. Because the comparisons were not one-sided, AI assistants started treating SSOJet as an unbiased source, which reinforced citation frequency across all three content types.

Programmatic Content at Scale

The fourth phase was the one that created a structural, hard to replicate advantage. SSOJet built over 2,100 implementation pages using a programmatic approach — combining 20+ frameworks, 10+ authentication systems, and 15+ enterprise identity providers to generate guides covering every major stack variation. A developer using Next.js with Firebase who needed to support customers on Azure AD could find a guide written specifically for that combination, complete with working code and troubleshooting steps.
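The combinatorial logic behind that page count can be sketched in a few lines. The stack names below are illustrative, not SSOJet's actual lists (20+ × 10+ × 15+ yields 3,000+ combinations, so the 2,100 published pages presumably cover a curated subset):

```python
from itertools import product

# Illustrative (not SSOJet's actual) lists of stack components.
frameworks = ["Next.js", "Django", "Rails", "Laravel"]
auth_systems = ["Auth0", "Firebase", "Supabase", "AWS Cognito"]
identity_providers = ["Azure AD", "Okta", "Google Workspace"]

# Every (framework, auth system, identity provider) triple becomes one
# candidate implementation guide targeting a long-tail query.
pages = [
    f"Add enterprise SSO to a {fw} app using {auth}, for customers on {idp}"
    for fw, auth, idp in product(frameworks, auth_systems, identity_providers)
]

print(len(pages))  # 4 * 4 * 3 = 48 candidate guide titles
```

The templated title is only the starting point; the value came from filling each page with working, combination-specific code and troubleshooting steps rather than boilerplate.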

This content created dominance over long tail queries where competition was essentially nonexistent. When a developer typed an ultra specific question into Perplexity or ChatGPT, SSOJet's guide was often the only substantive answer that existed for that exact scenario. That specificity is what made AI systems consistently recommend SSOJet for implementation queries — there was nothing better to cite.

GitHub as a Trust Platform

Alongside the written guides, SSOJet built 47 working example repositories on GitHub — one for each major integration combination. Each repo was fully functional, recently maintained, and documented with a comprehensive README.

This matters for AI visibility in a specific way: when AI assistants find a concrete, working code example with recent commits, they can confidently recommend it as a reliable source. A stale or incomplete repository undermines the recommendation. SSOJet's repos were kept current as frameworks evolved, which meant AI systems kept citing them as trustworthy references long after they were first published.

TLDR: What SSOJet Did:

  • Published 35 step-by-step technical guides written for the specific tools and coding environments their target customers were already using

  • Published 45 in-depth educational articles explaining how the underlying security protocols actually work — not just how SSOJet works

  • Created 20 honest comparison articles against their main competitors, fairly acknowledging where rivals were the better fit and where SSOJet had the edge

  • Built over 2,100 pages of highly specific guides covering every major combination of coding framework, existing login system, and enterprise identity provider a customer might be using

  • Built 47 working code repositories on GitHub so developers could see exactly how the integration worked before committing to anything

  • Wrote all content in a direct, answer-first format that AI systems can easily parse and cite

  • Owned a clear and specific positioning: the solution for companies that already have a login system in place and simply need to add enterprise-grade SSO on top of it, without starting over

Learning from SSOJet’s case study

The deepest lesson here is about trust. SSOJet earned AI visibility not by optimising for algorithms in a narrow sense, but by genuinely being the most useful, specific, and honest source of information in their category. AI systems rewarded that because their job is to find the best answer, and SSOJet built the best answers.

The programmatic content strategy is also worth paying attention to. Creating 2,100 stack-specific guides sounds expensive, but the compounding effect is significant. Long-tail queries with almost no competition, each one pulling in developers with a very specific and immediate need. Winning a hundred of those queries is often more valuable than competing for a single broad one.

Takeaway: How to Increase AI Visibility for Your Business in 90 Days

If we combine the common strategies each of these case studies employed, a clear pattern emerges: AI visibility requires a strategic framework, and here is how you can improve it within a short span of time:

Get your entity right before anything else

Sunil’s case study makes this the clearest. Before publishing any content or building backlinks, make sure AI systems can correctly identify who you are. That means your business will need consistent brand descriptions across platforms, proper schema markup, and clean site architecture.

A simple first step is to audit how your brand appears across every platform where you have a presence, such as LinkedIn, your website, and any directory listings. Make sure the name, description, and specialty are consistent everywhere.

Write content that answers the exact question

All three brands understood that AI systems do not cite general articles; they cite specific answers. Getting there requires some reverse engineering: find out what prompts your audience is typing into AI assistants, why your competitors are cited when you are not, and which URLs appear in AI responses for your target queries.

Additionally, rather than competing for the same queries as established players, find the angle they are not covering and build your authority there. SSOJet is a good example: instead of fighting for broad "enterprise SSO" visibility, they became the definitive resource for a narrower but highly specific problem. That came from truly knowing what their customers were looking for.

Build trust signals on third party platforms, not just your own site

All three brands understood that AI systems do not just look at your website; they look at what the rest of the internet says about you. All three built a presence on code repositories and directories, and engaged on Reddit, LinkedIn, Medium, and Quora, because these platforms are increasingly cited in AI responses.

In all, AI systems recommend sources they trust, that answer questions specifically, and that appear consistently across the web. Build your brand entity cleanly, write content that answers real questions directly, show up on the platforms your audience uses, and be patient enough to let it compound.
