When UR Climbed from 18 to 31: How a Signal Audit Exposed Dead Backlinks and Repaired a Traffic Drain
This is a hands-on case study about a small SaaS site where a seemingly good metric - a jump in UR from 18 to 31 - masked deeper problems. It took three years of piecemeal fixes and one full signal audit to find the dead backlinks that were dragging rankings down. I’ll walk through context, the exact problem, the method we used, a 90-day implementation timeline, concrete results with numbers, the blunt lessons, and a practical self-assessment you can use today.
How a Niche SaaS Site Saw UR Jump from 18 to 31 and Why That Moment Changed Everything
Context: the site was a bootstrapped B2B SaaS targeting a niche market with ~1,200 keywords in scope. Organic traffic hovered near 4,200 sessions/month. The site had been around five years and had attracted a mix of editorial links, forum links, and cheap directory links. In month zero we noticed a curious data point: Ahrefs reported the homepage's URL Rating (UR) rising from 18 to 31 over two weeks.

Why it mattered: many teams celebrate UR increases as automatic wins. The product team celebrated. The founder assumed "links are improving." Search consultants suggested doubling down on link building. I paused. My experience told me that raw link authority can increase while link "signals" weaken. UR does not tell you whether links send traffic, are indexed, or still exist on source pages. The spike was a trigger, not validation.
Why Standard Backlink Audits Failed - The Traffic Drop That Told a Different Story
The specific problem: despite the UR jump, organic traffic fell from 4,200 to 3,100 sessions/month over three months. Rankings for competitive commercial terms slid from page 1 to pages 2 and 3. CTR on core landing pages declined 18%. That contradiction - higher UR, lower organic performance - exposed a blind spot in standard SEO audits.
Standard backlink audits had been run twice before. They flagged toxic links by spam score and suggested disavows. We disavowed 240 domains across two rounds. Yet rankings did not recover. Why? Two core failures:
- Counting over context: reports counted links but didn't verify whether the referring pages were live, indexed, or blocked by robots.txt.
- Timing and signal decay: many referring pages still existed but had the link stripped during template updates or CMS migrations at the source sites. Those dead links still inflated the backlink count in crawls but stopped sending any link signal.
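The robots.txt check is the easiest of these verifications to automate. Here is a minimal sketch using Python's standard library; the user agent string and example URL are placeholders, not our actual tooling:

```python
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def blocked_by_robots(page_url: str, user_agent: str = "Googlebot") -> bool:
    """True if the referring site's robots.txt forbids crawling page_url."""
    parts = urlparse(page_url)
    rp = RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()  # fetches and parses robots.txt over the network
    return not rp.can_fetch(user_agent, page_url)

# A blocked referrer still shows up in backlink exports,
# but the linking page can't pass an indexed signal:
# blocked_by_robots("https://example.com/some-referrer-page")
```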
Put plainly: the UR rise was mostly from a handful of high-authority pages linking once, but the network of long-tail referrers had silently degraded. We had numbers that looked good and a signal network that was leaky.
A Signal Audit Strategy: Mapping Link Health, Source Signals, and Anchor Clusters
The chosen approach was a fantom.link signal audit - not another spam-score sweep. The goal: identify which backlinks still delivered meaningful signals (traffic, indexation, anchor relevance), which were latent (present but not indexed or blocked), and which were dead (link removed, 404, or noindex).
Key principles guiding the audit:
- Measure multi-dimensional signals, not just counts: crawl status, HTTP status, indexation of the referrer, traffic from the referrer, anchor intent.
- Prioritize high-impact domains: identify domains that historically drove conversions or ranked well for core terms, then verify the link still exists.
- Combine automated crawling with manual verification of a representative sample: automation misses subtle signals like cloaked redirects and removed anchors.
What we tracked for each referring domain and page
- Referrer URL
- HTTP status (200, 301, 404, 410, 503)
- Canonical and meta robots directives (noindex, nofollow)
- Presence of the link in the page HTML, and its anchor text
- Referrer page's indexed status in Google (site: query and cache check)
- Estimated organic traffic from the referrer (sessions/month)
- Historical placement: editorial, footer, comment, author bio, widget
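To make the tracking concrete, here is a sketch of how one might structure each audit row and bucket it as active, latent, or dead. The field names and bucketing rules are illustrative, not our production schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReferrerSignal:
    """One audit row: everything tracked per referring page."""
    url: str
    http_status: int           # 200, 301, 404, 410, 503 ...
    meta_robots: str           # e.g. "noindex,nofollow", or "" if absent
    link_present: bool         # anchor found in the page HTML
    anchor_text: Optional[str]
    indexed: bool              # result of the site:/cache spot check
    est_traffic: int           # estimated organic sessions/month
    placement: str             # editorial, footer, comment, author bio, widget

def classify(ref: ReferrerSignal) -> str:
    """Bucket a referrer as active, latent, or dead."""
    if ref.http_status != 200 or not ref.link_present:
        return "dead"    # 404/410, or a template update stripped the link
    if not ref.indexed or "noindex" in ref.meta_robots:
        return "latent"  # link exists but the page passes no indexed signal
    return "active"
```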
Implementing the Signal Audit: A 90-Day Timeline
We executed a 90-day plan broken into discovery, verification, remediation, and monitoring. Below is an exact timeline with tasks and resource allocation.
Days 1-14 - Full data pull and triage

- Exported the full link profile from three providers: Ahrefs, Moz, and Google Search Console (GSC). Total raw backlinks: 23,400; distinct domains: 2,970.
- Normalized URLs and removed obviously irrelevant domains (ad networks, CDN hostnames, internal duplicates), reducing the list to 18,150 backlinks across 1,940 domains.
- Flagged 420 domains with previous disavow entries for re-check.

Days 15-40 - Automated crawl and signal scoring

- Ran an automated crawler against the 1,940 domains to capture HTTP status, link presence, and meta robots. This reduced the "active link" set to 7,200 page-level links.
- Scored each referring page with a composite signal score (0-100) using weighted inputs: indexation (30), presence of anchor (25), traffic estimate (20), contextual relevance (15), HTTP status (10). A scoring sketch follows this timeline.
- Identified 1,180 pages with a composite score below 20, labeled dead or inert.

Days 41-60 - Manual verification and outreach

- Manually reviewed a stratified sample of 300 pages (100 high UR, 100 medium, 100 low). Outcome: automation had 12% false positives where links were present but dynamically injected via JS, and 9% false negatives where it missed links blocked by robots.
- Outreach: sent targeted emails to webmasters of 520 high-value pages with degraded links, asking for link restoration or correction. Response rate: 23% within 30 days. Fix rate among responders: 48%.

Days 61-90 - Disavow, documentation, reindex requests

- Compiled a conservative disavow list: 1,010 domains confirmed dead or maliciously changed to thin-link templates. The list excluded domains where outreach showed intent to fix.
- Uploaded the disavow file to Google and submitted targeted reindexing requests for ten key landing pages that had dropped in rankings.
- Set up ongoing monitoring: daily link checks for the top 200 referrers and a weekly composite-score refresh for the top 1,000 pages.
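For reference, here is a condensed sketch of the crawl-and-score step in Python. The weights match the ones above; the 50-sessions-per-month traffic cap and the 0-to-1 relevance input are assumptions made for illustration (relevance was scored manually in our audit, and `indexed`/`est_traffic` came from index spot checks and provider estimates):

```python
import requests
from bs4 import BeautifulSoup

def fetch_page_signals(referrer_url: str, target_domain: str) -> dict:
    """Capture HTTP status, meta robots, and link presence for one referrer."""
    resp = requests.get(referrer_url, timeout=10,
                        headers={"User-Agent": "signal-audit/0.1"})
    soup = BeautifulSoup(resp.text, "html.parser")
    robots_tag = soup.find("meta", attrs={"name": "robots"})
    hrefs = [a.get("href", "") for a in soup.find_all("a")]
    return {
        "http_status": resp.status_code,
        "meta_robots": robots_tag.get("content", "").lower() if robots_tag else "",
        "link_present": any(target_domain in h for h in hrefs),
    }

def composite_score(page: dict) -> float:
    """0-100 score with the audit's weights: indexation 30, anchor 25,
    traffic estimate 20, contextual relevance 15, HTTP status 10."""
    indexation = 1.0 if page["indexed"] and "noindex" not in page["meta_robots"] else 0.0
    anchor = 1.0 if page["link_present"] else 0.0
    traffic = min(page["est_traffic"] / 50.0, 1.0)  # assumed cap: 50 sessions/month = full marks
    relevance = page["relevance"]                   # 0..1, scored during manual review
    status = 1.0 if page["http_status"] == 200 else 0.0
    return 30 * indexation + 25 * anchor + 20 * traffic + 15 * relevance + 10 * status
```

In our run, pages scoring below 20 went into the dead-or-inert bucket described above.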
From UR 18 to 31: Clean Numbers and Real Outcomes After 6 Months
Here are the measurable results we tracked, with timelines and exact percentages. The UR jump itself happened early in the sequence but did not correlate to recovery until after the audit and remediation.
| Metric | Baseline (Month 0) | After 3 Months | After 6 Months |
| --- | --- | --- | --- |
| UR (homepage) | 18 | 31 | 33 |
| Organic sessions / month | 4,200 | 3,600 | 5,100 |
| Core keyword average position | 18.4 | 20.1 | 11.7 |
| Top-10 keywords | 147 | 134 | 196 |
| Conversions / month (trial signups) | 32 | 28 | 47 |
Key interpretation:
- The raw UR increase to 31 was likely driven by a few high-authority links crawled at the same time. That metric alone was misleading.
- Traffic declined at first because the long-tail linking pages, which sent most referral traffic and built topical relevance, had decayed.
- The signal audit identified and removed dead links that created noise in the indexation model. That pruning let Google re-evaluate the true set of active, contextual links.
- Six months after the audit and the follow-up outreach/disavow, organic sessions rose to 5,100/month, conversions rose to 47 signups/month, and top-10 keywords climbed from 147 to 196. The business saw a 46% increase in organic signups vs. baseline (32 to 47).
5 Hard Lessons About Backlinks That Most SEOs Won't Tell You
These are not platitudes. Each is based on the audit and the numbers we collected.
1. Quantity lies; quality decays. Backlink counts go up and down. What actually matters is whether links sit on indexed pages, are visible in the HTML, and are contextually relevant. We found 58% of counted links sent no signal.
2. UR and similar aggregate metrics can mask fragmentation. UR-type jumps can come from a few isolated high-DR pages, hiding systemic decay across hundreds of lower-traffic referrers that build topical authority.
3. Automated audits miss dynamic and cloaked links. Our automation had a ~10% error rate on signal presence. Manual checks on a representative sample are essential.
4. Conservative disavow beats aggressive guessing. We disavowed 1,010 domains after manual verification. An aggressive blind disavow would have removed some useful links; conservative, documented removal produced better outcomes.
5. Monitoring is not optional. Set a rolling check on your top referrers; a sketch follows this list. Signals degrade slowly, and weekly or monthly monitoring prevents long-tail decay from becoming a crisis.
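A rolling check can be as simple as diffing two scoring rounds. A sketch, with an assumed 15-point alert threshold:

```python
def flag_decay(previous: dict, current: dict, drop_threshold: float = 15.0) -> list:
    """Diff two scoring rounds (url -> composite score) and flag big drops."""
    alerts = []
    for url, old_score in previous.items():
        new_score = current.get(url, 0.0)  # a vanished referrer scores zero
        if old_score - new_score >= drop_threshold:
            alerts.append((url, old_score, new_score))
    return alerts
```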
How You Can Run a Signal Audit and Identify Dead Backlinks Today
Below is a practical checklist and a short self-assessment quiz to help you decide whether you need a signal audit and where to focus resources.
90-Day condensed checklist
1. Export backlink lists from GSC, Ahrefs, and your favorite provider. Normalize and dedupe.
2. Run an automated crawl to capture HTTP status, indexation hints, and presence of anchor text.
3. Create a composite signal score and rank pages by impact, combining estimated traffic, anchor relevance, and indexation.
4. Manually verify a stratified sample: top 100, middle 100, bottom 100.
5. Reach out to owners of high-impact pages where the link is missing or wrong. Track responses.
6. Prepare a conservative disavow for confirmed dead/toxic domains, documenting why each domain is included (a generator sketch follows this list).
7. Submit reindex requests for priority pages and set up weekly checks for your top 200 referrers.
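For step 6, the disavow file itself is plain UTF-8 text with one domain: entry per line and # lines as comments, which Google Search Console accepts. A minimal generator sketch:

```python
def write_disavow(confirmed_dead_domains: list, path: str = "disavow.txt") -> None:
    """Write a disavow file in the format Search Console accepts:
    UTF-8 text, one 'domain:' entry per line, '#' lines as comments."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("# Conservative disavow: manually verified dead/templated domains\n")
        for domain in sorted(set(confirmed_dead_domains)):
            f.write(f"domain:{domain}\n")
```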
Quick self-assessment quiz - Score 1 point per "Yes"
- Do you track the indexation status of referring pages, not just backlink counts?
- Do you manually verify at least a sample of your top referring pages every quarter?
- Have you documented outreach attempts and outcomes for link repairs?
- Does your monitoring track which links generate actual referral traffic?
- Have you avoided blanket disavows and kept a conservative, evidence-based list?
Score interpretation:
- 5: Your backlink hygiene is strong. Run a signal audit annually and keep monitoring.
- 3-4: You're partially covered. Prioritize manual verification of high-impact pages and set up monitoring for the top 200 referrers.
- 0-2: You need a full signal audit now. The UR number alone is not enough evidence that your link profile is healthy.
Final candid note: traditional SEO reporting makes it easy to celebrate surface-level metrics. I learned the hard way that numbers like UR can be both a useful input and a distracting headline. If your rankings and conversions don't match the headline metric, treat that spike as an alarm to investigate signal health - not a reason to scale more of the same link tactics.

If you want a template export for the composite scoring sheet we used, tell me your preferred tool (Google Sheets or Excel) and I’ll share a ready-to-use template with the weighting and columns we found most predictive.