If you’ve just inherited a website — whether through an agency handoff, internal team transfer, or acquisition — you’re likely asking the same first question I always do: What am I working with here?
If you’ve ever found yourself staring at an unfamiliar site and wondering where to start, this triage guide is for you.
It doesn’t matter if the project came through a Flippa acquisition or a Slack message from your boss — the key is knowing how to quickly size up a site’s condition, identify its greatest vulnerabilities and opportunities, and determine what needs immediate attention — and what can wait.
This isn’t a checklist or a tutorial. It’s a flexible framework that adapts to different scenarios, whether you’re doing M&A due diligence, client onboarding, or internal diagnostics, and it points you toward related areas worth exploring on your own. My approach blends fast, reliable tactics with enough depth to reveal what’s broken under the surface.
Where I Always Start: GA4 & GSC Access
Before I fire up Screaming Frog or look at rankings in any SEO tool, I’m on the hunt for access to the two most valuable assets:
- Google Analytics (GA4): Holds usage patterns, key conversion flows, content engagement trends, and page speed insights.
- Google Search Console (GSC): Provides critical first-party data on search performance, top queries, indexation health, and crawl activity.
“Without GA4 and GSC, you’re guessing — not diagnosing.”
If you don’t have access to either or both, that becomes priority #1 — even if it means setting them up from scratch. Sure, you won’t get historical data — but once verified, you’re capturing first-party insights and building the baseline for every future improvement.
Next, head on over to Google and search for the domain.
site:example.com
This gives you a real-time sense of indexation: what’s appearing, what shouldn’t be, and what may be outdated or duplicative.
Look for clues visible in the SERP:
- Title and meta alignment: Are titles readable, accurate, and unique? Do they reflect current branding?
- Breadcrumbs vs. raw URLs: Breadcrumbs usually signal structured data and clean taxonomy. If you only see full URLs, you may be missing schema support.
- Date snippets: If dates appear on content that shouldn’t be time-sensitive (like service pages), Google may be pulling from errant timestamps — or misinterpreting your layout.
- Image thumbnails: These can sometimes show when Google detects valuable visual media. If they’re missing from key pages, check your og:image, structured data, and image placement. Keep in mind that many SERP features won’t appear on a simple site: search.
- Tag/category archive spam: Seeing dozens of near-identical blog index pages? That’s a taxonomy control problem, and a potential source of cannibalization or bloat.
“Google’s SERPs reveal what it thinks your site is. The site: command gives you the unfiltered version of that story.”
Hidden in Plain Sight: The GSC Settings Tab
Despite its importance, GSC’s Settings tab is still one of the most overlooked sections — and often the first place real issues show up.
I always check:
- Crawl Stats: traffic drops, crawl delays, or server response issues
- Indexing Crawler Type: desktop vs. mobile parity
- Verification + Access: confirm full rights and shared access across teams
Tip: GSC hides core insights where no one expects them. The Settings tab is your early warning system.
Surface Level Symptoms: Indexation & Cannibalization
This is where I start matching content to performance. I’m looking for:
- Pages with high impressions but low CTR
- Multiple pages ranking for similar queries
- Orphaned content with zero internal links
Inside GSC → Performance report, I:
- Filter by high impressions & low CTR
- Add Page and Query dimensions
- Sort and scan for overlap and missed intent
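The same filter can be scripted. Assuming you’ve exported the Performance report (or pulled rows via the Search Console API) with page, query, impressions, and clicks columns, a minimal sketch like this flags the high-impression, low-CTR candidates — the sample rows and thresholds here are illustrative, not prescriptive:

```python
from collections import defaultdict

def flag_low_ctr(rows, min_impressions=1000, max_ctr=0.01):
    """Group page/query pairs with high impressions but low CTR by page.

    rows: dicts with 'page', 'query', 'impressions', 'clicks'
    (the columns a GSC Performance export provides).
    """
    flagged = defaultdict(list)
    for row in rows:
        impressions = row["impressions"]
        ctr = row["clicks"] / impressions if impressions else 0.0
        if impressions >= min_impressions and ctr <= max_ctr:
            flagged[row["page"]].append((row["query"], impressions, round(ctr, 4)))
    return dict(flagged)

# Illustrative sample data, not real export rows:
rows = [
    {"page": "/services", "query": "seo audit pricing", "impressions": 5400, "clicks": 12},
    {"page": "/services", "query": "seo services", "impressions": 800, "clicks": 40},
    {"page": "/blog/cwv", "query": "core web vitals", "impressions": 2100, "clicks": 9},
]
print(flag_low_ctr(rows))
```

Pages that appear repeatedly in the output are your first stops for overlap and missed-intent review.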
site:example.com inurl:blog intitle:tag
Use this to find tag and category pages that shouldn’t be indexable — often a hidden source of cannibalization or crawl waste.
Warning: Cannibalization doesn’t always look like two identical posts. Sometimes it’s duplicate intent scattered across weakly supported taxonomy pages.
Need more? Get guidance straight from the source → Google’s Guide to Duplicate Content
Performance Matters: CWV + Page Experience Signals
Instead of digging through lab-based data tools like Sitebulb, I use:
- CrUX Dashboard: for real-world performance data
- PageSpeed Insights: for current lab test metrics and optimization recommendations
- GA4: for pages with engagement drop-offs, slow load times, or sluggish navigation
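If you’d rather pull these numbers programmatically, the PageSpeed Insights v5 API returns both lab and field data in one call. A minimal stdlib sketch — the example URL is a placeholder, and the field names follow the public v5 response shape:

```python
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile"):
    """Build a PageSpeed Insights v5 request URL."""
    params = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"

def field_p75(psi_response, metric="LARGEST_CONTENTFUL_PAINT_MS"):
    """Pull the 75th-percentile field value for a CWV metric, if present."""
    metrics = psi_response.get("loadingExperience", {}).get("metrics", {})
    return metrics.get(metric, {}).get("percentile")

# Uncomment to run a live check:
# with urllib.request.urlopen(psi_request_url("https://example.com")) as resp:
#     print(field_p75(json.load(resp)))
```

Field data only appears when the URL has enough real-user traffic in CrUX, so a missing metric is itself a signal.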
“CrUX is your historical truth. It’s available for any URL — even competitors — and it’s where you’ll find a meaningful historical record of real-user health.”
Never Used CrUX Before? Google’s template takes about two minutes to set up → Google Chrome UX Report Data Studio Template
Did You Know? You can use CrUX to test competitor URLs when you suspect mobile SERP gaps. It’s particularly revealing when template-level DOM bloat shows up as slower rendering or interactivity.
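The CrUX API is the scriptable counterpart to the dashboard, and it accepts any public origin, competitors included. A sketch assuming you have an API key — the queryRecord endpoint and metric names follow the public CrUX API:

```python
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def crux_request(origin, api_key, form_factor="PHONE"):
    """Build a CrUX API POST request for an origin."""
    body = json.dumps({"origin": origin, "formFactor": form_factor}).encode()
    return urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
    )

def p75(response, metric="largest_contentful_paint"):
    """75th-percentile value for a metric from a queryRecord response."""
    metrics = response.get("record", {}).get("metrics", {})
    return metrics.get(metric, {}).get("percentiles", {}).get("p75")

# Uncomment with a real key to compare your origin against a competitor's:
# with urllib.request.urlopen(crux_request("https://example.com", "YOUR_KEY")) as resp:
#     print(p75(json.load(resp)))
```

Running the same query for your site and a competitor’s origin gives you an apples-to-apples p75 comparison on the exact field metrics Google uses.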
Templates + Taxonomies: Structural Intelligence
Pages don’t rank in isolation — structures do. Once I’ve identified problematic pages, I zoom out and ask:
- Are Services and Blog templates bloated with unnecessary fields or duplicate headings?
- Are Projects or Portfolios grouped logically in the URL and nav?
- Are templated pages (like Staff Bios or Events) pulling in smart internal links?
DevTools trick:
Chrome → DevTools → Lighthouse → View Treemap
This shows you DOM size across pages — a fast, visual way to flag bloated templates or overengineered experiences. What you find here often explains why section-specific performance differs from the rest of the site.
New to DOM? Start with Google’s take on tackling DOM issues → Web.dev on Optimizing DOM Size
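For a quick offline comparison across templates, you can also count element nodes in saved HTML. This is a rough proxy for the DOM size Lighthouse reports, not the same measurement — the sample markup is illustrative:

```python
from html.parser import HTMLParser

class NodeCounter(HTMLParser):
    """Count element start tags in an HTML document (a rough DOM-size proxy)."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        self.count += 1

def dom_size(html):
    counter = NodeCounter()
    counter.feed(html)
    return counter.count

# Compare a saved blog template against a saved service template:
blog = "<html><body><article><h1>Post</h1><p>Text</p></article></body></html>"
print(dom_size(blog))  # → 5
```

Run it over one saved page per template and large gaps between templates point you to the bloated ones.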
Intent ≠ Content: Closing the Gap
I export GSC queries and crosswalk them to site structure:
- Group keywords by ranking page
- Scan for unrelated queries (wrong intent)
- Compare page title/meta vs. actual user search behavior
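The crosswalk in the steps above can be sketched with a simple term-overlap check: join your GSC query export to crawled page titles, then flag queries that share no meaningful terms with the title of the page ranking for them. The row shape and stopword list here are assumptions for illustration:

```python
def intent_mismatches(rows, stopwords=frozenset({"the", "a", "for", "of"})):
    """Flag queries that share no meaningful terms with their ranking page's title.

    rows: dicts with 'page', 'title', 'query' (joined from a GSC export
    and a crawl of page titles).
    """
    flagged = []
    for row in rows:
        title_terms = {t.lower() for t in row["title"].split()} - stopwords
        query_terms = {t.lower() for t in row["query"].split()} - stopwords
        if not title_terms & query_terms:
            flagged.append((row["page"], row["query"]))
    return flagged

# Illustrative sample rows:
rows = [
    {"page": "/about", "title": "About Our Team", "query": "seo audit pricing"},
    {"page": "/services", "title": "SEO Services", "query": "seo services near me"},
]
print(intent_mismatches(rows))  # → [('/about', 'seo audit pricing')]
```

Exact term overlap is a blunt instrument — treat the output as a shortlist for manual intent review, not a verdict.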
Tip: If your About page ranks for “[service] pricing” — fix your internal linking, adjust meta, or create the page users actually want.
This is one of the biggest trust leaks in SEO. If Google sends someone to a page and they bounce? You’ve just proven you weren’t the answer.
Authority + Relevance: Backlinks and Mentions
Whether I’m doing due diligence or mapping brand equity, I look at:
- Referring Domains: from Ahrefs or Majestic
- Anchor Text Mix: to catch over-optimization or thin coverage
- Unlinked Mentions: easy wins for outreach
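Anchor text mix is easy to quantify from a backlink export. Assuming a list of anchor strings pulled from an Ahrefs or Majestic CSV (the sample anchors are made up), a quick distribution check looks like this:

```python
from collections import Counter

def anchor_mix(anchors):
    """Share of each normalized anchor text across backlinks."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: round(n / total, 2) for anchor, n in counts.most_common()}

# Hypothetical export column:
anchors = ["Acme SEO", "acme seo", "best seo agency", "https://acme.example", "best seo agency"]
print(anchor_mix(anchors))
```

A profile dominated by one exact-match commercial anchor is the over-optimization smell; a profile that is nearly all naked URLs suggests thin editorial coverage.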
“If Google trusts you, your page doesn’t have to be perfect. If it doesn’t, your page has to overdeliver.”
Final Word
Inherited sites aren’t blank slates — they’re loaded questions. Your job isn’t to answer all of them. It’s to identify the ones that matter and fix them in the right order.
Need fast answers? My Micro SEO Audit distills this triage process into actionable insights, a video recording, and prioritized recommendations that are focused, tactical, and tailored to your site and situation.