Why Technical SEO Still Matters (And What It Can't Do)
Before I get into the checklist, I want to establish the right mental model for technical SEO — because most teams either over-invest in it or under-invest in it, and the mismatch comes from misunderstanding its role.
Technical SEO is a ceiling, not a foundation. It determines the maximum performance your content and authority can reach. Without solid technical foundations, the best content in the world won't rank at its potential. With solid technical foundations, your content and authority investments pay off at their full value.
This means: fix critical technical issues before investing heavily in content and links. But don't expect technical optimization alone to produce organic growth. It removes the constraint. Content and authority create the growth.
With that framing, here's the audit I run for new clients.
Crawl and Indexation
Robots.txt review
- Is robots.txt accessible and correctly formatted?
- Are any important pages or directories accidentally blocked?
- Is the XML sitemap location declared?
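Most of these robots.txt checks can be scripted with Python's standard-library robotparser. A minimal sketch, with an illustrative robots.txt and invented example.com paths (in practice you'd fetch the live file and use your real URL list):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; in practice, fetch it from /robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /search

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Pages you expect to be crawlable; flag any that robots.txt blocks
important_paths = ["/", "/products/widget", "/blog/post-1", "/search"]
blocked = [p for p in important_paths
           if not parser.can_fetch("*", f"https://example.com{p}")]

print("Blocked important paths:", blocked)       # ['/search']
print("Declared sitemaps:", parser.site_maps())  # ['https://example.com/sitemap.xml']
```

The same parser answers both questions at once: whether key pages are accidentally disallowed, and whether a Sitemap directive is declared.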
XML sitemap audit
- Does the sitemap exist and is it submitted to Google Search Console?
- Does it contain only indexable URLs (no 4xx, 5xx, or canonicalized-away pages)?
- Is it updated automatically when new pages are published?
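A quick way to run the "only indexable URLs" check is to diff the sitemap against your crawler's list of canonical, indexable pages. A sketch using stdlib XML parsing (the sitemap and the `indexable` set are toy data; the real set comes from a crawl):

```python
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
  <url><loc>https://example.com/old-page</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {el.text for el in ET.fromstring(SITEMAP).findall("sm:url/sm:loc", NS)}

# Canonical, indexable URLs as reported by your crawler (assumed input)
indexable = {"https://example.com/", "https://example.com/blog/post-1"}

# Sitemap entries that 404, redirect, or canonicalize away should be removed
print("In sitemap but not indexable:", sitemap_urls - indexable)
print("Missing from sitemap:", indexable - sitemap_urls)
```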
Index coverage analysis
- Pull the index coverage report from Search Console — what's the ratio of indexed vs. submitted pages?
- Identify and categorize any "excluded" URLs — are they correctly excluded or are important pages being missed?
- Check for pages indexed that shouldn't be (staging environments, parameter variations, thin pages)
Crawl depth and efficiency
- Are important pages reachable within 3 clicks from the homepage?
- Is crawl budget being wasted on paginated pages, faceted navigation variants, or URL parameters?
- Are there significant numbers of redirect chains consuming crawl budget?
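Click depth falls straight out of a breadth-first search over the internal link graph from a crawl export. A sketch with a toy link graph (in practice this comes from your crawler):

```python
from collections import deque

# page -> pages it links to (toy internal link graph)
links = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/widget"],
    "/blog/post-1": ["/deep-page"],
    "/deep-page": ["/very-deep-page"],
}

depth = {"/": 0}  # clicks from the homepage
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

too_deep = [p for p, d in depth.items() if d > 3]
print("More than 3 clicks deep:", too_deep)  # ['/very-deep-page']
```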
URL Architecture and Redirects
URL structure
- Are URLs descriptive and keyword-informed?
- Are URL structures consistent across the site (no mixed conventions for categories, posts, products)?
- Are URLs static or dynamic? Dynamic URL parameters should be consolidated with canonical tags (Search Console's URL Parameters tool was retired in 2022, so it's no longer an option for this)
Redirect audit
- Identify all 301 redirects — are they pointing to the correct final destinations?
- Check for redirect chains (A → B → C) — these should be collapsed to direct redirects (A → C)
- Identify any 302 (temporary) redirects that should be 301 (permanent)
- Check for redirect loops
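Chains and loops can both be detected by following your redirect map hop by hop. A sketch (the redirect map is illustrative; export the real one from your server config or a crawl):

```python
def resolve(redirects, start, max_hops=10):
    """Follow a redirect map from `start` to its final destination,
    reporting chains (hops > 1) and loops along the way."""
    seen, url = [start], start
    while url in redirects and len(seen) <= max_hops:
        url = redirects[url]
        if url in seen:
            return {"final": None, "loop": True, "hops": len(seen)}
        seen.append(url)
    return {"final": url, "loop": False, "hops": len(seen) - 1}

# Illustrative redirect map
redirects = {"/a": "/b", "/b": "/c", "/x": "/y", "/y": "/x"}
print(resolve(redirects, "/a"))  # chain: collapse to a direct /a -> /c redirect
print(resolve(redirects, "/x"))  # loop between /x and /y
```

Anything reporting more than one hop is a chain to collapse; anything flagged as a loop is a P1 fix.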
Canonical tags
- Is every indexable page either the canonical URL or pointing to the correct canonical?
- Are canonical tags consistent with sitemap inclusions?
- Are self-referencing canonical tags implemented on all indexable pages?
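The self-referencing check is mechanical: extract each page's rel=canonical and compare it to the page's own URL. A stdlib sketch (real audits usually lean on a crawler, but the logic is the same; the page and HTML are invented):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Pull the rel=canonical href out of a page's <head>."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def canonical_of(html):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

# An indexable page should either self-canonicalize or point at its preferred variant
page_url = "https://example.com/blog/post-1"
html = '<html><head><link rel="canonical" href="https://example.com/blog/post-1"></head></html>'
print(canonical_of(html) == page_url)  # True: self-referencing canonical
```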
Core Web Vitals and Page Experience
Largest Contentful Paint (LCP)
- Target: under 2.5 seconds
- Common fixes: optimize above-the-fold image delivery (modern formats, correct sizing, preload for hero images), improve server response time, eliminate render-blocking resources
Interaction to Next Paint (INP)
- Target: under 200 milliseconds
- INP replaced First Input Delay as the responsiveness metric in 2024 — it measures the latency of all interactions during a page visit, not just the first
- Common fixes: reduce JavaScript execution time, break up long tasks, defer non-critical JS
Cumulative Layout Shift (CLS)
- Target: under 0.1
- Common causes: images without explicit dimensions, embeds without reserved space, dynamically injected content above existing content
- Fix by specifying width/height attributes on all images and reserving space for late-loading elements
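The three thresholds above can be wrapped in a small classifier for field data pulled from CrUX or your RUM tooling. A sketch (metric values here are seconds for LCP and INP, unitless for CLS; the sample readings are invented):

```python
# "Good" ceilings per the current Core Web Vitals definitions
THRESHOLDS = {
    "LCP": 2.5,  # seconds
    "INP": 0.2,  # seconds (200 ms)
    "CLS": 0.1,  # unitless score
}

def assess(field_data):
    """Classify field metrics against the 'good' thresholds."""
    return {metric: ("good" if value <= THRESHOLDS[metric] else "needs work")
            for metric, value in field_data.items()}

print(assess({"LCP": 3.1, "INP": 0.15, "CLS": 0.02}))
# {'LCP': 'needs work', 'INP': 'good', 'CLS': 'good'}
```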
Mobile usability
- No clickable elements too close together (minimum 48px tap targets)
- Text readable without zooming (minimum 16px font size for body)
- Content fits within viewport without horizontal scrolling
On-Page Technical Foundations
Title tags
- Unique title tag on every indexable page
- Primary keyword naturally present, ideally near the beginning
- Under 60 characters to avoid truncation in SERPs
- Not duplicated across pages
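Length and duplication checks scale easily once you have a crawl export of url-to-title pairs. A sketch with invented data:

```python
from collections import Counter

# url -> <title> text, as collected by a crawl (illustrative data)
titles = {
    "/": "Acme Widgets | Buy Industrial Widgets Online",
    "/blog/choose": "How to Choose a Widget | Acme",
    "/blog/compare": "How to Choose a Widget | Acme",  # duplicate
}

counts = Counter(titles.values())
duplicated = sorted(url for url, t in titles.items() if counts[t] > 1)
too_long = sorted(url for url, t in titles.items() if len(t) > 60)
print("Duplicated titles:", duplicated)
print("Over 60 characters:", too_long)
```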
Meta descriptions
- Unique meta description on every indexable page
- Under 160 characters
- Compelling enough to improve click-through rate (Google uses this when relevant to the query)
H1 structure
- One H1 per page, containing the primary keyword
- H1 distinct from the title tag (similar phrasing is fine, but not an exact duplicate)
- Logical H2/H3 hierarchy below — no skipping heading levels
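Both the single-H1 rule and the no-skipped-levels rule can be checked with the stdlib HTML parser. A sketch (the sample markup is invented):

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Record heading levels in document order and flag skipped levels."""
    def __init__(self):
        super().__init__()
        self.headings, self.issues = [], []

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            level = int(tag[1])
            if self.headings and level > self.headings[-1] + 1:
                self.issues.append(f"jump from h{self.headings[-1]} to h{level}")
            self.headings.append(level)

audit = HeadingAudit()
audit.feed("<h1>Guide</h1><h2>Basics</h2><h4>Details</h4>")
print("h1 count:", audit.headings.count(1))  # should be exactly 1
print("Hierarchy issues:", audit.issues)     # ['jump from h2 to h4']
```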
Image optimization
- All images have descriptive alt text
- Images are in modern formats (WebP or AVIF preferred over JPEG/PNG)
- Images are correctly sized for their container (not oversized images scaled down with CSS)
- Lazy loading implemented for below-fold images
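The alt-text and explicit-dimensions checks (the latter also feeds the CLS work above) can be scanned for in page source. A sketch with invented markup; format and sizing checks need the actual image files, so this only covers the HTML side:

```python
from html.parser import HTMLParser

class ImgAudit(HTMLParser):
    """Flag images missing alt text or explicit dimensions."""
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        src = a.get("src", "?")
        if "alt" not in a:  # empty alt="" is valid for decorative images
            self.problems.append(f"{src}: missing alt text")
        if "width" not in a or "height" not in a:
            self.problems.append(f"{src}: no explicit width/height (CLS risk)")

audit = ImgAudit()
audit.feed('<img src="/hero.webp" width="1200" height="600" alt="Product hero">'
           '<img src="/footer.jpg">')
print(audit.problems)
```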
Structured data
- Article schema on blog posts and editorial content
- Product schema on e-commerce product pages
- FAQ schema on pages with question-answer content
- Review schema where aggregated review data exists
- LocalBusiness schema for sites with physical locations
- Validate all structured data with Google's Rich Results Test
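The Rich Results Test is the authoritative validator, but a pre-check that your JSON-LD at least parses and carries the fields you expect catches the most common failures before deployment. A sketch (the required-field set here is illustrative, not Google's full requirements):

```python
import json

# Illustrative JSON-LD blob extracted from a <script type="application/ld+json">
JSON_LD = """{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Choose a Widget",
  "datePublished": "2024-05-01",
  "author": {"@type": "Person", "name": "Jane Doe"}
}"""

data = json.loads(JSON_LD)  # malformed JSON is the most common failure mode
required = {"headline", "datePublished", "author"}  # illustrative field set
missing = required - data.keys()
print("Missing Article fields:", missing or "none")
```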
JavaScript and Rendering
This section is increasingly important as more sites are built on JavaScript frameworks.
Rendering audit
- Use Google Search Console's URL Inspection tool to see how Google renders your pages — compare rendered HTML to source HTML
- Identify any content that exists in JavaScript but isn't being rendered by Googlebot
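A crude but useful version of that comparison: diff the words visible in the raw server response against the rendered HTML captured from the URL Inspection tool or a headless browser. A sketch (both HTML strings are invented, and the tag-stripping regex is deliberately naive):

```python
import re

def visible_words(html):
    """Crude word extraction: strip tags, split on whitespace."""
    return set(re.sub(r"<[^>]+>", " ", html).split())

# source_html: what the server returns; rendered_html: what a headless
# browser (or the URL Inspection tool) sees after JavaScript runs
source_html = "<html><body><div id='app'></div></body></html>"
rendered_html = ("<html><body><div id='app'>"
                 "<h1>Widget Guide</h1><p>Full review text</p>"
                 "</div></body></html>")

js_only = visible_words(rendered_html) - visible_words(source_html)
print("Words that exist only after JS execution:", sorted(js_only))
```

A large `js_only` set means the page's indexable content depends entirely on Googlebot rendering your JavaScript.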
JavaScript execution
- Identify the total JavaScript payload and look for opportunities to reduce it
- Defer non-critical JavaScript that isn't needed for initial render
- Avoid client-side rendering for content that needs to be indexed — prefer server-side rendering or static generation for indexable content
Internal Linking Architecture
Crawlability
- Are all important pages linked to from somewhere crawlable? Orphan pages (no internal links) won't be discovered by Googlebot.
Authority distribution
- Are the pages with the most business value receiving a significant share of internal links?
- Does the internal linking structure reflect the content hierarchy (pillar pages receiving more internal links than supporting pieces)?
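Both the orphan check and the authority-distribution check fall out of a simple inlink count over the crawl's link graph. A sketch with toy data:

```python
from collections import Counter

# page -> pages it links to (toy crawl data)
links = {
    "/": ["/pillar", "/blog/a"],
    "/blog/a": ["/pillar", "/blog/b"],
    "/blog/b": ["/pillar"],
    "/pillar": ["/blog/a"],
    "/orphan-page": [],  # no page links to this one
}

inlinks = Counter(target for targets in links.values() for target in targets)
orphans = [page for page in links if inlinks[page] == 0 and page != "/"]

# The pillar page should sit at the top of this ranking
print(sorted(inlinks.items(), key=lambda kv: -kv[1]))
print("Orphan pages:", orphans)
```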
Anchor text
- Is descriptive, keyword-informed anchor text used in internal links (rather than "click here" or "read more")?
Site Security and HTTPS
- Full HTTPS implementation with no mixed content warnings
- HTTP pages should 301 redirect to HTTPS equivalents
- SSL certificate is valid and not expiring soon
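Mixed-content warnings come from sub-resources loaded over plain HTTP on an HTTPS page; a regex scan of page source catches the obvious cases. A sketch (illustrative HTML; a real audit should use a crawler or browser devtools, which also catch resources injected at runtime):

```python
import re

# Illustrative page source served over HTTPS
html = '''<img src="http://example.com/logo.png">
<script src="https://example.com/app.js"></script>'''

# Any sub-resource loaded over plain HTTP triggers a mixed-content warning
insecure = re.findall(r'(?:src|href)="(http://[^"]+)"', html)
print("Mixed-content resources:", insecure)
```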
Key takeaways
- Technical SEO is a ceiling, not a foundation — it removes constraints but doesn't create organic growth
- Crawl and indexation are the foundational layer — make sure Google can find and index what matters
- Core Web Vitals (LCP, INP, CLS) are ranking factors — fail these and you're competing at a disadvantage
- INP replaced FID in 2024 — if your audit tools still reference First Input Delay, update them
- JavaScript rendering is a critical check for modern frameworks — Google may not see what you see in the browser
- Internal linking architecture signals content hierarchy to Google — orphan pages and poor authority distribution are commonly missed issues
The Prioritization Framework
Not all technical issues are equal. Here's how I prioritize:
P1 (Fix immediately): Issues that are actively preventing crawling or indexing of important pages — misconfigured robots.txt blocking key pages, sitewide canonicalization errors, redirect loops, HTTPS implementation problems.
P2 (Fix within 30 days): Issues that are limiting performance without preventing indexing — Core Web Vitals failures (especially LCP), significant crawl depth problems, major URL architecture inconsistencies, missing title tags or H1s on important pages.
P3 (Fix in next sprint): Issues that are table stakes but not urgently impacting performance — image alt text across large catalogs, structured data implementation, redirect chain cleanup.
Monitor (Ongoing): Crawl coverage in Search Console, Core Web Vitals trends, index status.