Technical SEO audit: the complete method to deliver an actionable diagnosis
Method, six families of tests, 2026 Core Web Vitals, client deliverables: the operational guide for agencies and freelancers who want to turn an audit into a decision.
- A technical SEO audit checks what Google can crawl, index, and understand, and how fast users access the content.
- In 2026, INP and CLS weigh as much as LCP in how Google and visitors perceive a site.
- A good audit does not stop at the technical observation: it produces a deliverable the client understands and that leads to a decision.
A client calls on a Tuesday morning. Their organic traffic dropped 40% in six weeks, with no content changes. You run the usual tools and the alert list stretches over three pages. Half of those alerts have no real impact, and the actual cause usually hides in two or three points.
A technical SEO audit serves to isolate those points. Not to produce a 100-page PDF nobody reads, but to pinpoint what prevents Google from crawling, indexing, and ranking a site, then to translate it into a client-readable audit report. The method has changed since INP and the 2026 evolutions. Here is what to check, and how to turn the diagnosis into a decision.
What a technical SEO audit really covers
Technical SEO audit vs full SEO audit: clearing up the confusion
Many prospects ask for an "SEO audit" without knowing there are three distinct families. The editorial audit analyzes content, keywords, and search intent. The authority audit looks at link profile and mentions. The technical audit checks that the machine works: crawling, indexation, speed, structure, signals sent to Google.
Without a clean technical audit, the rest is useless. You can write the best article on the market, but if the page is blocked by robots.txt or wrongly marked noindex, it does not rank. It is the foundation, and it is also what goes unnoticed most often.
The question to ask the client is not "do you want an SEO audit" but "what are you trying to diagnose." The answer directly drives the scope and the final deliverable.
The six families of tests of a serious technical SEO audit
A technical SEO audit worthy of the name covers six dimensions. Crawling checks that Google's bots reach your pages: robots.txt, server accessibility, HTTP codes, redirect chains. Indexation checks what Google is allowed to index: meta robots tags, canonicals, sitemap, hreflang, duplicates.
Structure looks at semantic hierarchy: coherent Hn tags, click depth, internal linking, Schema.org structured data. Technical performance measures Core Web Vitals and side metrics: LCP, INP, CLS, TTFB, Speed Index. Security reviews HTTP headers, enforced HTTPS, cookie policy. Accessibility, finally, checks contrasts, alt attributes, keyboard navigation.
None of these families is optional. A weakness in one is enough to undermine the best editorial efforts. The point of a technical SEO audit is not to fix everything but to prioritize what truly costs traffic.
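To make the crawling family concrete, here is a minimal sketch of one of its checks. It assumes a crawler has already recorded the hops followed while resolving a URL; the hop format and the two-redirect tolerance are our own illustrative choices, not a standard.

```python
# Illustrative crawling check: flag redirect chains that waste crawl budget.
# Hop format (url, http_status) and the 2-redirect tolerance are assumptions.

def audit_redirects(hops):
    """hops: list of (url, http_status) pairs ending on the final response."""
    findings = []
    redirects = [h for h in hops if h[1] in (301, 302, 307, 308)]
    if len(redirects) > 2:
        findings.append(f"redirect chain of {len(redirects)} hops (keep it to 1)")
    urls = [u for u, _ in hops]
    if len(set(urls)) < len(urls):
        findings.append("redirect loop detected")
    if hops and hops[-1][1] >= 400:
        findings.append(f"chain ends on HTTP {hops[-1][1]}")
    return findings

# A typical post-migration mess: http -> https -> trailing slash -> dead page
chain = [("http://ex.com/old", 301), ("https://ex.com/old", 301),
         ("https://ex.com/old/", 301), ("https://ex.com/new", 404)]
for finding in audit_redirects(chain):
    print(finding)
```

Each finding maps directly to a line in the technical report: collapse the chain to a single 301 and fix the final target.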
Critical control points in 2026
Core Web Vitals 2.0: what changed for auditors
Since March 2024, INP has replaced FID. The metric measures the delay between any user interaction (click, tap, keystroke) and the next visual update. The "good" threshold is 200 ms or less; between 200 and 500 ms the page falls into the "needs improvement" zone, and beyond 500 ms it is rated "poor". INP has become the Achilles heel of JavaScript-heavy sites.
According to data published by DebugBear, around 43% of sites fail the INP threshold, making it the most commonly missed Core Web Vitals metric in 2026. The problem rarely comes from visible content. It comes from third-party scripts, poorly coded WordPress plugins, or a front-end architecture that monopolizes the browser main thread.
LCP remains the central speed perception criterion. Good threshold: under 2.5 seconds at the 75th percentile of visits. CLS must stay under 0.1 to avoid penalizing visual stability. Three metrics, three thresholds, three very different correction levers.
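The three thresholds above can be expressed as a small classification function. This is a sketch of our own (the function name and threshold table are not from any official library), applied to 75th-percentile field values:

```python
# Classify 75th-percentile field metrics against the Core Web Vitals
# "good" / "needs improvement" / "poor" bands described above.

THRESHOLDS = {              # metric: (good_max, needs_improvement_max)
    "lcp_ms": (2500, 4000), # Largest Contentful Paint, milliseconds
    "inp_ms": (200, 500),   # Interaction to Next Paint, milliseconds
    "cls":    (0.1, 0.25),  # Cumulative Layout Shift, unitless
}

def classify(metric: str, value: float) -> str:
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= needs_improvement:
        return "needs improvement"
    return "poor"

# Example: a JS-heavy page that loads fast but responds and shifts badly
print(classify("lcp_ms", 2100))  # good
print(classify("inp_ms", 340))   # needs improvement
print(classify("cls", 0.32))     # poor
```

The example illustrates the point made above: three metrics, three verdicts, three different correction levers on the same page.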
Indexability, canonical and sitemap: the silent traps
Most unexplained organic traffic drops come from indexation issues, not content. An SEO plugin update rewriting robots.txt, a migration forgetting to update the sitemap, a canonical pointing to a dead URL. These errors trigger no visible alert on the client side.
A serious technical audit systematically verifies the presence and validity of sitemap.xml, its declaration in robots.txt, the consistency of canonical tags on deep pages, the absence of meta robots noindex on strategic pages. hreflang, for multilingual sites, must be audited with particular rigor: a single ISO code error breaks the whole setup.
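Two of these checks (sitemap declaration and crawlability of strategic pages) can be automated with the standard library alone. A minimal sketch, assuming robots.txt has already been fetched as text; the example domain and URLs are placeholders:

```python
# Check that a sitemap is declared in robots.txt and that strategic URLs
# are not blocked for Googlebot. Requires Python 3.8+ for site_maps().
from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

sitemaps = rp.site_maps() or []  # list of Sitemap: declarations, or None
print("sitemap declared:", bool(sitemaps))

# Every strategic page should be crawlable by Googlebot
for url in ("https://example.com/", "https://example.com/pricing"):
    print(url, "crawlable:", rp.can_fetch("Googlebot", url))
print("admin blocked:", not rp.can_fetch("Googlebot", "https://example.com/wp-admin/"))
```

The same parser applied after a plugin update or a migration catches exactly the silent failure mode described above: a strategic page suddenly disallowed, or a sitemap declaration that vanished.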
On WordPress sites, which represent close to 42% of the global web according to W3Techs, specific risks add up: version exposed in source code, publicly accessible readme.html, unpatched vulnerable plugins, /wp-admin/ folder exposed without protection.
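One of those WordPress leaks, the version string exposed via the meta "generator" tag, is trivial to detect on fetched HTML. The regex below is our own illustration, not an exhaustive fingerprinting method:

```python
# Detect a WordPress version exposed in the page source via the
# <meta name="generator"> tag. Illustrative regex, not exhaustive.
import re

def exposed_wp_version(html: str):
    m = re.search(
        r'<meta\s+name="generator"\s+content="WordPress\s+([\d.]+)"',
        html, re.I)
    return m.group(1) if m else None

page = '<head><meta name="generator" content="WordPress 6.4.2"></head>'
print(exposed_wp_version(page))  # 6.4.2 -> hide the tag and patch the core
```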
HTML structure, structured data, internal linking
HTML document structure remains a strong signal. A single H1 tag per page, a logical Hn hierarchy with no jumps (no H4 before an H2), HTML5 semantic tags used correctly. Too many sites stack generic div tags when article, nav, and main would bring clean semantic reading.
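The two Hn checks above (exactly one H1, no level jumps) fit in a short standard-library sketch; the class and function names are ours:

```python
# Flag multiple H1s and hierarchy jumps (e.g. an H4 right after an H2).
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def heading_issues(html: str):
    parser = HeadingAudit()
    parser.feed(html)
    issues = []
    if parser.levels.count(1) != 1:
        issues.append(f"expected exactly one H1, found {parser.levels.count(1)}")
    for prev, cur in zip(parser.levels, parser.levels[1:]):
        if cur > prev + 1:  # descending a level is fine, skipping one is not
            issues.append(f"jump from H{prev} to H{cur}")
    return issues

print(heading_issues("<h1>T</h1><h2>A</h2><h4>B</h4>"))  # jump from H2 to H4
```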
JSON-LD structured data give no direct ranking boost, but they open access to rich results: review stars, FAQ, breadcrumb, product price. A technical audit checks presence, validity, and consistency of these markers with visible content. A FAQ Schema citing questions absent from the page exposes the site to a manual action from Google.
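The presence and validity part of that check can be sketched as follows: extract JSON-LD blocks and verify they parse. Consistency with visible content, the FAQ trap just mentioned, still needs a human eye:

```python
# Extract <script type="application/ld+json"> blocks and verify they parse.
# Illustrative regex-based extraction; a full audit would use an HTML parser.
import json
import re

def jsonld_blocks(html: str):
    pattern = r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>'
    blocks, errors = [], []
    for raw in re.findall(pattern, html, re.S | re.I):
        try:
            blocks.append(json.loads(raw))
        except json.JSONDecodeError as e:
            errors.append(str(e))
    return blocks, errors

html = '<script type="application/ld+json">{"@type": "FAQPage"}</script>'
blocks, errors = jsonld_blocks(html)
print([b.get("@type") for b in blocks], errors)  # ['FAQPage'] []
```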
Internal linking, finally, is the blind spot of superficial audits. Excessive click depth, orphan pages with no incoming link, anchors overloaded with exact keywords. All these signals surface in a clean audit and condition the site's ability to distribute its internal authority.
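Click depth and orphan pages both fall out of a simple breadth-first traversal of the internal link graph collected during the crawl. A sketch under that assumption, with a made-up site graph:

```python
# Compute click depth from the home page by BFS over an internal link
# graph, and list orphan pages (known pages no crawl path reaches).
from collections import deque

def click_depths(links: dict, home: str):
    """links: page -> list of internally linked pages."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    orphans = [p for p in links if p not in depth]
    return depth, orphans

graph = {
    "/": ["/services", "/blog"],
    "/services": ["/services/audit"],
    "/blog": [],
    "/old-landing": ["/"],  # links out, but nothing links to it: orphan
}
depth, orphans = click_depths(graph, "/")
print(depth["/services/audit"], orphans)  # 2 ['/old-landing']
```

Pages at depth 4 or more, and every orphan, are candidates for the prioritized fix list.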
How to run the audit in real conditions
The five-step process to miss nothing
In the field, an effective technical SEO audit follows five ordered steps. First: scoping. Define the perimeter (full site, sub-section, strategic pages), the business objectives (traffic loss, redesign, post-delivery validation), the access constraints (no admin login, with Search Console, with server access).
Second step: automated collection. This is the phase where an external tool saves hours. Page crawl, Core Web Vitals measurement, HTTP header analysis, tag check. Third step: targeted manual analysis. The tool flags anomalies, the auditor validates their real impact on the site context.
Fourth step: prioritization. Not all anomalies are equal. A missing meta description on a support page does not have the same weight as a broken canonical on the home page. Fifth step: deliverable. A technical report for developers, and a client report in business language. Two distinct documents, two levels of reading.
The tools to know and their blind spots
Google Search Console remains the reference tool for field data. The "Core Web Vitals" report displays metrics from the Chrome User Experience Report, so from real visits. PageSpeed Insights complements with lab data. Lighthouse, integrated in Chrome DevTools, serves page-by-page debugging.
Screaming Frog remains the reference tool for exhaustive site crawls, but its learning curve is steep and its deliverable is not showable to a non-technical client. GTmetrix measures speed but covers only the performance family of the audit. All these tools share a common flaw for an agency: they produce technical data, never client-ready deliverables you can sign a quote on.
At Orilyt, we built the tool to fill that gap. The audit runs from any URL, with no admin access, no install, and produces two distinct PDFs: a client report in business language and a technical report for the production team. Detected anomalies can be translated into directly usable quote lines.
The score trap: why a 90/100 alone means nothing
A PageSpeed score of 92 does not guarantee the site ranks well. Conversely, a site scoring 60 can very well dominate its competitors if its field Core Web Vitals are in the "good" zone for 75% of visits. The gap comes from the fact that lab scores simulate a mid-range device and a 4G connection, while Google ranks on the real data of your visitors.
A serious technical SEO audit never delivers an isolated score. It always crosses lab data (useful for diagnosis) and field data (useful for ranking). Without this double reading, the client risks investing in fixes that will have no impact on Google ranking.
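Both readings sit in the same PageSpeed Insights v5 response: lab LCP under lighthouseResult, field LCP under loadingExperience. A hedged sketch over a trimmed, fabricated response (the field names follow the public PSI API, but verify them against the response you actually receive):

```python
# Pull lab LCP (Lighthouse simulation) and field LCP (CrUX 75th percentile)
# from an already-fetched PageSpeed Insights v5 JSON response.

def lcp_lab_vs_field(psi: dict):
    lab_ms = psi["lighthouseResult"]["audits"]["largest-contentful-paint"]["numericValue"]
    field_ms = psi["loadingExperience"]["metrics"]["LARGEST_CONTENTFUL_PAINT_MS"]["percentile"]
    return lab_ms, field_ms

sample = {  # trimmed fake response, for illustration only
    "lighthouseResult": {"audits": {"largest-contentful-paint": {"numericValue": 3900.0}}},
    "loadingExperience": {"metrics": {"LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2100}}},
}
lab, field = lcp_lab_vs_field(sample)
print(f"lab {lab} ms vs field {field} ms")  # slow in the lab, "good" in the field
```

The sample reproduces the scenario described above: a mediocre lab score on a page whose real visitors, on faster devices and connections, stay in the "good" zone.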
In audits run with Orilyt, we regularly observe that the cause of poor ranking is not the expected one. The client suspects their content. The audit reveals an outdated sitemap, a misconfigured canonical, or a hreflang tag broken since the last migration. They are rarely spectacular problems, but they are the ones that unblock rankings.
From diagnosis to client decision
Translating the technical into a commercial argument
This is where most audits fail. The auditor delivers a technical PDF, the client files it away, and nothing happens. The freelancer loses the fix mission, the agency loses the maintenance contract. The issue is not the diagnosis quality, it is its commercial translation.
A client-useful audit reads in three minutes. It answers three simple questions: what is wrong, what is the concrete business impact, how much it costs to fix. No jargon. No unexplained acronym. No Lighthouse screenshot without context.
This translation is what separates a consultant who gets a quote signed from a technician who delivers an observation. When a canonical is broken, the client does not want to know what a canonical is. They want to know that dozens of pages no longer surface in Google and that the fix is two hours of work.
The report in two versions: technical and client
Separating technical and client deliverables has become a standard for serious agencies. The technical report lists raw anomalies, HTTP codes, file names, configuration lines. It is aimed at the internal production team or the developer who will fix.
The client report, on its side, speaks usage value. "Reduced loading time, visitors no longer wait." "Strengthened security, the browser refuses unsecured connections." "Fixed referencing tags, Google indexes your pages correctly." Each line is a measurable benefit, not a technical metric.
This dual deliverable logic is what allows you to turn an audit into a maintenance contract. The client understands what they buy, the agency justifies its price, the freelancer avoids long negotiations. It is also the only format that allows generating a quote directly from detected anomalies, in a few minutes.
What an audit must produce to stay useful over time
A one-off audit has a short shelf life. A site evolves, plugins update, content is added, third-party scripts creep in. Six weeks after an audit, 20 to 30% of the initial fixes may have been undone by a badly managed update.
That is why continuous monitoring complements any serious audit. Tracking Core Web Vitals over time, alert on regressions, SSL certificate expiration check, monthly verification of critical tags. Without monitoring, the client pays for an audit that will be outdated by the next quarter.
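The SSL expiration check, for instance, is a one-liner once the certificate has been retrieved. A sketch using the standard library's date helper, on a fabricated "notAfter" value:

```python
# Days before an SSL certificate expires, from the "notAfter" field of an
# already-retrieved certificate (the OpenSSL date format getpeercert() uses).
import ssl

def days_until_expiry(not_after: str, now: float) -> float:
    """now: reference instant in seconds since the Epoch."""
    return (ssl.cert_time_to_seconds(not_after) - now) / 86400

# Certificate expiring 30 days after a fixed reference instant
now = ssl.cert_time_to_seconds("Jan  1 00:00:00 2026 GMT")
print(round(days_until_expiry("Jan 31 00:00:00 2026 GMT", now)))  # 30
```

A monitoring job runs this weekly and raises an alert below a 30-day margin, long before the browser starts refusing connections.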
For agencies selling maintenance, monitoring becomes the central retention argument. The client sees every month what was monitored, what was fixed, and what remains to be addressed. The commercial conversation happens on facts, no longer on intuitions.
Why Orilyt simplifies producing this audit
A tool designed for agencies, not developers
Historic audit tools were designed by developers for developers. Lighthouse, Screaming Frog, and GTmetrix produce precise, exhaustive technical data. But they stop at the raw dashboard, with no help for the client conversation. Orilyt takes the opposite angle: first produce a deliverable the client understands, then derive the technical report for the production team.
The audit runs with no admin access, no plugin install, on any publicly accessible URL. That is an advantage in prospecting: auditing a prospect requires no prior authorization. It is another during a mission: no need to wait for FTP credentials or Search Console access to deliver a first diagnosis.
The result comes out in under two minutes, across five analysis categories covering performance, security, SEO, accessibility, and compliance. No technical acronym appears in the client document, everything is translated into business benefit.
From audit to quote: the complete chain
Every anomaly detected by Orilyt automatically generates a usable quote line. A missing security header becomes "Strengthened security, the browser refuses unsecured connections." A too-high TTFB becomes "Reduced loading time, visitors no longer wait." The freelancer adjusts prices, the quote is ready to send in minutes.
This logic has transformed the practice of several agencies we work with. The cycle prospect URL → client report → signed quote now fits in a single morning, where it used to take a week. On recurring contracts, the automatic monthly report sent to clients straight from monitoring strengthens the perceived value without overloading editorial production.
Multi-CMS, no plugin, white-label
The tool works on WordPress, Shopify, Wix, Webflow, PrestaShop, Magento, Drupal, Joomla, Squarespace, Ghost, TYPO3, and custom-coded sites. No plugin to install on the client side, no integration to maintain. Agencies managing a varied portfolio no longer have to juggle three tools depending on the CMS.
White-label mode covers client PDFs and report interfaces. Logo, colors, agency contact details: the client never sees the Orilyt name. Automatic monthly reports are sent from the agency's email address to the end client's address on the 1st of the month, with no human intervention. Multi-page tracking lets you monitor a site's critical URLs, not only the home page, which changes everything for e-commerce and editorial sites.
The technical SEO audit is a prioritization exercise, not an exhaustiveness one. Value does not lie in the number of checked points but in the ability to isolate critical blockers, explain them simply, and quantify them. A site can display 100 minor alerts with no ranking issue, or three major alerts that destroy its visibility. Knowing how to sort is the whole job.
For agencies and freelancers who deliver audits regularly, the stake is no longer to have the most exhaustive tool, but the one that produces fastest a deliverable the client signs. That is precisely what we built with Orilyt, keeping in mind that the technical only has value when it translates into a decision.
You can audit any URL in two minutes, with no credit card, no install, and test for yourself the readability of the produced report.
Your most frequent questions
What is the difference between a technical SEO audit and a full SEO audit?
A full SEO audit covers three axes: technical, editorial, and authority. The technical audit focuses on what Google can crawl, index, and understand, as well as content access speed. It does not handle text quality, keyword strategy, or external link profile. It is the foundation on which the other dimensions rest, but it can still be performed alone, for example during a redesign or migration.
Do you need admin access to run a technical SEO audit?
No, the majority of control points are performed from outside, with no login. HTTP headers, meta tags, Core Web Vitals, sitemap, robots.txt are publicly accessible. Search Console access enriches the diagnosis with field data and indexation reports. It is a major advantage when you want to audit a prospect without asking anything, which becomes a full-fledged commercial argument in prospecting.
How long does a serious technical SEO audit take?
Automated collection takes two to ten minutes depending on the tool used. Manual analysis, prioritization, and writing the client deliverable then take between one and three hours for a standard site. An audit that ships in five minutes complete is probably superficial. An audit that takes two days of production is rarely profitable for a one-off mission. The right range is around half a day for a quality deliverable.
How often should you redo a technical SEO audit?
A one-off audit stays valid three to six months on a stable site. On a site that publishes regularly, adds plugins, or evolves editorially, the check should be quarterly. Continuous monitoring complements the audit by tracking regressions over time. That combination secures agencies on their maintenance contracts and avoids bad surprises during poorly anticipated updates.
What are the 2026 Core Web Vitals thresholds to respect?
Three metrics, three thresholds in the "good" zone. LCP must be under 2.5 seconds, measuring the speed at which the largest visible element appears. INP must stay under 200 milliseconds, measuring responsiveness to all user interactions. CLS must be under 0.1, measuring visual stability during loading. These thresholds are evaluated at the 75th percentile of real visits, on Chrome User Experience Report data, not on lab simulations.
Sources and references
- Google web.dev, Core Web Vitals — official reference for performance metrics and 2026 thresholds
- Google PageSpeed Insights — official lab and field performance measurement tool
- DebugBear, Core Web Vitals 2026 — INP failure rate statistics and analysis
- W3Techs, Usage statistics of WordPress — CMS market share statistics
- MDN Web Docs, HTTP Headers — technical reference for security HTTP headers
- CNIL, Cookies and tracking devices — official GDPR recommendations on cookies and third-party scripts
- HTTP Archive, Web Almanac — annual data on the technical state of the web
- Google Search Central, Sitemaps overview — official documentation on sitemap.xml and indexation