Advances in technology have driven efficiency in SEO, and site crawlers such as Botify, DeepCrawl and Screaming Frog have flourished as a result.
These tools are an essential part of the SEO toolbox and are great at uncovering and visualizing technical issues such as broken links, 404 errors and invalid canonical tags. They are becoming the default source of technical performance analysis for SEOs, which means SEOs spend less time interacting with, and analyzing, websites in a browser and/or site analytics.
On the surface, this doesn’t look like anything to be concerned about: these tools deliver vast amounts of technical analysis at speed.
However, these tools are bots — they analyze the site’s source code looking for identifiable issues against an audit checklist which, while useful, won’t necessarily correspond to the issues consumers face.
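To make the point concrete, here is a minimal sketch of the kind of checklist audit a crawler runs against raw source code. The checks shown (a missing canonical tag, empty link targets) are illustrative examples only, not the checklist of any specific tool:

```python
from html.parser import HTMLParser


class AuditParser(HTMLParser):
    """Collects the items a crawler-style audit looks for in the
    page source: the canonical tag and outbound link targets."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "a" and attrs.get("href") is not None:
            self.links.append(attrs["href"])


def audit_page(html):
    """Run a fixed checklist over the source code, the way a bot does.
    Returns a list of issue strings; an empty list means all checks passed."""
    parser = AuditParser()
    parser.feed(html)
    issues = []
    if parser.canonical is None:
        issues.append("missing canonical tag")
    for href in parser.links:
        if href.strip() in ("", "#"):
            issues.append(f"empty link target: {href!r}")
    return issues
```

Note what the sketch cannot see: it never renders the page, never scrolls, never taps a button. A page can pass every source-level check and still frustrate the consumer who actually uses it.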
Search results are centered around the consumer
Studies from SEMrush and Searchmetrics both cite user signals and the consumer experience — including mobile-friendliness, content relevancy, site speed, bounce rate/search sequence, time on site and content format — as key ranking factors.
However, with site crawlers becoming the default for website analysis and reducing the time SEOs spend analyzing live websites in a browser, the consumer experience is being neglected, leaving untapped opportunities to improve performance.
Additionally, as of last November, consumers accessing the web via mobile devices overtook desktop for the first time. This further disconnects site crawlers from consumer behavior, and while site crawlers are catching up, they predominantly still default to desktop analysis.
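One simple way to close part of that gap is to fetch pages as a mobile consumer would. Below is a minimal sketch using Python's standard-library `urllib`; the user-agent strings are hypothetical examples, and a real audit should use the UA strings your analytics show consumers actually sending (and remember that a plain fetch does not render JavaScript the way a phone's browser does):

```python
from urllib.request import Request

# Hypothetical user-agent strings for illustration — substitute the
# ones your own analytics report for real consumers.
DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = (
    "Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X) "
    "AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148"
)


def build_request(url, mobile=False):
    """Return a request that identifies itself as a mobile or desktop
    browser, so the server can serve the variant a consumer would see."""
    ua = MOBILE_UA if mobile else DESKTOP_UA
    return Request(url, headers={"User-Agent": ua})
```

Fetching the same URL with both requests and diffing the responses is a quick way to spot cases where the mobile experience diverges from the desktop page your crawler analyzed.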
With this in mind, it is critical to analyze and diagnose websites the same way consumers interact with them, not just the way bots crawl them.