One central issue that presents itself time and time again is indexation.
However, since 2019 the evergreen Googlebot has been advanced enough to keep rendering-queue delays short, typically anywhere from about 5 seconds to a few minutes.
But JavaScript can still cause problems for a search engine crawler if the crawler is unable to interact with the page, making it difficult to index content properly.
- Web Developer is a browser extension that can be installed on Chrome. Here is how to use it:
- Step 1: Once installed, click on the extension’s icon.
- Step 2: Open the “Disable” tab and select “Disable JavaScript.”
- You’ll need to refresh the page to see how it renders without JavaScript.
- Alternative version of the site
- Server-side rendering, so the page is sent to the browser as complete HTML
- Progressive enhancement: build the website by prioritizing the most basic features first, then add more complex features on top, so that all visitors, regardless of device or technical capability, can access the content.
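As a simple illustration of progressive enhancement (the markup below is a hypothetical sketch, not taken from any particular site), the core content and navigation live in plain HTML, and JavaScript only layers extra behavior on top:

```html
<!-- Core content works with no JavaScript at all -->
<article>
  <h1>Post title</h1>
  <p>Full post text rendered in the initial HTML…</p>
  <a href="/blog/page/2/">Older posts</a>
</article>

<script>
  // Enhancement layer: if JS is available, upgrade the link to load
  // the next page in place instead of triggering a full navigation.
  const link = document.querySelector('a[href="/blog/page/2/"]');
  link.addEventListener('click', async (event) => {
    event.preventDefault();
    const html = await (await fetch(link.href)).text();
    document.querySelector('article').insertAdjacentHTML('beforeend', html);
  });
</script>
```

Visitors (and crawlers) without JavaScript still get the full content and a working link; everyone else gets the enhanced experience.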
2. Infinite Scrolling and Pagination Issues
Blogs that rely heavily on infinite scrolling may have difficulty getting content indexed if crawlers are unable to access the content through paginated links.
- Implement paginated loading alongside infinite scroll, creating crawlable links that lead search engine crawlers to further content so it can be indexed.
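One common pattern (a minimal sketch; the function name and URL structure are illustrative assumptions) is to back each scroll-loaded batch with a real paginated URL, so the same content is also reachable through plain `<a href>` links:

```javascript
// Hypothetical helper: maps the current batch to the next paginated URL,
// e.g. /blog/page/2/, so crawlers can reach every batch via normal links.
function nextPageUrl(currentPage, totalPages, basePath = "/blog") {
  if (currentPage >= totalPages) return null; // nothing left to load
  return `${basePath}/page/${currentPage + 1}/`;
}

// In the browser, the infinite-scroll handler would fetch that URL and
// update the address bar so each batch has its own shareable location:
//   const url = nextPageUrl(page, totalPages);
//   if (url) { await loadPosts(url); history.pushState({}, "", url); }
```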
3. Missing or Hidden Content
This also applies to content hidden behind accordion-style or expansion buttons: crawlers do not click, so any content that only loads after that interaction goes undetected.
For ecommerce websites this is especially problematic, because if crawlers cannot read product descriptions, they may treat your product pages as thin content.
There are a few different ways you can handle missing and hidden content on your site:
- Pre-rendering: Create a static HTML version of the page and serve this to the client and search engine crawlers. This will make all content visible to the crawler in HTML format, supporting indexation and keyword ranking efforts.
- For content hidden under accordion and expansion buttons, you could also include the full text in the initial HTML, placed under a visible element, so it is present in the DOM even while collapsed.
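For instance (a hypothetical sketch with placeholder copy), the native `<details>` element keeps collapsed content in the initial HTML, so it sits in the DOM for crawlers without any JavaScript:

```html
<!-- The description is in the HTML source even while collapsed,
     so a crawler can read it without clicking anything. -->
<details>
  <summary>Product description</summary>
  <p>Hand-stitched leather wallet with six card slots and RFID lining.</p>
</details>
```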
4. Internal Link Errors
Although JavaScript frameworks make web development much easier, they can create major headaches when internal links are injected without the proper elements, making those links uncrawlable for search engines.
For example, a link fired from an onclick event, rather than built as an internal link with an <a> tag and an href attribute, won’t be followed.
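A sketch of the two forms (the URL is a placeholder):

```html
<!-- Not crawlable: no <a> tag with an href attribute, just a script event -->
<span onclick="window.location.href = '/products/'">Products</span>

<!-- Crawlable: a plain anchor with a resolvable URL -->
<a href="/products/">Products</a>
```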
According to Google’s Search Central documentation,
“Google can follow links only if they are an <a> tag with an href attribute. Links that use other formats won’t be followed by Google’s crawlers. Google cannot follow <a> links without an href attribute or other tags that perform as links because of script events.”
- Create internal links using only resolvable URLs, with <a> tags carrying href attributes, instead of onclick handlers.
5. Heavy or Unused JS Files
With the introduction of Core Web Vitals, Google has placed more weight on site speed performance.
If your website is slow to load, it can negatively impact your search engine rankings as well as user experience.
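One common way to trim heavy or unused JavaScript (a hedged sketch; the module path, element id, and function name are illustrative assumptions) is to split non-critical code out of the main bundle and load it on demand with a dynamic `import()`:

```html
<script type="module">
  // The main bundle stays small: the chat widget's code is only
  // downloaded if the visitor actually opens it.
  document.querySelector("#open-chat").addEventListener("click", async () => {
    const { initChat } = await import("/js/chat-widget.js"); // hypothetical module
    initChat();
  });
</script>
```

Code that never runs for a given visitor is never downloaded, which trims the JavaScript payload that every page load would otherwise carry.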
6. Site Speed Issues: Render-Blocking Resources
When the browser encounters external scripts and stylesheets, this process can block it from quickly rendering the page, as it allocates bandwidth and time to downloading those files. Only once all files are downloaded can your browser render and display the full webpage.
Because this process takes time and bandwidth, it slows down site speed performance, which can dampen your site’s Core Web Vitals metrics and rankings.
- For all critical resources, inline the script within the HTML above the fold, so that only critical elements load first.
- This will encourage progressive loading for your page, which can reduce lag in site speed performance and improve user experience.
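As a sketch (file names are placeholders), critical styles can be inlined in the <head> while scripts are deferred, so neither blocks the first render:

```html
<head>
  <!-- Inlined critical CSS: above-the-fold styles render immediately -->
  <style>
    header { font-family: system-ui; }
  </style>

  <!-- defer: download in parallel, execute only after the HTML is parsed -->
  <script src="/js/app.js" defer></script>

  <!-- Non-critical stylesheet loaded without blocking the first paint -->
  <link rel="preload" href="/css/below-fold.css" as="style"
        onload="this.rel='stylesheet'">
</head>
```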