The (Real) Impact of A/B Testing and Personalization on SEO

As the impact of JavaScript-injected content on sites’ Search Engine Optimization (SEO) has yet to be comprehensively determined, we tested how Googlebot crawls and indexes JavaScript-based content. Below is a summary of the experiments we conducted to determine how dynamic content, such as A/B and Multivariate tests and Real-Time Personalization, affects sites’ SEO, including crawling and indexation.

Take a look at our JavaScript SEO experiment.

Can Google crawl and index JavaScript-based content? Yes! As shown in our test, Google indeed crawls and indexes Dynamic Yield’s JavaScript-based content.

Our validation method used unique strings to identify dynamic content in Google’s search results. Google was able to crawl and index all of our JavaScript-based content: content generated with JavaScript and Algorithmic Recommendations appears as plain text in Google’s search engine results pages (SERPs). The results held for every implementation method we tested.
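As a minimal sketch of that approach (illustrative, not our exact test code; the token and markup are hypothetical), a test page injects a unique, otherwise-unsearchable string with JavaScript:

```html
<script>
  // Inject a unique, otherwise-unsearchable token into the page via JavaScript.
  // If Google later returns this page for an exact-match query on the token,
  // the JavaScript-injected content was crawled and indexed.
  var marker = document.createElement('p');
  marker.textContent = 'dy-seo-test-7f3a91c2e5'; // hypothetical unique token
  document.body.appendChild(marker);
</script>
```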

Can A/B tests or on-site personalization wreck your Google ranking?

From a technical point of view, there are four main risks that need to be taken into consideration when optimizing or personalizing websites:

  1. Cloaking
  2. URL and content duplications
  3. Wrong type of redirect
  4. Web loading performance

Risk #1: Cloaking –

Cloaking means presenting one version of a page to search engine bots and a different version to human users for the purpose of manipulating organic rankings. It is a serious SEO risk and a direct violation of Google’s official Webmaster Guidelines. Targeting one set of content variations specifically at search engine User-Agents (such as “Googlebot”) and another at human visitors is considered cloaking, and a big no-no.
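To make the anti-pattern concrete, here is a hypothetical example of the kind of User-Agent sniffing that counts as cloaking; it is shown only as something to avoid (the element ID is made up):

```html
<script>
  // ANTI-PATTERN — do not do this. Serving different content to search
  // engine bots than to human visitors is cloaking and violates Google's
  // Webmaster Guidelines.
  var hero = document.getElementById('hero'); // hypothetical element
  if (/Googlebot/i.test(navigator.userAgent)) {
    hero.textContent = 'Keyword-stuffed copy shown only to bots';
  } else {
    hero.textContent = 'The copy real visitors actually see';
  }
</script>
```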

Risk #2: URL and Content Duplications –

URL Duplications – Sometimes, when running online experiments, the CMS duplicates some of the site’s URLs for different test variations. Generally speaking, from an SEO point of view, on-site duplications carry a relatively minor risk of an actual penalty, unless they were created as an attempt to manipulate organic rankings. Nevertheless, there is a fairly easy fix: implement a canonical tag element on each duplicated URL. With canonical tags in place, the duplication risk is removed, and all available link equity (“SEO juice”) is consolidated onto the original, canonical URL.
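For example, assuming a hypothetical variation URL /product?variant=b duplicated from /product, the variation’s markup would declare the original as canonical:

```html
<!-- In the <head> of the duplicated variation URL (URLs are hypothetical). -->
<!-- Declares /product as the original, authoritative version of the page. -->
<link rel="canonical" href="https://www.example.com/product" />
```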

Content Duplications – With on-site optimization and personalization, websites deliver different content variations tailored to different individuals. When the original content plays a big part in determining a web page’s organic rankings, it is vital to keep that default content on the page instead of completely replacing it with fully dynamically generated content (even though Google can crawl it). Keeping the original content in place also acts as a failover mechanism for unsupported test groups or individuals.
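Here is a minimal sketch of that failover pattern, with a hypothetical element ID and a stand-in for the real personalization logic:

```html
<!-- The default content ships in the HTML, so crawlers and unsupported
     visitors always receive it; JavaScript only swaps it for visitors the
     personalization logic actually recognizes. -->
<div id="headline">Default headline that carries the page's SEO value</div>
<script>
  // Hypothetical stand-in for real personalization logic: returns a
  // variation for recognized visitors, or null for everyone else.
  function getPersonalizedHeadline() {
    return null; // treat this sketch's visitor as unrecognized
  }
  var personalized = getPersonalizedHeadline();
  if (personalized) {
    document.getElementById('headline').textContent = personalized;
  } // on null, the original, crawlable content remains as the failover
</script>
```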

Risk #3: Wrong Type of Redirect –

When redirecting visitors to separate URL variations (as in split tests), Google’s official guidelines instruct you to use either a temporary HTTP response status code (302 Found) or a JavaScript-based redirect. As a general rule of thumb, never use a 301 Moved Permanently response for a test. The 302 tells search engines that the redirect is temporary and that the original, canonical URL should remain indexed.
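As a sketch (the variation URL and split ratio are hypothetical), a client-side split test can use a JavaScript-based redirect like this:

```html
<script>
  // Hypothetical 50/50 split test: send half of the visitors from the
  // original URL to a variation URL with a JavaScript-based redirect.
  // location.replace() also keeps the variation out of browser history.
  if (Math.random() < 0.5) {
    window.location.replace('https://www.example.com/landing-variation-b');
  }
</script>
```

Server-side, the equivalent is answering with a 302 status; in both cases, the variation URL should also carry a canonical tag pointing back at the original (see Risk #2).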

Risk #4: Web Loading Performance –

We all know site speed is a major user experience factor, and often a deal-breaker. It has also been part of Google’s organic web search ranking algorithms since 2010, as one of 200+ ranking signals. Some users are concerned that A/B testing or personalization tools may slow down their webpages, and the truth of the matter is that sometimes they might. But it depends on the use case and the implementation method, and as always, there are ways to minimize the effect.

Here at Dynamic Yield, our tracking script loads asynchronously, so it should not block page rendering or delay load time. Additionally, we run on Amazon’s cloud servers and deliver content directly through its CDN; we can also deliver content through your company’s own CDN if needed. Lastly, it is important to note that our uptime is close to 100%: we power more than 10B page views a month for some of the world’s most demanding customers, who wouldn’t stand for latency on their sites, and most likely, neither would you.
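As a generic illustration (not our exact embed code; the URL is hypothetical), an asynchronously loaded tag looks like this:

```html
<!-- The async attribute lets the browser keep parsing and rendering the
     page while the script downloads, so the tag does not block load. -->
<script async src="https://cdn.example.com/personalization-script.js"></script>
```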

To Summarize

  • Google acknowledges the use of A/B testing and personalization.
  • Google can crawl and index JavaScript-generated content.
  • You should be testing all the time and improving your overall user experience.