The (Real) Impact of A/B Testing and Personalization on SEO

Read about Google's official SEO advice on A/B or multivariate testing, and how to minimize the possible impact of A/B testing on SEO.

Vice President of Marketing, Dynamic Yield

Optimizing and personalizing the customer experience on a regular basis to improve KPIs serves as an inherently valuable growth tool and is a key part of the digital marketing ecosystem for any online business. However, a common question that arises is whether these efforts can negatively impact Search Engine Optimization (SEO) rankings. 

With a few misconceptions floating around on the topic, we’re going to dive into the various concerns, providing solid references to Google’s official stance, as well as highlight practical guidelines to help you become more relaxed with, and more in control of, your optimization routines. We’ll also share the results of experiments we conducted to determine how Dynamic Yield’s Dynamic Content affects SEO, including Google crawling and indexation. 

Before we jump in, a couple of notes: 

  1. This article refers only to JavaScript-based dynamic content rendered through client-side implementation. (Dynamic Yield offers both client- and server-side solutions, which you can read about here).
  2. If you want to learn about Google’s Intrusive Interstitials Penalty, read this article: The impact of popups, overlays, and interstitials on SEO.

Now let’s get started.

Can A/B tests or on-site personalization wreck your Google ranking?

Google encourages website owners and marketers to conduct constructive testing and ongoing optimizations. If it didn’t, it wouldn’t offer a platform of its own for this exact purpose. The general idea is that personalization doesn’t work against you – quite the opposite. By improving digital experiences, businesses can improve engagement metrics, increase loyalty, and eventually build a stronger brand. As John Mueller, Google’s Webmaster Trends Analyst, said in a Reddit AMA:

“Personalization is fine, sometimes it can make a lot of sense.”

But while Google acknowledges that these efforts improve the overall customer experience and transform websites for the better, from a technical point of view there are four main risks to take into consideration when optimizing or personalizing web pages.

  1. Cloaking
  2. Redirection
  3. Duplication
  4. Performance 

Risk #1: Cloaking

Cloaking involves presenting one version of a page to search engine bots while showcasing a different version to the users interacting with it. This is done for the purpose of manipulating and improving organic rankings, and it is a huge SEO risk – a harsh foul in the eyes of Google’s official Webmaster Guidelines. Running optimization initiatives that serve one variation to search engine user agents (such as “Googlebot”) and another to human visitors is what’s considered cloaking – a big no-no.

As a strict principle, we always treat Googlebot and other search bots the same way we treat real human visitors. With Dynamic Yield, your optimization and personalization initiatives will never be considered cloaking manipulations.

“Ideally you would treat Googlebot the same as any other user-group that you deal with in your testing. You shouldn’t special-case Googlebot on its own, that would be considered cloaking.”

– John Mueller, Google
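
For illustration only, here is a minimal sketch of the anti-pattern (the “#hero” selector and the content are made up): branching on the crawler’s user agent to serve bots something different from what human visitors see is exactly the behavior Google treats as cloaking, so logic like this should never appear in an implementation.

    <script>
      // DON'T do this: special-casing search engine bots is cloaking.
      // "#hero" is a hypothetical selector used purely for illustration.
      var isGooglebot = /Googlebot/i.test(navigator.userAgent);
      var hero = document.querySelector('#hero');
      if (isGooglebot && hero) {
        // Serving crawlers a different variation than human visitors see
        // is what Google's Webmaster Guidelines penalize.
        hero.innerHTML = 'Bot-only, keyword-stuffed content';
      }
    </script>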

Risk #2: Wrong Type of Redirect

When redirecting visitors to separate URL variations (such as with split tests), Google’s official guidelines instruct to always use a temporary HTTP response status code (302 Found) or a JavaScript-based redirect. As a general rule of thumb, never use a 301 Moved Permanently HTTP response code. The 302 redirect helps search engines understand the redirect is temporary, and that the original, canonical URL should remain indexed.

At Dynamic Yield, we only offer temporary (302) or JavaScript-based redirects, depending on what you’re trying to achieve.
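
As a rough illustration (the URLs are placeholders), a JavaScript-based redirect simply sends the browser from the original page to the test variation, while the original URL remains the one search engines keep indexed:

    <script>
      // JavaScript-based redirect to a split-test variation (placeholder URL).
      // location.replace() keeps the original page out of the browser history;
      // the original, canonical URL (/pricing) remains the one search engines index.
      window.location.replace('https://www.example.com/pricing-variant-b');
    </script>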

Risk #3: URL and Content Duplications

URL Duplications – Sometimes, when running online experiments, the system duplicates some of the site’s URLs for different test variations. From an SEO point of view, these on-site duplications pose a relatively minor risk of a real penalty, unless they were created as an attempt to manipulate organic rankings through cloaking. Nevertheless, to remove the possible risk, we recommend a fairly easy fix – adding a canonical link element to each duplicated URL. It is also possible to block access to those duplications using a simple robots.txt Disallow rule, or a robots meta tag with a noindex value.
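
For example, a duplicated variation URL (say, /landing-page-b as a copy of /landing-page – both hypothetical) can point search engines back to the original with a canonical link element, or be kept out of the index with a robots meta tag:

    <!-- On the duplicated variation page, e.g. /landing-page-b (hypothetical URL) -->
    <head>
      <!-- Option 1: consolidate ranking signals to the original URL -->
      <link rel="canonical" href="https://www.example.com/landing-page" />

      <!-- Option 2 (an alternative, not to be combined with Option 1):
           keep the duplicate out of the index altogether -->
      <meta name="robots" content="noindex" />
    </head>

Alternatively, a robots.txt rule such as “Disallow: /landing-page-b” blocks crawling of the duplicate entirely.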

Content Duplications – With on-site optimization and personalization, websites deliver different content variations tailored to different individuals. When the original content plays a big part in determining a web page’s organic rankings, it is vital to keep this content on-site, instead of replacing it with fully dynamically generated content. In addition, keeping the original content can also act as a failover mechanism for unsupported test groups or individuals.

As Google has gotten smarter, its crawling and indexing algorithms can render and understand (to some extent) our JavaScript-based Dynamic Content tags – even indexing the content.

Risk #4: Web Loading Performance

Site speed is a major deal-breaker when it comes to user experience. It has also been incorporated into Google’s organic web search ranking algorithms since 2010, alongside 200+ other ranking signals. Some marketers are concerned that optimization and personalization tools may negatively impact the loading time of their webpages, and the truth is that they sometimes can. But there are ways of minimizing the effect.

To start, Dynamic Yield’s script is loaded asynchronously, so it does not block page rendering. Additionally, we work with Amazon’s cloud infrastructure and deliver content directly through its CDN servers. We can also send content to your company’s own CDN if needed. Lastly, it is important to note that Dynamic Yield has an uptime of almost 100%. Our platform powers billions of page views a month for some of the world’s most demanding customers, who wouldn’t stand for latency on their sites.
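
For reference, asynchronous loading means the browser fetches the script in parallel and does not pause HTML parsing while it downloads (the URL below is a placeholder, not an actual embed snippet):

    <!-- The async attribute lets the HTML parser continue while the script downloads;
         the src is a placeholder used for illustration -->
    <script async src="https://cdn.example.com/personalization.js"></script>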

JavaScript-rendered experiences and rankings

Google uses a headless-browser-style rendering service to execute JavaScript and effectively imitate real users’ browsing experiences. It does this to understand what users actually see (compared to the old days, when it only used “simple” bots to fetch text, media, and links from web pages). That said, it’s not clear to what extent Google can (or does) emulate different sessions and user characteristics, and thus identify each personalized experience or testing variation. For example, will Googlebot crawl the same page from different countries to try and identify location-based experiences? Probably not, although in theory it could when needed. While Googlebot crawls mostly from the US, Google claims to also do some local crawling – the only way to know for sure would be to analyze your server logs. 

In the past, Google released an official statement about its ability to render JavaScript, saying that “sometimes things don’t go perfectly during rendering, which may negatively impact search results for your site.” That’s why, when generating dynamic content variations, it is strongly advised to keep the original static content in the source code.

Therefore, I wouldn’t use Dynamic Yield, or any other personalization platform, to dynamically inject content that is important for SEO. Instead, I’d make sure that content that is important for SEO appears in the static source code of the pages. By doing that, you ensure that non-personalized user experiences, and users without JavaScript-enabled browsers, get the default fallback content. You also ensure that most bots, Googlebot included, are exposed to the content that is important for your SEO strategy. It’s safe to assume that, generally speaking, Google’s ranking algorithms rely on the default, static version of a page – rather than on the potentially unlimited number of personalized variations that a given page can have.

As the impact of JavaScript-injected content on sites’ SEO has yet to be comprehensively determined, we tested how Googlebot crawls and indexes JavaScript-based content. 

Below is a summary of the experiments conducted and how our Dynamic Content capabilities, which personalize experiences for visitors by dynamically replacing the content, design, and layout, affect a site’s SEO, including crawling and indexation.

Tests – six different client-side implementations

Method – Dynamic Content (Inline), Dynamic Content (Placement), Dynamic Content (Inline Placement), Dynamic Content (Custom Action JS), Recommendations (Placement), Recommendations (Inline Placement)

Validation – unique strings used to identify dynamic content in (Google) search results
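
In practice, this kind of validation boils down to injecting a token that exists nowhere else on the web and later checking whether Google has indexed it. A simplified sketch of the idea (the selector, token, and domain are made up for illustration):

    <script>
      // Inject a unique string via client-side JavaScript
      // ("#test-container" and the token are hypothetical).
      var token = 'dy-seo-crawl-test-7f3a91';
      var container = document.querySelector('#test-container');
      if (container) {
        container.textContent = 'Validation string: ' + token;
      }
      // Once Google has recrawled the page, a query such as
      //   site:example.com "dy-seo-crawl-test-7f3a91"
      // shows whether the JavaScript-injected content was indexed.
    </script>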

So, can Google crawl and index JavaScript-based content? 

Conclusion – Yes, Google is able to crawl and index all JavaScript-based content

Content generated using JavaScript (Dynamic Content), custom actions, and algorithm-driven recommendations was shown as plain text in Google’s search engine results page (SERP). The results were valid for all implementation methods tested.

Take a deeper look into the JavaScript SEO experiment by visiting dynamicyield.com/seo.

Example of a client-side SEO-friendly optimization

Wrap the static content inside a container with a specific CSS selector, and use Dynamic Content to replace the static content located within that selector with new, dynamically generated content variations.

Dynamic Yield’s Dynamic Content tags will then be used to personalize experiences for visitors by dynamically replacing the content, design, and layout.
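
A simplified sketch of this pattern follows (plain DOM code for illustration, not Dynamic Yield’s actual tag syntax; the selector, targeting check, and copy are made up):

    <!-- Static, crawlable fallback content lives in the page source -->
    <div id="hero-banner">
      <h1>Winter Collection – Free Shipping on All Orders</h1>
    </div>

    <script>
      // Hypothetical targeting check – in a real setup this decision would come
      // from the personalization platform's audience evaluation.
      function isReturningCustomer() {
        return document.cookie.indexOf('returning_customer=1') !== -1;
      }

      // Replace the static content only for targeted visitors; crawlers and
      // everyone outside the segment keep the static fallback above.
      var banner = document.querySelector('#hero-banner');
      if (banner && isReturningCustomer()) {
        banner.innerHTML =
          '<h1>Welcome back – your 15% loyalty discount is waiting</h1>';
      }
    </script>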


Organic rankings are not affected since Googlebot is served with the static version of the page. 

To Summarize

Google acknowledges the positive impact of A/B testing and personalization on the customer experience, and it crawls and indexes JavaScript-generated content. Marketers concerned about the impact on SEO simply need to follow best practices during testing and experimentation in order to avoid any unnecessary penalties. Hopefully, this article has provided the resources for teams to do so confidently. 
