Personalized and Penalized? Googlebot and Dynamic Content

By Chandal Nolasco da Silva

The idea of personalizing brand messages and experiences for the consumer is not new to marketing professionals. Traditional marketing first took advantage of digital opportunities by turning to individuals' personal inboxes via email (Hello, [name]!). Soon after, email campaigns evolved to target user segments based on their demographics, interests and behavior.

The last time I wrote for SEMrush, I talked in depth about how user behavior analysis can enrich digital campaigns. I explained how online user behavior results from intent, and how understanding this can help SEMs when performing keyword research. I also touched briefly on how effective marketing is shifting toward user-centric personalized experiences.

Creating user-centric experiences is a growing trend in online marketing, and the landscape of the World Wide Web now reflects this need for personalized experiences through the use of dynamic content. Marketo, HubSpot and KBK Communications have all written about the influx and value of dynamic content and personalized experiences. Dynamic website experiences get touted for their ability to convert more visitors on your website. Social Media Today recently reported that dynamic content will grow in popularity in 2014: 78% of CMOs think custom content is the way of the future, and marketing professionals are starting to use dynamic content to deliver personalized messages in real time.

The real-time value for conversion is certainly of interest to marketers who are looking to personalize experiences for web users. But this article aims to discuss an area that has been largely overlooked in all of this hype: How will search engines be able to index pages with personalized dynamic content?

The SEO portion of this discussion is critical for any brand marketer to consider before dedicating a site to dynamic experiences. If webpages aren't listed in Google's index, they will not be found through search, and it won't matter how cool and customized the site is. Let's look at what dynamic content is before we decide what to do about search engines.

What is dynamic content? What does it look like?

PC Magazine defines dynamic content as:

A Web page that provides custom content for the user based on the results of a search or some other request. Also known as "dynamic HTML" or "dynamic content," the "dynamic" term is used when referring to interactive Web pages created for each user in contrast to the billions of static Web pages that do not change.

The definition of this term now has to include content that changes dynamically to cater to the personal preferences and purchasing patterns of website users. For example, a website can have a banner that changes slides; this content is dynamic, but it is not necessarily personalized. If a website recognized you as a user and changed the content on the page to suit your interests or purchasing patterns, among other factors, that would be dynamic content that is also personal. Dynamic content is only sometimes personalized; personalized content is always dynamic.
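To make the distinction concrete, here is a minimal sketch of a page containing both kinds of content: a rotating banner every visitor sees the same way (dynamic), and a greeting that changes based on a cookie identifying the visitor (dynamic and personalized). The element IDs and the "customer_name" cookie are hypothetical.

```html
<!-- Hypothetical sketch: dynamic vs. personalized content on one page. -->
<div id="banner">Spring sale!</div>   <!-- dynamic, same for every visitor -->
<div id="greeting">Welcome!</div>     <!-- personalized by the script below -->

<script>
  // Dynamic but NOT personalized: every visitor sees the same rotation.
  var slides = ["Spring sale!", "Free shipping!", "New arrivals!"];
  var i = 0;
  setInterval(function () {
    i = (i + 1) % slides.length;
    document.getElementById("banner").textContent = slides[i];
  }, 3000);

  // Personalized AND dynamic: content depends on who the visitor is.
  // Assumes a hypothetical "customer_name" cookie set on a previous visit.
  var match = document.cookie.match(/(?:^|;\s*)customer_name=([^;]+)/);
  if (match) {
    document.getElementById("greeting").textContent =
      "Welcome back, " + decodeURIComponent(match[1]) + "!";
  }
</script>
```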

As it stands now, examples of personalized dynamic content are few and far between, but Google seems to have found a way to deal with the most common ones. Consider the product recommendations Amazon makes to you or the shows Netflix offers you. Both base their customization on your unique viewing history, ratings and past purchases, delivered through category or product suggestion widgets that are built into most e-commerce platforms nowadays.

[Image: Netflix's personalized recommendations. Image Credit: Netflix.com]

The difference, of course, is that this personalized content sits behind a login and is therefore beyond the reach of Google. This may not always be the case, however, as more and more websites begin to build personalized dynamic content into their Google-facing, open-to-the-world webpages.

What does personalization mean, then, for SEO? Let's review the basic concept of web crawling before talking about how disruptive dynamic experiences can be to crawlers.

How does Google map the web for your search results?

Website owners create websites with pages. Google sends out its "bot" (fondly referred to as "Googlebot") to crawl the World Wide Web for these websites and index the information contained in their pages. Google's index now includes over 60 trillion pages. Each page is indexed as a snapshot of what Googlebot saw when it visited, and is assigned ranking power based on over 200 factors. This ranking power can improve or decline depending on things like how closely your page snapshot matches a search query, or how quickly your site performs.

The index sits around waiting until an Internet user types a search query into Google.com (or .ca, .eu, .com.au, etc.). Then a seriously complex set of formulas, the ranking algorithms, draws on the massive Google index and attempts to provide you with the most relevant webpages on the Internet for your search query.

Google's algorithms are based on the way the user searches (voice command search, image-based search, video-based search, etc.), the device used for searching, the PageRank previously assigned to websites, page loading speeds, backlinks, social signals and, of course, the trusted code on every website that tells Google the textual information contained therein. In fact, there are a number of complex considerations involved in decoding Google's almighty algorithm. But for the purpose of discussing dynamic content, let's look more closely at how Google crawls the code on a website.

On-site crawling under the microscope

Google reads information from your website and indexes it. Most often, the information Google is reading from your site comes from its HTML code. According to Simon Abramovitch, co-founder of Search Engine Revs and contributor here at AOD, the original Googlebot was designed to read basic HTML text and links. Google simply crawls the page and indexes the content; this is still the ideal scenario for search engines.

“As soon as there is dynamic content, be it Flash, a product recommendation widget or JavaScript of some sort, your site runs a greater risk of problems from a search engine point of view,” Abramovitch says. He explains that Google has a limited ability to read dynamic content, depending on how that content is presented.

But in the past, with what could now be considered basic forms of dynamic content, such as widgets and Flash, there have been workarounds. Even if Google can't read a few widget boxes on a page, as long as it can still read the core text in clean HTML, along with good HTML interlinking and a proper title tag and meta description, it really isn't that much of a problem.

Google was not able to crawl Flash content in the past, so in this sense dynamic content isn't a new problem. After it became clear that Flash posed a barrier to search engines, workarounds were developed: real text was placed behind the Flash programming so search engines had something to read while the user still got a dynamic experience. Websites with Flash eventually became readable to Google, provided the creators of the site took Googlebot's limited abilities into account.
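Here is a minimal sketch of that classic workaround (the file name, element ID and copy are hypothetical): the real text lives in ordinary HTML that any crawler can read, and a script swaps in the Flash movie only for browsers that can show it.

```html
<!-- Hypothetical Flash-era fallback: crawlers read the HTML text;
     the script replaces it with the Flash movie for capable browsers. -->
<div id="intro">
  <h1>Acme Widgets</h1>
  <p>Hand-built widgets shipped worldwide since 1999.</p>
  <a href="/catalog.html">Browse the catalog</a>
</div>

<script>
  // Naive capability check; real sites used libraries such as SWFObject.
  var hasFlash = false;
  try {
    hasFlash = !!(navigator.plugins && navigator.plugins["Shockwave Flash"]);
  } catch (e) {}

  if (hasFlash) {
    document.getElementById("intro").innerHTML =
      '<object type="application/x-shockwave-flash" data="intro.swf"' +
      ' width="600" height="300"></object>';
  }
</script>
```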

This is the key to a search engine's heart: readability. Was readability always handled intelligently and responsibly? No. As a result, Google has always treated text it encounters that isn't identical to what is seen in a browser as suspect.

Dynamically diabolical to your site?

If your site shows users different content than what Google has read and indexed, this is cloaking, straight-up black-hat SEO, and you and your site are headed for a search engine penalty. In the past, this would normally only happen when site owners were trying to manipulate and trick search engines, and Google would find egregious cases pretty quickly and simply block the site from its search results.

In the case of more complex forms of dynamic content, like an entire site built on AJAX, the situation becomes more complicated. In the end there is no static content on the page, no HTML text or links for Googlebot to crawl, plus ongoing mixed messages about what content is on each page, since it's likely to change for each user who sees it. In other words, if the content is so dynamic that nothing is permanently on the page and there is no HTML, then you have a problem, and this goes well beyond the scope of previous web discussions about AJAX.
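This is roughly what such a page looks like before any script runs, which is all a non-rendering crawler ever sees. The page structure and file names here are a hypothetical sketch:

```html
<!-- Hypothetical AJAX-only page: to a crawler that doesn't execute
     JavaScript, the body is effectively empty. No text, no links. -->
<!DOCTYPE html>
<html>
  <head><title>Acme Store</title></head>
  <body>
    <div id="app"></div>  <!-- everything is injected here by script -->
    <script src="/app.js"></script>
    <!-- app.js requests personalized JSON (XMLHttpRequest) and
         renders the entire page from it, differently per user. -->
  </body>
</html>
```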

There is no clean map of content and navigation in this case. If online experiences become so personalized that each one is different, it contradicts Google's longstanding policy that it shouldn't see something different than the user does. This could lead to problems, if not penalties.

Working toward a workaround

What happens in this case of completely personalized and dynamic content? It's hard to say. These completely dynamic websites haven't really made their grand entrance onto the open Internet yet (but any day now…). Initially, there could be warnings and penalties in Google's search results because, in theory, personalized content is a textbook no-no for search engines.

The lines are fuzzy, though. Yes, giving users something different from what you give Google could be perceived as manipulative and get penalized. But in the case of personalized content, manipulation is not what website owners are trying to do. They are trying to give users a better experience while still giving Google what it wants, and penalties in this regard just seem unfair.

The solution, then, is a website that Google can crawl, read, map and index: one that blends static, universal content (which appears the same way to every user and to Googlebot) with personalized dynamic content that enhances the user experience. The key is that the static content has to be representative enough of the page's offering that someone who finds the page through search still has a positive experience, even if no personalized content appears for them. The intent of a searcher who lands on your page has to match the universal content available on it, and the personalized content cannot overshadow the universal content so much that it breaks the relationship between searcher intent and the experience on the page.
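Here is a minimal sketch of that blend (the product copy, element IDs, cookie name and /recommendations endpoint are all hypothetical): the universal content sits in plain HTML that any crawler can read, while a script layers personalized recommendations on top for recognized visitors only.

```html
<!-- Hypothetical blended page: universal HTML first, personalization on top. -->
<article>
  <!-- Universal, crawlable content: identical for Googlebot and every user. -->
  <h1>Trail Running Shoes</h1>
  <p>Lightweight trail shoes with aggressive grip for wet terrain.</p>
  <a href="/category/trail-running">More trail running gear</a>
</article>

<div id="recs"></div>  <!-- personalized layer, empty for crawlers -->

<script>
  // Enhance, don't replace: recognized visitors get recommendations
  // fetched from a hypothetical endpoint keyed by a session cookie.
  if (/(?:^|;\s*)session_id=/.test(document.cookie)) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/recommendations");
    xhr.onload = function () {
      var items = JSON.parse(xhr.responseText);
      document.getElementById("recs").innerHTML =
        "<h2>Picked for you</h2>" +
        items.map(function (it) {
          return '<a href="' + it.url + '">' + it.name + "</a>";
        }).join(" ");
    };
    xhr.send();
  }
</script>
```

The design point is that removing the script changes nothing a searcher was promised: the page still says what it is, links where it should, and matches the query that brought the visitor there.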

Google very recently emphasized that the latest iterations of Googlebot are smarter at rendering JavaScript and CSS, and that an option will soon be available within Google Webmaster Tools that lets you see how your pages truly look to Googlebot. This will be useful for webmasters who are creating personalized dynamic pages. They'll be able to see, with concrete examples, just how their pages look to a modern search engine, and can reconcile how large a gap they're leaving between what the search engine sees and what a human sees when served personalized content.

Takeaways

Yes, the benefits of personalized content are many. Social Media Explorer (SME) posted almost a year ago about how dynamic content is the future of online experiences. Search Engine Watch recently cited benefits of personalized experiences and dynamic content, writing that "visitors remain on your website longer, download more offers, and ultimately purchase more products." That article echoes many of the benefits identified by SME, framing personalized experiences as a means to higher conversions. But if search rankings are compromised in the process, forget it!

If you're getting ready to ramp up your site with a heavy-duty personalization engine, good for you for being at the forefront of innovation. But first ask yourself: If the pages you're personalizing are open to Googlebot, are you still serving Google something useful?

Remember, Googlebot, even with all of its recent enhancements, doesn't accept cookies; you can't create "personalized dynamic content" for Google itself. You need a graceful fallback: some "universal" content on each of your pages that is accessible to Google. Does the "universal" element of your content do enough to serve both Google and a new user who arrives through search? If yes, great. If no, you may need to consider not allowing Google to index that particular section of your site, or you could be in for a rude traffic awakening.
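If you do decide to keep a heavily personalized section out of the index, the standard mechanism is a robots meta tag on those pages, or a robots.txt rule that keeps crawlers away from the section entirely. The /my-feed path below is hypothetical:

```html
<!-- On each personalized-only page (e.g. a hypothetical /my-feed section):
     tell search engines not to index the page while still serving users. -->
<head>
  <meta name="robots" content="noindex, follow">
  <title>Your Personalized Feed</title>
</head>

<!-- Alternatively, keep crawlers out of the whole section via robots.txt:
     User-agent: *
     Disallow: /my-feed/
-->
```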

Google does not like to look stupid, nor does it want the search experience ruined. If the page a user lands on is different from what Google previously crawled and indexed, or shows a real-world user content too different from the search terms that brought them there, it could lead to a site penalty. Or, the site might simply not be indexed.

As marketing moves toward one-to-one experiences, this will become an increasingly relevant problem. In the meantime, make sure you're planning properly for personalization.

Author bio:

Chandal Nolasco da Silva works at AOD Marketing, with a focus on client strategy. She has consulted and collaborated on a variety of digital marketing strategies, developed websites and written content for a number of brands. Chandal has worked for Canada’s federal government, completed various teaching contracts, is a published author and writes a web column in her spare time. Her last article for SEMrush was "Searcher Intent: Enrich Digital Campaigns through User Behavior Analysis."

Comments

Cleyton Messias:
Hi,

And what about comment systems like Disqus? Besides importing the content the way the Disqus plugin for WordPress does, how can I handle SEO for my customer if I am a widget?

Chandal Nolasco da Silva:
Hi Cleyton,

Disqus has a plugin for WordPress that automatically enables comment rendering in HTML, making the comments fully crawlable. If you're not using WordPress, you can still use their API to store comments locally in your database and then publish them yourself in your HTML, accomplishing the same thing.

In general, if you want the content from a widget to be crawlable (by current-generation crawlers, not just next-gen Google JS-rendering bots), then you should have a noscript fallback that publishes the content in HTML when JS is disabled.
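A minimal sketch of the noscript fallback described in that reply (the widget URL, element ID and comment text are hypothetical): script-capable browsers load the widget, while crawlers and no-JS visitors get the same comments as plain HTML.

```html
<!-- Hypothetical comment widget with a crawlable fallback. -->
<div id="comments-widget"></div>
<script src="https://widgets.example.com/comments.js"></script>

<noscript>
  <!-- Same content, published as plain HTML for crawlers and no-JS users. -->
  <ol>
    <li><strong>Alice:</strong> Great post, very helpful.</li>
    <li><strong>Bob:</strong> Thanks for the tips on dynamic content.</li>
  </ol>
</noscript>
```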