I was watching “The Lord of the Rings” on TV the other day and thought to myself, “We (the search community) could use that, one ring that would unite us all.”
We could use a single approach that binds together a search community victimized by the fear, uncertainty and doubt perpetrated by the search engines and a UX community that fears technology more than bad color combinations. I got very little traction for my conspiracy theory: that Google planned to put a stake in the heart of SEO by anointing the user experience (UX) community, in absentia, as arbiter of placement on search results pages.
Then the Hummingbird update hit the Web. All of a sudden my theory did not seem so out of the realm of possibility.
What do we, the search community, know? Google targets anyone who tries to influence position in the SERPs, along with any method of manipulating search rankings that did not come out of its own labs. After the JCPenney fiasco, Google introduced the first of many Panda updates focused on UX and content quality. Then came the Penguin and Not Provided curveballs, before Google delivered the coup de grâce to traditional SEO with Hummingbird, a complete algorithm refocus from links to what the searcher experiences on-site.
We know from what little Google has provided in the way of specifics that Panda was about site quality and content freshness. Google fueled further speculation in the SEO community about click-through (Does the user select the result?), bounce rate (Does the user become engaged with the page?) and conversion (Does the page resolve the user's search need?). Vanessa Fox and Eric Enge present this case in "A Holistic Look at Panda." The Panda funeral pyre burned quick and hot as sites dropped in and out of visible ranking with each update.
Google chose well. The UX community is so techno-phobic that link farms, keyword stuffing, cloaking and parasite hosting are destined to die out soon. And let’s face it, the thrill of traditional SEO left the house ages ago when Google started playing hardball and delisting sites outright. Ouch, explain that to your client.
This refocus will actually work in our favor. To regain the high ground, the SEO community must break down silos and work with UX and content strategy communities to deliver the algorithm-defined great experience. Yes, hands across the pixels, Kumbaya and all that goes with it.
UX: The New, Yet Somewhat Familiar, SEO
In the world of UX, the foundations of site quality are information architecture (Can users find what they're looking for?), interaction design (Are the interaction points intuitive, or does the user have to work hard to figure them out?) and content strategy (Do users know where to go next, and WANT to go there?). We in the SEO field think of these as structure, text on the page and social channel endorsement. Different labels, same opportunities.
Users rarely navigate by URL. By this I mean that those who want to know more about Google's Hummingbird update do not fire up a browser and type http://www.semrush.com/blog/publications/hummingbird-googles-kinda-new-algorithm/ into the URL window. Why should they, when they can go to a search engine and get a list of highly relevant results, possibly including one they did not know they wanted? While the URL should do its best to convey the nature of the information on the page, it does not have to be elongated with keyword-rich directory slashes.
As SEOs who care about our clients' visibility in search results, let's reach out to our UX brethren and give them the ever-popular-with-clients SEO ammunition that nested folders are not as valuable as a well-designed (as in optimized) page name for influencing SERP placement. This empowers both disciplines to create strong, yet shallow, site structures that leverage the ranking influencers URL depth (how far a page is from the homepage) and click distance (how far it is from an authority page). Search engines have become wary of global and footer navigation links, due in large part to the SEOs who used these links to manipulate or sculpt PageRank when such things were still possible and effective.
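Click distance, as described above, is just a shortest-path count over the site's link graph, which makes it easy to audit. A minimal sketch in Python, using a breadth-first search over a hypothetical site map (the page names and structure are illustrative, not any real site or tool):

```python
from collections import deque

def click_distance(link_graph, start):
    """Breadth-first search: the minimum number of clicks needed to
    reach each page from `start` (e.g. the homepage or an authority page)."""
    distances = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in distances:          # first visit = shortest path
                distances[target] = distances[page] + 1
                queue.append(target)
    return distances

# Hypothetical shallow site: every page sits within two clicks of "/".
site = {
    "/": ["/services", "/blog"],
    "/services": ["/services/seo-audit"],
    "/blog": ["/blog/hummingbird-update"],
}
distances = click_distance(site, "/")
```

Running this over a crawl of your own site quickly surfaces pages buried too many clicks deep, which is the argument you bring to the structure conversation.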
Over the years, users have developed a preference for, nay an addiction to, the search box over site navigation, to the extent that the information retrieval community refers to global and footer navigation as nepotistic links. So, how valuable can navigation be if the user is using a search engine and the back button to make their way around your site? It is most important from a UX perspective, and now an SEO one, that site navigation be as simple and understandable as possible. Clear, concise section labels are more effective than that "cool" optimized label that may or may not encourage a user to click through to that page. Footer links are like Jessica Rabbit: they aren't bad, they're just drawn that way. If we want users to find the destination, the footer section of the page is not the place to put the link.
After the initial launch of Panda, Google took things a bit further down the UX trail with its Page Segmentation patent. No one explains it better than Bill Slawski over at SEO by the Sea in "Google's Page Segmentation Patent Granted." This patent tells us that Google has carved up the basic Web page and assigned some regions more relevance than others. According to Google, the prime relevance page real estate is centered above the fold. In its lab user studies, this is where the subjects focused their attention when first arriving at the page, all 10 of them. Work with your UX designer to ensure that this prime real estate is maximized with the main message of the page in machine-readable text. If you meet with resistance, use the term "findability" and they will come around. Example: "The findability of the page is compromised because the search engines value this section for matching search terms to page content, and you have a big hero image with no text on, in or behind it."
It is true: no one likes to read online. They are printing out, downloading and bookmarking (online and in their browsers) pages with reckless abandon. Laptops today have a terabyte of storage for all of the documents saved to read later. This brings back the opportunity for deep, rich content that search engines have loved since the application of term frequency/inverse document frequency (tf-idf). Users now scan the page to see if the content appeals to them. Page length is no longer consequential except to the search engine, with its locked-in adherence to term frequency and, now, authority. So deep, rich, subject-specific content is a good thing, because in most instances that is what the user wants and will consume, either online or off.
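The term frequency/inverse document frequency weighting mentioned above fits in a few lines of Python. A hedged sketch only: the toy corpus, tokenization and smoothing choice are illustrative, not how any search engine actually computes relevance:

```python
import math
from collections import Counter

def tf_idf(term, doc, corpus):
    """Score `term` for one document: high when the term is frequent
    in this document but rare across the corpus as a whole."""
    tf = Counter(doc)[term] / len(doc)                  # term frequency
    containing = sum(1 for d in corpus if term in d)    # document frequency
    idf = math.log(len(corpus) / (1 + containing))      # +1 avoids div-by-zero
    return tf * idf

# Toy corpus: each "page" is a list of tokens (purely illustrative).
corpus = [
    ["seo", "tips"],
    ["ux", "design", "tips"],
    ["hummingbird", "seo", "update", "seo"],
]
```

Here "hummingbird" outscores "seo" for the third page even though "seo" appears twice there, because "seo" is common across the corpus while "hummingbird" is rare. That is the mechanism behind the preference for deep, subject-specific content over pages that repeat generic terms.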
Google's Not Provided abstraction of what brings users to the site was the death knell for keyword optimization, even more so with the announcement that Google is taking away all keyword referral data. Soon, Not Provided was draining the usefulness from organic keyword analytics faster than a regular on "True Blood." Users do not think in terms of keywords. They use random phrases to get a sense of their information need based on the search results, in a cycle of: "Let's try this. Nope, not it. (bounce) How about this? Nope, not it, but getting warmer. (bounce)" We may think of them as keywords. Users think of them as baby steps toward finding what they really want.
Content Strategy is Your Friend
Start with a topic/subject/concept where you want visibility in search results. Use a content inventory to see if you have any old, moldy content on it already. Flush that and start anew, or give it a facelift and resubmit it to the search engine. Work with your content strategist to develop a relational content model that links related content on and off the site to create a more complete knowledge experience. Work with UX and content teams to develop focused, authoritative content segmented with descriptive headings and accessed by inter-page linked navigation. You can put together a guided tour that links related content items, or a task-based drop-down widget that contains links to pages related to a single task that engages users. Suggest the development of sub-navigation (or, as we know them, hub) pages that consolidate links to same-subject content items that may be distributed across the site.
Having trouble putting a stake in the heart of the "Learn More" link text? You are not alone. Instead of talking about keyword-rich link text (and seeing the UX designer's eyes roll up in their sockets), mention that the link lacks "information scent." Information scent, a concept made famous by UX luminary Jared Spool, describes how links attract user action by using text most descriptive of their destination. This framing will be far better received by your UX and content strategy colleagues.
Social, social, social. If there is one thing that search engines like, it is social media. There is no more powerful indicator of relevance than thought-processing bipeds shaking off their lethargy to say, do or share something. The search engines monitor both active endorsements (icon/link clicking) and passive endorsements (following, viewing) when incorporating social signals into relevance ranking. Our UX brethren are often unaware of the impact of social media engagement as an indicator of user endorsement, and all too often relegate these interactions to footer navigation Siberia. To relocate these social media icons to a more appealing position, the SEO can frame them as opportunities to create a pervasive cross-channel UX that extends beyond the page. This should not be too hard a sell, as many clients find these channels a valuable resource for information about user satisfaction. You get bonus points if you can reference the "psychology of sharing."
If all else fails, you can fall back on the ethnographic argument: most users are accustomed to engaging with a multiplicity of social channels throughout the day. Be prescriptive about which social media channels to include, as one size or suite of icons does not fit all sites. Stay with a limited choice of channels that make the most sense for the website and its user community: Facebook, Twitter, Google+ and one or two more from Pinterest, LinkedIn, StumbleUpon, Reddit, ShareThis, etc.
Why Is This A Great Direction for SEO?
We stop chasing the algorithm. That is a losing, catch-up-ball strategy that never really worked for us in the long term. Let's start using what does work: our capacity as thought-processing bipeds to understand other thought-processing bipeds. To be successful, we need to exploit the locked-in technology of term frequency and location that makes the first cut at the results used by all search technology. Together, we can leverage whatever algorithm soup of content quality, social signals, contextually related links and authority, with some engine-specific, unnamed secret sauce on top, the search engines decide to try out. Because it is not about them; it is about what we feed them. We will join forces with our thought-processing-biped UX and content colleagues to feed them sites where information and engagement meet user expectations with satisfaction. Then users will pick our site out of the search results where it rightfully belongs. They will engage with the calls to action, in the form of social sharing, downloads, purchases or whatever our conversion is for that page, because these are relevant to what the user wants. They will end their search at our site and, by doing so, tell the search engine that we are relevant and provided a great site experience.
Google's choice of UX as the driving force behind the Panda updates was diabolical. Google switched from a highly engaged, tactical and organized SEO community to a seemingly theoretical, squishy human-factors community, and doubled down on a bet that these two disparate groups would never connect. I believe that if the SEO community wants to be effective in influencing search results, we must "cross the aisle" to include our UX brethren. Relevance, user satisfaction and ranking in results will be back with the thought-processing bipeds, as it should be.
Marianne Sweeny is a Search Information Architect at Portent Inc. She is passionate about optimizing the user search experience on the Web or inside the enterprise firewall. You can view her last article for SEMrush here: "IO (Information Optimization) is 21st Century SEO."