What Are The Top 10 SEO Issues Impacting Your Site? Pt. 1

Michael Stricker

SEMrush's Director of Marketing, Michael Stricker, recently sat down with CIO.com to offer up his list of the Top 10 Technical SEO Issues affecting sites today. Below, he expands on his first five selections, offering the rationale behind his decision-making and his suggestions on how best to fix them.

1) Mobile Presentation

Matt Cutts has said, "It’s important to make sure your site is lean and loads fast as that’s important on mobile." Mobile devices account for a growing share of website visits, and purchases from smartphones are on the rise. Tablet users convert, or buy, at a higher rate and at a higher price point than desktop users, producing more revenue per visit.

Unless your site welcomes mobile users with a great User Experience (UX), those visitors will bounce away, perhaps never to return. Google crawls your site in the guise of both desktop and mobile users, emulating different mobile devices to gain a better understanding of how well prepared your site is to serve various smartphones. It gathers signals that indicate slow-loading mobile sites, and perhaps combines them with knowledge of mobile engagement.

Weighing those signals against quality indicators lets Google's algorithm assess how well your site should rank compared with others. This ranking method gives Google the ability to surface fast mobile experiences that users appreciate, and to demote slow, difficult layouts that repel them.

A great mobile UX has become essential to meet user expectations and to rank well in Google. How you handle inbound mobile visitors is a key decision to make in advance of any design or development work. Diverting traffic to a separate mobile site is an option rife with additional baggage: using a subdomain such as http://m.domain.com can split your link equity.

It also raises concerns about deflecting traffic from the original URL without informing the user or offering options, a practice Google abhors because it smacks of blind redirects, misdirection, traffic theft and a whole host of other nefarious practices. A separate mobile subdomain can also consume additional resources and require dedicated maintenance, and Google can become suspicious of sites that present different content to different devices, depending on how it is accomplished.

That is why options are important, such as a way to "display the desktop site."
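For sites that do maintain a separate mobile subdomain, the cleanest way to keep Google comfortable, and link equity consolidated, is to declare the relationship between the two versions of each page. Below is a minimal sketch of the annotations Google documents for separate mobile URLs, using a hypothetical www.example.com page mirrored at m.example.com:

```html
<!-- On the desktop page (http://www.example.com/page.html):
     point smartphone crawlers and users to the mobile equivalent -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page.html">

<!-- On the mobile page (http://m.example.com/page.html):
     consolidate ranking signals back onto the desktop URL -->
<link rel="canonical" href="http://www.example.com/page.html">
```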

Responsive design, while not strictly required, provides an experience customized to the device, yet the content is the same for all users.
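As a rough illustration of the responsive approach: a single URL serves every device, a viewport declaration tells browsers not to pretend they are desktops, and CSS media queries adapt the layout. The class names below are hypothetical.

```html
<!-- One URL and one set of content; only the layout changes per device -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .product-grid { display: flex; flex-wrap: wrap; }
  .product-grid .item { width: 25%; }      /* desktop: four columns */

  @media only screen and (max-width: 640px) {
    .product-grid .item { width: 100%; }   /* smartphones: one column */
  }
</style>
```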

The ability to serve up beautiful photos suited to the device turns tablets into luscious e-commerce catalogs. It also gives users the option to consume infographics and illustrations, even on smartphones or feature phones. That improves secondary signals such as page visits, time on page and visit duration, and lowers bounce rate.

2) Page Authority

Low rank gets applied to isolated pages that are new, lack links from your own pages and, especially, are devoid of links from already-established websites. Dr. Pete, of SEO tool vendor Moz, states that Page Authority has one of the highest observed correlations with Google rank.

Moz defines Page Authority as “how well a given webpage is likely to rank in Google.com's search results.” Data includes inbound links to the page and their PageRank, comparative rank of the domain, and Trust.

Navigating the link-building landscape is fraught with peril, but webmasters would do well to attract genuine links from highly trafficked, popular websites such as news media and relevant blogs.

3) Duplicate Content - On-Site 

Google's crawlers must cover a lot of 'ground.' According to one estimate, the indexed Web contains at least 2.43 billion pages; as of 2013, FactsHunt posits 14.3 trillion webpages live on the Internet. The gap between what exists and what gets indexed is enormous.

Google cannot consume all of the data, especially when one considers that Google must revisit each page over and over to find changes or new material. Anything that slows Google's discovery or 'crawling' of the web is unwelcome.

Dynamically driven websites that generate webpages on the fly from databases are frequently misconfigured from an SEO point of view, potentially drawing the crawlers' ire. These sites may create lots of pages, or URLs, that contain basically the same content over and over. Imagine a website that sells socks, offering separate pages for red, blue, brown and black socks, for example.

Other sources of SEO duplicate content include serving both 'plain' and secure protocol URLs (HTTP and HTTPS), expressing no preference between www.domain.com and domain.com, blog tag archives, and syndicated RSS feeds.

According to Google, each URL must be accessed by Google's searchbot, and its contents downloaded, placed in a database, then stored, processed and filtered by Google's algorithms for the purpose of assigning rank to those URLs. When an e-commerce site offers 10 colors of each of 100,000 garments, it consumes 10X more of the Googlebot's attention than Google might like.

The remedy is to crawl your own site, discover the duplications and apply "crawl directives" to inform Google of the relative value of multiple URLs. Suggest to Googlebot that particular folders and directories are not worth crawling by using robots.txt. Then tell Google which of several URLs to prefer for its index by applying the rel="canonical" link element to point to the 'best' URL. Fix these technical issues and enjoy better management of link value, cleaner metrics and thriftier use of Google's crawler, so that it has time to find new and changed content on your site.
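Returning to the hypothetical sock store, the two remedies might look like the snippets below; the folder names and URL parameters are invented for illustration.

```text
# robots.txt: suggest that crawlers skip low-value, parameter-driven listings
User-agent: *
Disallow: /search-results/
Disallow: /*?sort=
```

```html
<!-- On http://www.example.com/socks?color=red (and blue, brown, black):
     point Google at the single URL that should accumulate rank -->
<link rel="canonical" href="http://www.example.com/socks">
```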

Duplicate Content - Off-Site

Google has enough trouble deciding which pages to present to searchers without having to sift through hundreds of instances of the same stuff. Google has also been sued over presenting trademarked or copyright-protected content that was inappropriately stolen, or 'scraped,' by content thieves. Naturally, Google has no interest in presenting SERPs full of the same results, and it is gun-shy about questions of who holds the license to content.

Google has devoted a good deal of research to figuring out who first placed a piece of content on the web, so that the original can rank higher. Despite that work, Google still has some difficulty discerning the original content from the copies. Spare yourself the confusion: your best workaround is to use only original, fresh text. The same concept applies to images; use only those that you have properly licensed or to which you own copyright.

4) Single-Page Applications

Web developers would love to streamline webpage creation and reduce the unwieldy, unmanageable 'spaghetti code' that results from using dozens of JavaScript files and CSS stylesheets to dynamically assemble webpages. The result of this drive to simplify is a reliance on 'front-end' or 'presentation layer' techniques (JSON, AJAX) to assemble the data that makes up a webpage as it arrives in the user's browser.

However, such a website may present webpage after webpage under the same URL. Google was built around Uniform Resource Locators (URLs). When those URLs are no longer unique, Google's entire crawling, indexation and rank-calculation machinery can be stymied. (This is oddly similar to some of the issues that used to plague 'frames'-based HTML pages, such as the inability to use the back button, the inability to link directly to a specific part of the website, and so on.)

Great care must be taken to present the 'illusion' of multiple URLs when SEO is important. You’d better do your research and have a killer developer team with great foresight before you take on a challenge of this magnitude.
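One common way to keep unique, crawlable URLs in a single-page application is the HTML5 History API: swap the content in place, but give each view a real address. The sketch below is illustrative only; loadProductView is a hypothetical content loader, not part of any particular framework.

```html
<script>
  // Intercept in-app navigation: update the content AND the address bar,
  // so each view keeps a unique URL for crawlers, bookmarks and the back button.
  document.querySelectorAll('a.product-link').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      history.pushState({}, '', link.getAttribute('href')); // real, unique URL
      loadProductView(link.getAttribute('href'));           // hypothetical loader
    });
  });

  // When the user clicks Back or Forward, re-render the view for that URL.
  window.addEventListener('popstate', function () {
    loadProductView(location.pathname);
  });
</script>
```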

5) Guest Blogging without using rel="nofollow" links

Unless you've been under a rock, you know that Google's Webspam team leader, Matt Cutts, announced on his blog, "…if you’re using guest blogging as a way to gain links in 2014, you should probably stop." Routinely writing posts on other people's blogs to gain the PageRank carried by ordinary, followed links (those without rel="nofollow") has become so widely exploited that Google considers it search engine manipulation.

Recent, high-profile penalizations include the popular guest blog MyBlogGuest. Owner Ann Smarty announced that the links in guest author posts would remain followed, that is, inviting the Googlebot automated web crawler to follow each link to the linked website. Followed links also carry PageRank from the popular, well-read blog domain back to the author's domain. It is this SEO value delivery system that Google resents, because it can be used to artificially inflate the apparent popularity of webpages and websites, causing them to rank higher in the Search Engine Results Pages (SERPs).

As the strategy of guest blogging grew in scale, Google was forced to curtail it in order to prevent abuse. PostJoint is another guest blogging service that was removed from the index.

In the case of MyBlogGuest, over 200 authors indicated that their own websites suffered manual linkspam penalties: not only did the guest blog and its posts disappear from Google, but the authors' own rankings suffered as well. The remedies include using rel="nofollow" to prevent the passage of PageRank.

Users can still follow the links from the blog to your website, but the transfer of PageRank is cut off, and with it, the manipulation of rank that Google guards against. An additional safeguard is to write guest posts for the blogs most relevant to your industry, business and customers. Real value accrues to posts of genuine human interest that earn presence on news media and trade sites.
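In markup terms, the fix is a one-attribute change to the links in a guest post or author bio; the URL and anchor text below are placeholders.

```html
<!-- A followed link: passes PageRank, which is what Google objects to at scale -->
<a href="http://www.example.com/">Best running socks</a>

<!-- The safer guest-post link: readers can still click, but PageRank does not flow -->
<a href="http://www.example.com/" rel="nofollow">Best running socks</a>
```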

What do you think are the most impactful technical issues affecting SEO currently? Look for Part Two of this blog post later on today!

Michael Stricker is SEMrush’s Director of Marketing. His industry musings can be found on Twitter and on our blog. His last post was "What are the Top SEO Issues Impacting your Site?"
