Google’s Algorithm Change: “Overly-Optimized Sites”

From the Archives…

In 2012 at SXSW, Google’s Matt Cutts talked about an upcoming “over-optimization” algorithm launch aimed at those who abuse search engine optimization. Rob Snell transcribed the session, which included these comments from Matt:

“The idea is basically to try and level the playing ground a little bit. So all those people who have sort of been doing, for lack of a better word, “over optimization” or “overly” doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little bit more level.

So that’s the sort of thing where we try to make the Google Bot smarter, we try to make our relevance more adaptive so that people don’t do SEO—we handle that—and then we also start to look at the people who sort of abuse it, whether they throw too many keywords on the page, or whatever they exchange way too many links, or whatever they are doing to sort of go beyond what a normal person would expect in a particular area. So that is something where we continue to pay attention and we continue to work on it, and it is an active area where we’ve got several engineers on my team working on that right now…”

[And later, after talking about the positives of SEO] “Absolutely there are some people who take it too far. What we’re mindful of is when someone says, “We’re White Hat. We continue to do the right thing, and we see the Black Hats who are over optimizing or going too far, and they seem to be doing too well.” So we’ve been working on changes to try to make sure that if you are a White Hat or if you’ve been doing very little SEO that you are going to not be affected by this change. But if you’ve been going way far beyond the pale, then that’s the sort of thing where your site might not rank as highly as it did before.”

A lot of people have asked me what this means for those who include search engine optimization as part of their marketing mix. Some are worried that Google will begin to penalize sites that have implemented search engine optimization techniques. My thoughts? I think that some site owners should worry. But whether or not you should depends on what you mean by search engine optimization.

As I’ve talked about and written about over and over (notably in my book and most recently in my article about Clay Johnson’s talk about SEO killing America), SEO means lots of different things to lots of different people. When I talk and write about SEO, I mean:

  • Using search data to better understand your audience and solve their problems (by creating compelling, high-quality content about topics relevant to your business)
  • Understanding how search engines crawl and index sites and ensuring that your site’s technical infrastructure can be comprehensively crawled and indexed (see the sketch below)
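
To make that second bullet concrete, here’s a minimal sketch of the kind of head markup I mean. The URLs and copy are hypothetical, and the canonical and robots tags are just two common examples of making a crawler’s job unambiguous:

```html
<!-- Hypothetical example: basic head markup that helps crawlers
     understand and index a page (example.com is a placeholder) -->
<head>
  <title>Hand-Thrown Stoneware Mugs | Example Pottery</title>
  <meta name="description" content="Handmade stoneware mugs, fired in small batches.">
  <!-- Canonical tag: points crawlers at the preferred URL when the
       same content is reachable at several addresses -->
  <link rel="canonical" href="http://www.example.com/mugs/">
  <!-- Explicitly allows indexing and link-following (this is the
       default; shown here only for illustration) -->
  <meta name="robots" content="index, follow">
</head>
```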

But the definition of SEO is a continuum. Some of it is clearly spam. And there’s a gray area that’s not exactly spam, but isn’t really covered by those two bullets above either.

For instance, I’ll look at a page and see a bunch of keyword-rich links in the footer. “Does anyone click on those?” I might ask. “Nah, those are just there for search engines.” I go to conferences and hear people debating keyword density percentages, how many times a keyword should be repeated in a title tag, and how to get links that “appear” natural. At some point, search engine optimization goes beyond making sure pages are as useful as possible for the target audience and the site is crawlable, and becomes a game of guessing the algorithms.
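
To illustrate the kind of footer I mean, here’s a hypothetical sketch (the links and anchor text are invented): a block of keyword-rich links that exists for crawlers, not for visitors.

```html
<!-- Hypothetical footer of the kind described above: keyword-rich
     links placed for search engines rather than for visitors -->
<footer>
  <a href="/cheap-widgets">cheap widgets</a> |
  <a href="/best-cheap-widgets">best cheap widgets</a> |
  <a href="/discount-widgets-online">discount widgets online</a> |
  <a href="/buy-widgets-cheap">buy widgets cheap</a>
</footer>
```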

Anyone who’s read or heard me before knows that I’m not an advocate for algorithm chasing. Historically, I’ve held this view because I don’t find it productive. Algorithms change hundreds of times a year. Signals differ for individual queries. Google’s goal is always to extract all of the data on the web and show the very best page for searchers. So why not just invest time in making sure all of your content is extractable and that your pages are in fact the very best?

Now, there’s another reason to follow this strategy.

The type of algorithm change Matt talked about in this SXSW session reminds me a bit of how Google described the Panda algorithm. Panda wasn’t about spam. It was about separating high-quality, useful pages from pages that were just a collection of words about a particular topic. This change seems similar: yet another way of making that distinction. At one point in the session, Matt said:

“We’re always trying to best approximate if a user lands on a page, are they going to be really, really happy instead of really, really annoyed? And if it’s the sort of thing where they land on a page and they are going to be annoyed, then that is the sort of thing that we’ll take action on.”

Matt talked about finding ways to surface smaller sites that may be poorly optimized if, in fact, those sites have the very best content. This is nothing new from Google. They’ve always had a goal to rank the very best content, regardless of how well optimized it may be. And I think that’s the key. If a page is the very best result for a searcher, Google wants to rank it even if the site owner has never heard of title tags. And Google wants to rank it if the site owner has crafted the very best title tag possible. What matters is that it’s the very best result.

Matt talked about this later:

“We tell people over and over again, “Make a compelling site. Make a site that’s useful. Make a site that’s interesting. Make a site that’s relevant to people’s interests”… all of the changes we make, over 500 a year, are designed to try to approximate if a user lands on that page, just how happy are they going to be with what they get? So if you keep that in mind, then you should be in good shape no matter what.”

He also mentioned making Googlebot smarter, which is more an evolution of what they’ve been working on for years: being able to extract content from JavaScript, AJAX, Flash, images, forms… We’ve seen this in the last year with smarter handling of paginated content, for instance. (I wrote about the pagination tags Google supports here, but my post was based on a Google video and blog post where Maile Ohye mentions that if you don’t implement the tags, Google will use patterns from your site to try and create paginated clusters for you.)
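
For reference, the pagination markup in question is a pair of link elements in each page’s head. Here’s a sketch as it might appear on page 2 of a hypothetical three-page article (the URLs are placeholders):

```html
<!-- Pagination markup as it might appear in the <head> of page 2 of a
     hypothetical three-page series (example.com is a placeholder) -->
<link rel="prev" href="http://www.example.com/article?page=1">
<link rel="next" href="http://www.example.com/article?page=3">
```

Page 1 would carry only the rel="next" element and the final page only rel="prev"; as Maile notes, if you skip the markup entirely, Google falls back to inferring the series from your URL patterns.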

Another thing to keep in mind about how Matt described this upcoming change is that he wasn’t speaking at a search conference. The audience was at least in part non-SEOs. He introduced himself as the person in charge of catching those who try to cheat Google. He was talking to people who (based especially on the question that triggered his comments) were thinking of the type of SEO that’s really about reverse engineering algorithms.

Matt first talked about the benefits of SEO. He said to think of SEO as a coach who helps you present yourself better. He said that Google wants to level the playing field so that all content has a chance to compete equally. And when he talked about the kinds of techniques this algorithm would look for, he said they were looking for abuse: too many keywords, too many link exchanges. He contrasted what the algorithm was looking to flag with “great content”.

In particular, Matt said the following in support of SEO:

“The way that I often think about SEO is that it’s like a coach. It’s someone who helps you figure out how to present yourself better. In an ideal world, though, you wouldn’t have to think about presenting yourself and whether search engines can crawl your website. Because they’d just be so good that they can figure out how to crawl through the Flash, how to crawl through the forums, how to crawl through the JavaScript, how to crawl through whatever it is…

A lot of people seem to think that Google hates SEO. That’s definitely not the case…

We even made a video about this. If you do a search for webmaster videos, we’ve made something like 400 videos. And we made one specifically to say Google does not hate SEO, because SEO can often be very helpful. It can make a site more crawlable. It can make a site more accessible. It can think about the words that users are going to type whenever they come to a search engine and make sure that those words are on the page, which just makes the site more user-friendly.

So the same sorts of things you do to optimize your return on investment and how well something spreads virally or socially is the exact same sort of stuff that often works well from a search engine perspective. So there is a ton of stuff that is fantastic to do as an SEO, it just makes your content more crawlable and more accessible.”

This isn’t the oft-heralded death of SEO. But it may be the first nail in the coffin of those who go beyond SEO and lose track of creating the best possible content for their audiences.
