There are few companies as ubiquitous as Google. We did a quick check through Internet Live Stats (as of June 9th) and found that 67,520 searches were done on Google in a single second! Sustained over 24 hours, that is the equivalent of 5,833,728,000 searches a day. Let’s be clear: that’s 5 billion, 833 million, 728 thousand searches in a single day! Googling something is so common that in 2006 it officially became a verb. It is now an intrinsic part of our lives, as well as of how we acquire information.
It’s inspiring to see the depth and magnitude of what Google has achieved. It’s also important to see the implications of these achievements on how search effectively works.
This is why we’d like to try a bird’s-eye view with our Historical Timeline of Google Confirmed Updates.
We will continue to develop new infographics covering the changes as they roll out, and this article will be updated as new updates are confirmed.
2018 Google Confirmed Updates
Already, we’ve seen Google confirmed updates like the “unnamed” update of February, the “Brackets” Core algorithm update and the Mobile-First Index Roll out confirmed in March.
In March, there was also a very interesting test. Moz dubbed it the Zero-Result SERP Test: on a small set of Knowledge Cards, “Google started displaying zero organic results and a ‘Show all results’ button”. This is especially interesting, as we drew similar conclusions when considering the down-the-line consequences of Featured Snippets in our article in late March. It also coincides with data from Rand Fishkin, at SparkToro, and Jumpshot, as you can see below. It would seem that the real-estate value of a top SERP position is about to get a lot higher!
For now, however, let’s look back at the years 2000 to 2017!
Historical Timeline and Infographic of
Google Confirmed Updates
Google has naturally been very active in terms of updating their services, ensuring their market supremacy and product quality. This is why we’ve decided to split our infographic into 3 pieces for now.
The first of our Historical Timeline of Google Confirmed Updates Infographics covers the first 9 years, from 2000 to 2009. The second infographic will cover 2010 to 2015, while the third will look at the last couple of years.
We will give the early years a good bit of attention, seeing as Google had a crystal clear vision from an early stage. It is as simple as it is grandiose:
Organize the world’s information and make it universally accessible and useful.
That vision informs both the updates that came in the early years and those still in the making. So let’s get a view of what’s been going on.
2000 to 2009
The First 9 Years of Google Confirmed Updates
2000 and 2003 Updates
In 2000, Google launched the Google Toolbar™, and we also saw the introduction of PageRank™, which automatically determines the importance of every page. This brought about a sweep of changes in what could easily be understood about how search engines work. The practically monthly changes in Google’s indexing (and search rankings) during these early years went from small and straightforward to large and wildly confusing.
The dramatic changes and webmasters adjusting to them became known as doing “the Google Dance.” Google would sweep you off your feet, onto the dance floor, where you had to figure out the moves.
While there were undoubtedly updates in 2002, we have not included them in this infographic as they were not confirmed. After all, while Google is pretty good at keeping webmasters informed, they can’t give away all their secrets.
In 2003, the first named update was announced at SES Boston – aptly named Boston – and it was a combination of an index and an algorithm update. While the aim was for updates to come around on a monthly basis, this approach was soon abandoned. Fritz was introduced in July ’03 and truly spelled the end of the previous few years’ Google Dance: it marked a switch away from roughly monthly overhauls to an incremental approach to updating the index.
The Florida update of November really started the modernization of search engines. For many, search engine updates had been of little import. This update, however, infuriated business owners and webmasters alike, as many sites suddenly lost their rankings. The low-value tactics of the ’90s were truly being targeted for demolition. It spelled the real introduction of the mantra we know today: give value – receive value.
While there weren’t many named and Google confirmed updates in 2004, the changes introduced by Brandy were far-reaching and included a vast index expansion. The change with the greatest implications, however, was the implementation of Latent Semantic Indexing (LSI).
Conceptually, the introduction of LSI is a big one. With LSI, Google’s ability to understand synonyms greatly improved. Simultaneously, the understanding of “link neighborhoods” was introduced – putting additional emphasis on relevance in relation to backlinks. While past and present algorithm updates might not be picture perfect from the get-go, it is an optimization process.
In 2005 Google, in an effort to deal with spam and link quality, introduced “nofollow” in a combined effort with Yahoo and Microsoft. This move empowered website owners to deal with spam links in comment sections, and other frustrating aspects of dealing with less than helpful links.
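In practice, the attribute is added directly to the anchor tag. A minimal illustration (the URL here is a made-up example, not a real page):

```html
<!-- rel="nofollow" tells search engines not to pass ranking credit
     through this link, e.g. a link left in a blog comment -->
<a href="https://example.com/some-commenter-site" rel="nofollow">commenter's link</a>
```

This is why comment spam became far less attractive: a link that carries no ranking credit is of little use to a spammer.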
Bourbon, however, is less easy to distinguish clearly, in the sense that it did not focus exclusively on one problem or issue. Effectively, “GoogleGuy”, who kept in close touch with various webmaster forums, informed the webmaster community that there were “3.5 updates” rolling out in quick succession. One major target of these Google confirmed updates was to stop rewarding sites employing black-hat techniques, like “unnatural backlink growth”.
The June 1st, ’05, introduction of personalized search turned out to be a pioneering step for Google. While the initial impact was rather small, it has grown to be a big part of search as a whole: it uses the individual’s search history to guide what shows up in their search results. The implications of personalized search are far-reaching, and we might return to them at a later date. XML Sitemaps also arrived that June, with Jagger rolling out later in the year.
XML Sitemaps are familiar territory to most webmasters these days. The short version is that they allow webmasters to send an overview of all pages under a property, or domain, to Google via Webmaster Tools (Google Search Console, these days). This gives webmasters and SEOs a minor, yet concrete, influence over both indexation and crawling.
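To illustrate, a minimal XML sitemap looks roughly like this (the URL and date are hypothetical examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- One <url> entry per page you want search engines to know about -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/maincontentX</loc>
    <lastmod>2018-06-09</lastmod>
  </url>
</urlset>
```

The file is typically placed at the domain root (e.g. /sitemap.xml) and submitted through Search Console.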
Jagger, on the other hand, focused on links. To be specific, its focus was on link farms, paid links, low quality links, and reciprocal links; all part of what was, and still is, black-hat SEO techniques to influence and boost rankings.
Another 2005 update that turned out to be a hit was Google Maps, together with the Local Business Center. This effectively introduced the trend we’ve seen over the following decade and a half: the prioritization of local results, and businesses registering their presence through Google.
2006 - No Google Confirmed Updates
As with 2002, there were no Google confirmed updates in 2006. That said, Google was not resting on its laurels.
However, unconfirmed updates will have to be a topic for another article at a later date. After all, there’s more than enough to cover for now.
2007 was a peculiar year in the world of SEO and search. Yet again, there weren’t any distinctly confirmed algorithm updates in 2007. However, a massive change occurred in how search is done – which forever changed how we all perform searches. Universal Search introduced images, maps, news, video, and much more that we now take for granted in how we acquire answers and information online today.
Dewey was a peculiar update. While it certainly was a confirmed update, what it actually did – and what it addressed – remains unclear. We could speculate that the name was a nod to the Dewey Decimal System, and that it was part of a background update on cataloging and processing. However, these are just educated guesses on our part.
While there are those who believe that Google might have been actively pushing their own product lines in search results, like Google Books, there is little concrete evidence of this. All we know for sure is that there was indeed an update, and it did ruffle some feathers, but what exactly the purpose was remains largely a mystery.
Google did have another fantastic rollout in 2008. Google Suggest is one of those initiatives that definitely had a direct impact on how search is done. It opened a plethora of choices and assistance for the searcher. You remember our earlier mention of the “Google Dance”, right? Well, by now, this picture has changed again.
Whereas previously Google would drag you onto the dance floor, by this point Google is the dance floor. You don’t show off your skills by dancing with Google. You can show off your skills by dancing better than your competitors on Google, in a certain dance genre, while the floor shifts beneath your feet. Intimidating? It certainly can be. However, if you focus on creating valuable informative content that is truthful, useful, and beneficial, you aren’t likely to have too many problems.
That said, if you are struggling to establish yourself, seeking SEO expert advice can greatly assist your efforts. Whether it be content creation or translations, or if you need an SEO strategy developed or an SEO campaign executed, there are options out there for assistance. With that said, let’s move on and visit 2009.
In 2009, the rel=canonical tag was introduced. While canonicalization had been around long before 2009, this tag gave webmasters a direct tool for it. It solved a rather frustrating problem for most websites, business related or otherwise: as blogs often wish to share their content on social media, additional URL parameters frequently get appended to the URL of the content being shared.
Let’s run through a hypothetical example. Say your main content lives at https://yoursite.com/maincontentX, but when you share it you generate a link akin to https://yoursite.com/maincontentX?ref=linkedin. People may then reference your main content using that link in its entirety – resulting in the ?ref=linkedin version of your content ranking higher than your actual preferred /maincontentX page. Alternatively, if your business website wants to make content easily printable, or available as a PDF, you might have a separate printable version of your main content, giving you a URL like https://yoursite.com/maincontentX/PDF_printable/.
The end result here is similar: you’ve ended up diluting the link equity and ranking of your actual main content. This is because the different URLs lead to the exact same content, which results in duplicate content. Duplicate content is one of the most common issues websites face – even in the best-case scenario it spreads ranking signals across URLs, decreasing the likelihood of your intended content ranking high in search engines. The rel=canonical tag is one of the most efficient ways to address this problem.
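As a quick sketch, reusing the hypothetical URLs from above, the fix is a single link element in the head of each duplicate version, pointing at the preferred URL:

```html
<!-- Placed in the <head> of the ?ref=linkedin and /PDF_printable/ versions
     (and, harmlessly, on the preferred page itself, pointing at its own URL).
     Search engines then consolidate ranking signals onto the canonical URL. -->
<link rel="canonical" href="https://yoursite.com/maincontentX" />
```

With this in place, the parameterized and printable variants stop competing with the page you actually want to rank.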
Another exciting update that was rolled out in 2009, was real-time search. While Google had up to this point been actively improving their index, and still are working on instantly updating it – 2009 was a turning point. Their ability to almost instantly update their index grew exponentially – and this immensely improved services like searching for breaking news stories. It also included featuring Twitter updates – having seen it grow to become a massive medium for instant communication and news.
2010 to 2015
Times they are a-changin'
2010 was a massive year in terms of Google confirmed updates that focused on solving direct problems. We saw Google Places, May Day, the Caffeine rollout, Google Instant, Instant Previews, and the Negative Reviews update all introduced. However, in the interest of keeping this as useful as possible without being exhaustive, we won’t address them all; we will focus on the main takeaways.
In 2010, Google reaffirmed their emphasis on valuable content – which spelled trouble for sites which rarely updated their content or had thin content to begin with.
Think short blog posts, or pages, with fewer than 150 words – just an update, really, intended to keep the audience informed. These were generally seen as less useful than long form content. Evidence has long demonstrated that long form content is far superior for generating traffic and rankings in search engines. Neil Patel has an excellent article on the value of long form content.
As we mentioned previously, Google always works on improving the speed with which they update their search engine results pages (SERPs). This is true of their 2010 updates too. The incremental rollouts of Caffeine and Google Instant focused on refreshing the SERPs and making them as close to real time as possible – a goal they arguably achieved in 2010.
Another rather dramatic discovery in 2010 was that “all publicity is good publicity” also applied to Google and search engine rankings. It became clear that a company had received a ranking boost after a storm of poor publicity generated numerous backlinks to its site. When the story broke, Google acted quickly, publishing and confirming an algorithm update to address it.
The message was simple: “being bad to your customers is bad for business.” In 2010 we also saw general social signals become a ranking factor for sites. However, the strength of this signal is unclear; it is not a significant factor, but it is part of the overall picture.
As you can probably tell, 2010 through 2014 was rife with Google confirmed updates. In 2011 we saw: the Attribution Update, Panda, the +1 Button, Schema.org, Google+, Expanded Sitelinks, Pagination Elements, 516 Algorithm Updates (starting in September), Query Encryption, the Freshness Update, and 2 additional 10-Packs of Updates. We won’t discuss them all in great detail; instead, let’s look at the big picture.
2011 started off with a bang! Google called out JCPenney and Overstock.com for their dubious link-building schemes, which included discounts on products in return for links. The end result was a manual penalty from Google, which saw both businesses plummet in the rankings. This set the tone for 2011: if you don’t play by the rules, Google will penalize you!
The early updates focused on cracking down on poor and superficial content: first the Attribution Update, then the much-renowned Panda Update (with its many iterations). The Attribution Update targeted scraped content and affected approximately 2% of searches, while Panda focused on content farms and affected no less than 12% of searches. That is massive. Panda targeted huge sites that pumped out useless, or close to useless, content – often big affiliate sites with thin content, sites with over-optimized content, and sites with a poor ad-to-content ratio. Basically, if a site isn’t focused on user friendliness and providing value, Panda is not going to like it.
In general, the theme of 2011 was the emphasis on fresh SERPs and useful content – cracking down on wholesale, low-value content creation and distribution.
2012 was another interesting year in search. In summary, the Google confirmed updates of 2012 were: Search, plus Your World; Panda x14; Ads Above the Fold; Venice; the Parked Domain Bug; Penguin; the Knowledge Graph; the DMCA Penalty (“Pirate”); the Exact-Match Domain Update; and the Knowledge Graph Expansion.
While Google aggressively pushed personalization in search, and put additional weight behind their Google+ social signals and data, we also saw some radical shifts in what was considered acceptable ad usage on sites – especially top-heavy ad usage, or Ads Above the Fold. In other words, sites with a large amount of ads that load before most other content were down-valued in the SERPs. While February saw an early introduction of updates targeting site speed, spell checking, and keeping SERPs fresh, this was merely a precursor of things to come: namely Penguin.
Penguin deserves special attention in our 2012 section. Its focus was both on over-optimized sites and on sites trying to cheat the system – sites created and managed to appease search engines much more than users. These sites were often guilty of keyword stuffing, black-hat SEO techniques, and other spam tactics (like 50 links in a 150-word paragraph). As with Panda, Penguin demonstrated a priority shift.
It can be seen as a reaffirmation of Google’s initial goals, making information and data useful to all. Both Panda and Penguin target and emphasize that sites need to provide value to their users – and place users first.
Another major development was the implementation of the Knowledge Graph, which can be seen as both a search feature and an algorithm update. It was a massive step toward semantic search, something we discussed in our article on Featured Snippets. This is a significant shift, as it sees Google search focusing more on providing direct answers to the searcher, as opposed to providing links and excerpts from other sites.
2013 saw new iterations of both Panda and Penguin through the year, and they continued to be tweaked and upgraded over the years to come. However, there was other big news in 2013 – in particular, the introduction of Hummingbird. Before we dig into Hummingbird, let’s look at the other Google confirmed updates from 2013.
Domain crowding, payday loans, and in-depth articles all became targets of Google confirmed updates. Let’s start with Payday Loan. The “Payday Loan” update specifically targeted sites with notoriously spammy results. It did not, however, only target payday loan company websites; it was a general-targeting algorithm aimed at a host of industries known for a spammy, low-value approach. Payday loans and porn were singled out in this update – as you can see in Matt Cutts’ Google Webmasters video.
While most of the Google confirmed updates we discuss here carry across into the present, either as the building blocks of something larger or in new iterations, the in-depth article update/feature is less conclusive.
In-depth articles were visually distinguished and promoted, as you can see in the image from the Google blog. These days, however, they are extremely unlikely to show up in a similar manner. We’ve previously mentioned long form content; it should not be confused with in-depth articles, which are specially calculated and weighed. These days they are also less prominently displayed, yet remain differentiated. Check out Go Fish Digital and Bill Slawski to read more on what happened to in-depth articles.
Domain crowding, or domain clustering, refers to the number of times one domain appears for a single search in the SERPs (search engine results pages). Previously, you could end up seeing results from a single domain across pages 1 through 4. If you don’t find the right answer on page 1, where all the results come from one domain, further pages of results from that same domain are unlikely to have the answer either. To solve this problem, an algorithm tweak limited the number of times a specific domain could show in the SERPs for a given search.
The biggest update of 2013 was Hummingbird. The difference with Hummingbird as compared to all previous Google confirmed updates is that Hummingbird was a complete rewrite of the core algorithm itself. Search Engine Land used an incredibly useful metaphor for this – and we’ll simply provide you with this quote:
“Think of a car built in the 1950s. It might have a great engine, but it might also be an engine that lacks things like fuel injection or be unable to use unleaded fuel. When Google switched to Hummingbird, it’s as if it dropped the old engine out of a car and put in a new one. It also did this so quickly that no one really noticed the switch.”
– Danny Sullivan, Search Engine Land
This really is the simplest way to think about it. Such a huge algorithm rewrite was unheard of before Hummingbird. It offered far greater precision and speed, making the name Hummingbird rather fitting! It enabled an overall improved search experience for the user, and was more capable of “conversational search” than before. What this means for us is that a greater emphasis is now placed on long-tail keywords and phrases as opposed to singular words or items.
2014 saw a good few updates that remain topical today. Page Layout targeted sites that are top-heavy with ads, much as it did back in 2012. Payday Loan was reiterated twice: once targeting specific sites, and again with a follow-up update targeting spammy queries. 2014 also saw the removal of Authorship photos, and then Authorship entirely, from the SERPs. This came as a bit of a surprise, considering how heavily these had been promoted as Google+ features. The three big Ps (Panda, Pigeon, and Penguin) also got new iterations and updates.
Targeting digital media piracy was also yet another aspect that got specific attention in 2014, as major sites took serious hits in SERPs. However, there was more going on in 2014 than punishing the bad. We also saw that HTTPS/SSL sites were rewarded for their added security, as it now became a direct ranking factor. Additionally, the “in the news” search feature improved dramatically, giving a series of major news sites a significant boost in organic traffic.
In other words, 2014 saw a general strengthening of security, and active reinforcement of values like reliability, safety, and user friendliness overall.
While we won’t delve into Panda for our 2015 roundup, there were some rather significant Google confirmed updates in 2015: mainly the mobile-friendly update and RankBrain. There was also a general Quality Update – a core algorithm update focused on identifying and ranking pages according to quality. Not big news really, as this trend had been evident for some time by this point, but still a confirmed update.
The mobile update was a big one, not for its immediate effect, but for what it represented in terms of search habits. It allowed for an increased prevalence of “on-the-go” search and was in keeping with the increased focus on local search. The rare, pre-announced, Google confirmed update for mobile set the stage for what we saw a couple of years later – that mobile search is imperative for ranking in search engines. While the immediate effect was minimal, it struck fear in the webmaster environment and for online businesses – and Moz dubbed it “mobilegeddon.”
RankBrain is probably the biggest of the 2015 Google confirmed updates. RankBrain is a machine learning algorithm, or AI (the terms are seemingly used interchangeably), that predominantly helped Google improve and deliver better results. There are billions of searches conducted on Google each day, and approximately 15% of them are new and unique. RankBrain is specifically designed to deliver better search results for these unique or hyper-specific queries. According to Google, RankBrain is the third biggest ranking factor. That said, what exactly that means is difficult to deduce with certainty.
While links and content are the other two main ranking factors (as confirmed by Barry Schwartz and Search Engine Journal), RankBrain remains largely elusive. RankBrain’s machine learning test algorithm makes changes according to user behavior with old-versus-new search results in an offline environment. From this offline, historic query-testing process, and the general trends of the previous 15 years, we can surmise that RankBrain as a ranking factor relates to how easy your content is to understand, and to its quality in relation to answering queries. In other words, RankBrain will work with constantly increasing relevance. What this means for you is that bounce rates carry some serious weight. Content, links, and RankBrain make up quality, authority, and relevance, if you will.
2016 & 2017
Fewer but bigger waves in Google confirmed updates
With Hummingbird now running quickly and accurately in the background, and RankBrain gradually improving various search results, 2016 saw fewer shake-ups than previously.
That said, one of the Google confirmed updates was the AdWords shake-up. While it might seem harmless to remove the ads from the side of the SERPs and add another block on top of the organic results, it was a big move: it reduced the number of organic first-page results and further promoted websites that use AdWords. While Google is excellent at finding and displaying good results for search queries, they do run on ad revenue. After all, 84% of the revenue Alphabet, Google’s parent company, made in 2017 came from Google ads.
Another big update in 2016 was the further iteration of mobile-friendly boosts. This continued the trend of giving incentives to websites that are optimized for mobile search and indexing.
While we haven’t spent a lot of time on the multiple iterations of Penguin, Panda, or Pigeon, we will make an exception for 2016’s Penguin 4.0. This iteration was slightly different from the previous Google confirmed Penguin updates: it was less harsh than its predecessors, in that it devalued bad links rather than penalizing the sites directly. In other words, rather than being harmed by bad links, you would simply not receive any benefit from them.
At this point, Google also took the step of making Penguin part of the core algorithm with improvements to come, rather than big updates. Another impressive element of Penguin 4.0 is that it is both real time and page specific. This means that it won’t devalue your entire site if you have poor elements on a page, but rather devalues that page specifically.
2017 saw some really positive movements in terms of what is acceptable website behavior. A particular favorite of ours is the intrusive interstitial penalty. In other words, if a site is overloaded with pop-up ads, or sign-ups, or anything else that is intrusive and obstructs the regular website visitor’s experience of your content, it will be penalized.
While it is intended more directly towards mobile users, as part of Google’s overall push towards Mobile First, it is a move we are generally in favor of. Sites that have multiple layers of interstitials and pop-ups are not catering to their audience. Whether over-optimizing for lead conversion, or ad revenue, etc. – the end result is a poor user experience.
There was also an update in early February, 2017. It’s unnamed, yet it remains a confirmed update. According to some data, this seems to be a combination update and rollback of previously harsher regulations on rankings. One thing is for sure, it wreaked havoc across rankings for a multitude of sites, and was the icing on the cake after an already tumultuous 6 months.
Two other Google confirmed updates were Google Jobs and an expansion of snippet length. We won’t use a lot of space on these two, as their impact on search in general is small. Google Jobs is quite simply the incorporation of available jobs in the U.S., made accessible to search through Google. The expansion in snippet length was heralded, yet it has been highly unstable: while snippets expanded in 2017 to around 300 characters, from about 150, they have since returned (more or less) to 150.
The other confirmed Google update of 2017 that set a precedent is that Google’s web browser, Chrome, started showing warnings on sites that had not implemented HTTPS for their domain. This is a big one. We discussed it in our article on e-commerce and website security, and when discussing the top 5 biggest SEO tips and errors for websites in general. This warning sets a new bar for acceptable website conduct, and while it doesn’t actively harm rankings, there’s a lot to it.
Remember earlier when we discussed RankBrain’s impact as one of the three biggest ranking signals? Well, without implementing HTTPS, your visitors now get warnings that your site is unsafe. The result is that visitors will more likely than not leave your site quickly; if that trend continues, your bounce rate goes up. Consequently, RankBrain will conclude that your site is not the answer to these searchers’ questions. It’s not pretty, but it’s straightforward: implement better security, and your site won’t have issues with the HTTPS warning.
If you’re going to track 17 years of change with any organization, you’re bound to find some sort of common theme along the way. With Google, the question has rarely been “why”, but is more often “how”. How to update your site accordingly. How to become more valuable online. How to increase relevance. How to provide reliability. Et cetera.
As you can tell from our very summarized take on the updates covering 2000 to 2017/18 – there’s been quite a lot going on! However, if you’re a non-SEO professional, you should be aware that this barely scratches the surface. In this article we’ve given you summaries of years of updates and changes. And what we have covered are solely Google confirmed updates. This leaves a practical ocean of updates and changes that aren’t confirmed.
As we’ve stated, Google (while cooperative, and wanting most people to keep up with the new rules) doesn’t give away all its trade secrets. That would jeopardize the value of its AdWords products: if Google broadcast exactly how to rank perfectly in its search engine, there would be a radical drop in demand for AdWords – which lets you jump the queue. So the question becomes: should you start studying SEO, seek consultants for SEO insights, or hire an expert company to execute your SEO campaigns and activities?
Moreover – what’s next?
Want to know how to best set up your business website for success?
There is a myriad of ways to approach SEO and improving your business’ website for ranking organically in SERPs. Finding the best road to Rome, however, depends on your circumstances to a great degree.
There are general SEO tips and advice for all sites to follow that take Google confirmed updates into account. After all, Google confirmed them and gave advice on how to react. However, the unconfirmed updates to search algorithms are a massive chapter in and of themselves.
If you’re curious about SEO and want to know more about how to get your site set up for ranking success, please feel free to reach out using our contact form below. Or, if you are an SEO service provider looking to partner with another company and outsource some heavy lifting, feel free to give us a call or use our contact page for more information.
We look forward to helping you reach your SEO goals!