We could go back to the very beginning to see all the changes and updates that Google has made, but I want to focus on those that have happened since 2011, which is why I've only briefly noted the updates between 2003 and 2010.
These are the changes that have shaped the way we do SEO today, and I think it is important that you know this history, as it helps you decide which SEO strategies to implement.
- Google Algorithm Changes in 2003 – 2010
- Google Algorithm Changes in 2011
- Google Algorithm Changes in 2012
- Google Algorithm Changes in 2013
- Google Algorithm Changes in 2014
- Google Algorithm Changes in 2015
- Google Algorithm Changes in 2016
- Google Algorithm Changes in 2017
- Google Algorithm Changes in 2018
- Google Algorithm Changes in 2019
- Conclusion on Google Algorithm Changes
Google Algorithm Changes in 2003 – 2010
November 16, 2003
Google’s Florida Update signaled a new era of SEO. Websites (including retailers who relied on affiliates to drive traffic) using spammy tactics of the previous decade (e.g., keyword stuffing, using multiple sites under the same brand, invisible text, and hidden links) to rank for high-commercial keywords saw their rankings wiped out right before the lucrative holiday season.
September 1, 2005
Jagger was an update in three phases (Jagger 1, Jagger 2, and Jagger 3) that began with a number of backlink-focused updates in early September meant to crack down on unnatural link building, paid links, and other types of spam.
The second phase of Jagger had the most noticeable impact in October. The final phase was completed near the end of November.
December 15, 2005
Big Daddy (or Bigdaddy) was a gradual update to Google’s infrastructure that began rolling out in December 2005 and was completed in March 2006.
This update changed how Google handled technical issues such as URL canonicalization and redirects. Some websites didn’t make it into the new Big Daddy data centers, typically due to unnatural linking (e.g., excessive reciprocal linking, linking to spammy neighborhoods, paid links).
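For context, URL canonicalization is how a search engine settles on one definitive URL when several variations (with and without "www", trailing slashes, HTTP vs. HTTPS) all serve the same page. Today, the standard way for a site owner to signal the preferred version is a canonical link tag; this is a minimal sketch, and the URL shown is just a placeholder:

```html
<!-- Placed in the <head> of every duplicate variation of a page,
     this tag points search engines at the one preferred (canonical) URL. -->
<link rel="canonical" href="https://www.example.com/blue-widgets/" />
```

A server-side 301 (permanent) redirect to the preferred URL achieves a similar result and also moves visitors to the canonical address.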
January 18, 2009
Google’s Vince update was a quick, noticeable change that favored big brand domains on the first page for broad, competitive keyword terms, displacing the sites that previously ranked there (typically less authoritative sites, affiliate sites, and sites that had won this coveted visibility purely through SEO efforts).
August 10, 2009
Google’s Caffeine update was a new web indexing system that allowed Google to crawl and store data more efficiently, resulting in 50 percent fresher results. Developers were given early access starting in August 2009 before the update officially rolled out June 8, 2010.
April 28, 2010
The MayDay update was an algorithmic change to how Google assessed which sites were the best match for long-tail queries. This update rolled out between April 28 and May 3.
Google Algorithm Changes in 2011
This was a huge year in SEO terms, shocking many webmasters. In fact, 2011 was the year that wiped out a lot of online businesses. Most deserved to go, but quite a few innocent victims got caught in the carnage, never to recover.
At the beginning of the year, Google hit scraper sites (sites that used bots to steal and post content from other sites). This was all about trying to attribute ownership of content back to the original owner and thus penalize the thieves.
On February 23, the Panda update launched in the USA. Panda (also called “Farmer”) was essentially targeting low-quality content and link farms. Link farms were basically collections of low-quality blogs that were set up to link out to other sites.
The term “thin content” became popular during this time, describing pages that really didn’t say much and were there purely to host adverts. Panda was all about squashing thin content, and a lot of sites took a hit.
In March of the same year, Google introduced the +1 button. This was probably to be expected, bearing in mind that Google had confirmed it used social signals in its ranking algorithm. What better signals to monitor than its own?
In April 2011, Panda 2.0 was unleashed, expanding its reach to all countries of the world, though still only targeting pages in English. Even more signals were included in Panda 2.0, possibly including user feedback via the Google Chrome web browser, where users had the option to “block” pages in the SERPs that they didn’t like. As if these two Panda releases weren’t enough, Google went on to release Panda 2.1, 2.2, 2.3, 2.4, 2.5, and 3.1, all in 2011.
Note that Panda 3.0 is missing. There was an update between 2.5 and 3.1, but it is commonly referred to as the Panda “Flux”. Each new update built on the previous, helping to eliminate still more low-quality content from the SERPs.
With each new release of Panda, webmasters worried, panicked, and complained on forums and social media. A lot of websites were penalized, though not all deserved to be. Google casually called it unavoidable “collateral damage”.
In June 2011, we saw the birth of Google’s first social network project, Google Plus.
Another change that angered webmasters was “query encryption”, introduced in October 2011.
Google said it was doing this for privacy reasons, but webmasters were suspicious of its motives. Prior to this query encryption, whenever someone searched for something on Google, the search term they typed in was passed on to the site they clicked through to.
That meant webmasters could see what search terms visitors were typing to find their pages using any web traffic analysis tool. Query encryption changed all of this.
Anyone who was logged into their Google account at the time they performed a search from Google would have their search query encrypted. This prevented their search terms from being passed over to the websites they visited.
The result of this was that webmasters increasingly had no idea which terms people were using to find their site.
In November 2011, there was a freshness update. This supposedly rewarded websites that provided time-sensitive information (like news sites) whenever visitors searched for time-sensitive news and events.
As you can see, there was a lot going on in 2011, but it didn’t stop there.
Google Algorithm Changes in 2012
Again, 2012 was a massive year for SEOs and webmasters. There was a huge number of prominent changes, starting with one called “Search + Your World” in January. This was an aggressive measure by Google to integrate its Google+ social data and user profiles into the SERPs.
Over the year, Google released more than a dozen Panda updates, all aimed at reducing low-quality pages from appearing in the SERPs.
In January 2012, Google announced a page layout algorithm change. This aimed to penalize pages that positioned too many ads, very little valuable content, or both, above the fold.
The term “above the fold” refers to the visible portion of a web page when a visitor first lands on it. In other words, whatever you can see without the need to scroll down is above the fold. Some SEOs referred to this page layout algorithm change as the “Top Heavy” update.
In February, Google announced another 17 changes to its algorithm, including spell-checking, which is of interest to us. Later in the same month, Google announced another 40 changes.
In March, there were 50 more modifications announced, including one that made changes to anchor text “scoring”. Google certainly wasn’t resting on its laurels.
On April 24, the Penguin update was unleashed. This was widely expected, and webmasters assumed it was going to be an over-optimization penalty. Google initially called it a “Webspam update”, but it was soon named “Penguin”.
This update checked for a wide variety of spam techniques, including keyword stuffing. It also analyzed the anchor text used in external links pointing to websites.
In April, yet another set of updates were announced, 52 this time.
In May, Google started rolling out “Knowledge Graph”. This was a huge step towards semantic search (the technology Google uses to better understand the context of search terms). We also saw Penguin 1.1 during this month and another 39 announced changes. One of these new changes included better link scheme detection. Link scheme detection helped identify websites that had built their own links to gain better rankings.
In July, Google sent out “unnatural link warnings” via Google Search Console, to any site where it had detected a large number of “unnatural” links. To avoid a penalty, Google gave webmasters the opportunity to remove the “unnatural” links. Think of unnatural links as any link the webmaster controls, and ones they probably created themselves or asked others to create for them.
These would include links on blog networks and other low-quality websites. Inbound links such as these typically used a high percentage of specific keyword phrases in their anchor text. Google wanted webmasters to be responsible for the links that pointed to their sites. Webmasters who had created their own sneaky link campaigns were able to do something about it.
However, if other sites were linking to their pages with poor quality links, then Google expected webmasters to contact the site owners and request removal of the bad link(s). If you have ever tried to contact a webmaster to ask for a link to be removed, you’ll know that it can be an impossible task. For many webmasters, this was an impractical undertaking because the unnatural link warnings were often the result of tens or hundreds of thousands of bad links to a single site.
Google eventually back-tracked and said that these unnatural link warnings may not result in a penalty after all. The word on the street was that Google would be releasing a tool to help webmasters clean up their link profiles. When you think about it, Google’s flip-flopping on this policy was understandable and just. After all, if websites were going to get penalized for having too many spammy links pointing to their pages, then that would open the doors of opportunity to criminal types. Dishonest webmasters looking to take down their competition would simply need to point thousands of low-quality links to their pages using automated link-building software.
In July, Google announced a further 86 changes to its algorithm.
In August, the search engine giant started to penalize sites that had repeatedly violated copyright, possibly via The Digital Millennium Copyright Act (DMCA) takedown requests.
For those who might not be familiar with this, the DMCA is a controversial United States digital rights management (DRM) law. It was signed into law on October 28, 1998, by then-President Bill Clinton. The intent behind the DMCA was to update copyright law to deal with the special challenges of regulating digital material.
Moving on to September 2012, another major update occurred, this time called the EMD update. You’ll remember that EMD stands for Exact Match Domain and refers to a domain that exactly matches a keyword phrase the site owner wants to rank for. EMDs had a massive ranking advantage simply because they used the keyword phrase in the domain name. This update removed that advantage overnight.
In October of that year, Google announced that there were 65 algorithm changes in the previous two months. On October 5, there was a major update to Penguin, probably expanding its influence to non-English content.
Also in October, Google announced the Disavow tool. This was Google’s answer to the “unnatural links” problem. It completely shifted the responsibility of unnatural links onto the webmaster by giving them a tool to disavow or deny any responsibility or support for those links. If there were any external links from bad neighborhoods pointing to your site, and you could not get them removed, you could now disavow those links, effectively rendering them harmless.
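The Disavow tool accepts a plain text file listing the links you want Google to ignore, one entry per line. Lines starting with # are comments, and a domain: prefix disavows every link from an entire domain. A minimal sketch (the example.com/example.net addresses are placeholders):

```text
# Links I could not get removed despite contacting the site owners
http://spammy-blog.example.com/post-linking-to-me.html
http://spammy-blog.example.com/another-post.html

# Disavow every link from an entire low-quality domain
domain:low-quality-directory.example.net
```

The file is then uploaded through the Disavow tool in Google Search Console, telling Google to discount those links when assessing the site.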
Finally, in October 2012, Google released an update to its “Page Layout” update. In December, it updated the Knowledge Graph to include non-English queries for the more common languages. This drew an end to the Google updates for that year.
Google Algorithm Changes in 2013
In 2013, Google updated both Panda and Penguin several times. These updates refined the two different technologies to try to increase the quality of pages ranking in the SERPs.
In June, Google released the “Payday Loan” update. This targeted niches with notoriously spammy SERPs. These niches were often highly commercial, offering great rewards for any page that could rank highly. Needless to say, spammers loved niches like these. Google gave the example of “payday loans” when announcing this update, hence its name.
On July 18, a Panda update was thought to have been released to “soften” the effects of a previously released Panda, so Google obviously watched the effects of its updates and modified them accordingly.
August 2013 – Hummingbird – Fast & Accurate?
Hummingbird was the name given to Google’s new search algorithm. It was not part of an existing algorithm or a minor algorithm update, but an entirely brand-spanking new algorithm that was unboxed and moved into place on August 20, 2013 (though it was not announced to the SEO community until September 26).
This was a major change to the way Google sorted through the information in its index. In fact, a change on this scale had probably not occurred for over a decade. Think of it this way.
Panda and Penguin were changing parts of the old algorithm, whereas Hummingbird was a completely new algorithm, although it still used components of the old one.
Google’s algorithms are the mathematical equations used to determine the most relevant pages to return in the search results. They draw on over 200 components, including things like PageRank and incoming links, to name just two.
Apparently, the name Hummingbird was chosen because of how fast and accurate these birds are. Although many webmasters disagreed, Google obviously thought at the time that this reflected its search results: fast and accurate.
Google wanted to introduce a major update to the algorithm because of the evolution in the way people used Google to search for stuff. An example Google gave was in “conversation search”, whereby people could now speak into their mobile phone, tablet or even desktop browser to find information.
To illustrate, let’s say that you were interested in buying a Nexus 7 tablet. The old way of finding it online was to type something like this into the Google search box: “Buy Nexus 7” However, with the introduction of speech recognition, people have become a lot more descriptive in what they are searching for.
Nowadays, it’s just as easy to dictate into your search browser something like: “Where can I buy a Nexus 7 near here?” The old Google could not cope too well with this search phrase, but the new Hummingbird was designed to do just that.
The old Google would look for pages in the index that included some or all the words in the search phrase. A page that included the exact phrase would have the best chance of appearing at the top of Google. If no pages were found with the exact phrase, then Google would look for pages that included the important words from it, e.g. “where” “buy” and “Nexus 7”.
The idea behind Hummingbird was that it should be able to interpret what the searcher was really looking for.
In the example above, they are clearly looking for somewhere near their current location to purchase a Nexus 7. In other words, Hummingbird was supposed to determine searcher “intent” and return pages that best matched that intent, as opposed to best matching keywords in the search phrase.
Hummingbird is still around today and tries to understand exactly what the searcher wants, rather than just considering the words used in the search term.
In December 2013, there was a drop in the authorship and rich snippets displayed in the SERPs. These were features where Google displayed a photo of the author and/or other information next to a listing. Google tightened up its criteria and removed these features from many listings.
Google Algorithm Changes in 2014
In February 2014, Google updated its page layout update. In May the same year, Payday Loan 2.0 was released. This was an update to the original Payday Loan algorithm and was thought to have extended the reach of this algorithm to international queries. Also in May, Panda was updated. It was called Panda 4.0.
Google Algorithm Changes in 2015
April 21, 2015 – The Mobile-Friendly Update
On April 21, Google began rolling out an update designed to boost mobile-friendly web pages in the mobile search results.
To help webmasters prepare for the update, Google provided a web page where webmasters could test their site to see if it was mobile-friendly or not.
Google’s mobile-friendly testing tool is simple to use: you enter your web page URL and wait for the results, which tell you whether or not the page passes the test.
The mobile-friendly update:
- Only affects searches carried out on mobile devices.
- Applies to individual pages, not entire websites.
- Affects ALL languages globally.
This update makes a lot of sense. If someone is searching on a small screen, Google only wants to show web pages that will display properly on such devices.
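One basic prerequisite for a page to count as mobile-friendly is a viewport meta tag, which tells mobile browsers to fit the page to the device’s screen width instead of rendering a zoomed-out desktop layout. A typical snippet looks like this:

```html
<!-- Placed in the <head>; tells the browser to match the device's
     screen width and start at 100% zoom -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without this tag (and a layout that adapts to narrow screens), a page is very unlikely to pass the mobile-friendly test.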
The Quality Update (or the Phantom Update) was a confirmed change to Google’s core ranking algorithm – specifically, how Google assesses quality signals. Websites with content quality issues, as well as too many ads, seemed to be impacted the most by this update.
July 17, 2015
Google announced a Panda refresh that would take months to roll out and impact 2 to 3 percent of English queries. Due to the slow nature of the rollout, it’s unclear how substantial the impact was or precisely when it occurred. It was the final confirmed Panda update.
October 26, 2015
Though it had been in testing since April 2015, Google officially introduced RankBrain on this date. RankBrain is a machine learning algorithm that filters search results to help give users the best answer to their query.
Google Algorithm Changes in 2016
Google confirmed that Panda had been incorporated into the core Google algorithm, evidently as part of the slow Panda 4.2 rollout. In other words, Panda was no longer a filter applied after the main algorithm had done its work, but was incorporated as another of its core ranking signals. Google clarified, however, that this doesn’t mean the Panda classifier acts in real time.
On the 23rd of February, Google made some big changes to the SERPs, removing the right-hand column of adverts and placing a block of 4 adverts at the top of the SERPs. For any given search term, organic results were now pushed down the page. Above the fold, most searchers only saw the paid advertising links.
On May 12th, Google rolled out a second mobile-friendly update that essentially reinforced the first and made mobile sites perform better on mobile search platforms.
On the 1st of September, another animal joined Google’s ranks. The “Possum” update was thought to target local search rankings, increasing the diversity of the local results, but also preventing spam from ranking. Local businesses that had previously found it difficult to rank for a city’s results, simply because they were just outside the limits of the city, now found it easier.
On 23rd September, Google announced Penguin 4.0. This was a long-awaited (and anticipated) update by the SEO community. Penguin 4.0 is real-time and a core part of the algorithm.
That means any pages caught by Penguin can be fixed, and those penalties reversed, as soon as the page is re-indexed and reassessed by Penguin. With previous iterations of Penguin, webmasters had to wait months (or even years) to see if their SEO fixes reversed a Penguin penalty.
Google Algorithm Changes in 2017
In January, Google started to roll out an update that would impact pages that had intrusive popups or other “interstitials” ruining the mobile experience. Essentially, anything that covered the main content on mobile devices and required attention (e.g. a click) to dismiss it, was targeted.
March 2017 – Fred
Google’s Gary Illyes jokingly referred to this update as “Fred” and the name ended up sticking. But this algorithm was no laughing matter for those impacted. This major algorithm update seemed to mainly target low-value content. On March 24, Illyes officially confirmed the update, but Google refused to share any more specifics, instead saying that all the answers about Fred can be found in Google’s Webmaster Quality Guidelines.
In April, it appeared that HTTPS websites were being favored over insecure HTTP sites. In October, Google introduced Google Chrome warnings for insecure websites. That wasn’t an algorithm update, but I felt it was important to mention, bearing in mind what I wrote in the previous paragraph. We also saw a reduction in the number of featured snippets in the search results, along with an increase in knowledge panels.
December 12, 2017
Some in the search community reported their websites being hit by an update between December 12 and 14. Google confirmed several minor changes to the core algorithm during that timeframe but downplayed the significance of the period of flux.
Google Algorithm Changes in 2018
In March 2018, Google rolled out the Mobile-First index. This change meant that instead of indexing desktop versions of pages for the search results, Google started using the mobile versions of the web pages.
Why? Because of problems searchers had on mobile devices when the desktop and mobile versions of a page were vastly different.
In July 2018 Google started showing all non-HTTPS sites as “not secure” in the Chrome browser.
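Most sites responded to the “not secure” warning by permanently redirecting all HTTP traffic to HTTPS. As a rough sketch of what that looks like with an nginx server (the domain names are placeholders, and the separate HTTPS server block with its certificate configuration is omitted):

```nginx
# Redirect every plain-HTTP request to the HTTPS version of the same URL
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```

The 301 status code tells browsers and search engines alike that the move is permanent, so the HTTPS URLs become the canonical versions over time.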
In August, Google rolled out a big core update which has been nicknamed the “Medic” update. This update affected a lot of health-related sites (hence the name). There have been suggestions that this update targeted sites that made money by recommending products that could be “dangerous” to health or livelihood.
August 1, 2018
Google confirmed via Twitter, for the third time that year, the rollout of a broad core algorithm update. In doing so, Google’s Search Liaison Danny Sullivan recommended following the guidance it provided after the March 9, 2018 update. This update has been referred to as “Medic” by some in the industry, even though Google said it was a general ranking update and wasn’t specifically targeting medical sites.
September 27, 2018
On September 27 (Google’s 20th birthday), many within the SEO community began noticing significant spikes and drops in traffic, indicating some sort of update was underway. Some of the sites impacted by the August broad core algorithm update reportedly made a recovery. Google’s Search Liaison Danny Sullivan confirmed via Twitter on September 29 that some sort of “smaller” update had taken place (though it wasn’t a broad core algorithm update).
October 31, 2018
Some webmasters reported changes starting around Halloween, perhaps indicating an (unconfirmed) Google update. But there was little evidence of a significant update here. The more likely cause of the chatter was spillover from the August broad core algorithm update and Google ramping up its use of neural matching.
Google Algorithm Changes in 2019
February 13, 2019
Algorithm trackers and industry chatter indicated some sort of unconfirmed update took place on and before this date. However, unlike other updates, mostly positive changes in rankings were being reported.
March 12, 2019
Google’s Search Liaison Danny Sullivan confirmed via Twitter the release of a global broad core algorithm update. This update is particularly important and one of the biggest Google updates in years.
Conclusion on Google Algorithm Changes
As you can see, Google has been very active in trying to combat the spam thrown at it. The two major updates that changed everything were Panda and Penguin. Together, these two technologies weed out low-quality pages, and pages that have been engineered to rank highly in the search engines.
Anyone who builds a website will want it to rank well in Google. Without having a high-profile presence in the SERPs, a site won’t get much traffic, if any at all. If that happens, webmasters WILL try to boost their rankings, and the traditional way is by working on the “on-page SEO” and inbound links. This is where webmasters and Google collide. Google wants the best pages to rank at the top of the SERPs for obvious reasons.
So, Google rewards the pages that deserve to be at the top, rather than pages that webmasters force to the top using SEO (much of which Google collectively calls “Webspam”). What this means to you is that you must deliver the absolute best quality content you can. You need to create content that deserves to be at the top of the SERPs and is likely to attract links from high-quality sites in your niche.
Content is not only King now; it has always been King.
The difference now is that Google’s algorithms are so much better at identifying great content. It’s no longer as easy to take shortcuts and trick the algorithms with underhanded tactics as it once was.
Fortunately for you, Google offers a lot of advice on how to create the type of content it wants to show up in the SERPs. In fact, it created “Webmaster Guidelines”. These web pages tell you exactly what Google wants, and just as importantly, what it doesn’t want. We’ll look at these shortly, but first a question.