Whether you’re an animal lover or not, it’s hard not to appreciate the blatant adorableness of pandas, penguins, and hummingbirds. Except, of course, when they have to do with Google’s changing algorithms. Then it’s like, “How can something so cute be such a nuisance?”
Google algorithms are, to say the least, complicated. Unfortunately, they will continue to grow more complex as Google strives to:
- Provide searchers with the information they need
- Work on ways to stop people from cheating
- Make search engine results as relevant to users as possible
Drawing on information from Moz’s Google Algorithm Cheat Sheet, we learned that, in the past, the Google algorithm changed very infrequently. If your site was ranking #1 for a certain keyword, for example, it would typically stay in that spot for weeks or months until the next update occurred.
The launch of 2010’s “Caffeine” changed all this, and since then, search engine results have changed several times a day instead of every few weeks. In fact, Google makes about 600 algorithm changes a year, many of which go unannounced. When Google makes a real whopper of a change, though, they usually give it a name, along with making announcements that turn the SEO world on its head as marketers and webmasters frantically try to figure out the new changes and how to use them to their best advantage.
And here’s where those cute animals we mentioned earlier come into play: The Panda, Penguin, and Hummingbird algorithms are some of the biggest changes Google has made in the past six years or so. But what are they and how do they affect websites?
The Panda Algorithm
Panda, originally released in 2011, endeavored to show high-quality sites higher in the search results, pushing lower-quality sites down in position. This algorithm change, unnamed when it first came out, was often referred to as the “Farmer” update because of its devastating effects on content farms.
Panda is a sort of learning AI that analyzed the input of a crowd-sourced group of human search raters. The AI determined what a good page looked like, what a bad page looked like, and what factors influenced that judgment. Google took these results and created the Panda update, which demolished entire industries of low-quality content mills and scraper sites.
Source: SEO Blog
The Panda algorithm, concerned with on-site quality, is a site-wide issue, meaning that Google doesn’t just demote certain pages of your site in the search engine results, but considers the entire site to be of lower quality.
The blog post by Google employee Amit Singhal includes a checklist you can use to determine whether your site is high quality. It’s a pretty lengthy list, so we didn’t include it here, but you can find it at the Google Webmaster Central Blog.
All the items on the list can be indicators of how users might rate the quality of your site. While it’s hard to say for certain all of the factors Google uses in determining the quality of your site—according to Panda criteria—the focus is ultimately on creating the best site possible for users, as well as making sure you provide Google with the highest level of content for indexing.
Things to Avoid:
- Thin Content – Provides little or no value to readers
- Duplicate content – Whether it’s copied from other sources or duplicated from your own site
- Low-quality content – Poor content with information that no one is engaging with
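If duplicate content exists on your site for legitimate reasons (print versions, tracking parameters, syndicated posts), one widely documented remedy is the rel=canonical link element, which tells Google which URL is the preferred version to index. A minimal sketch follows; the URLs are placeholders, not real pages:

```html
<!-- Placed inside the <head> of a duplicate or near-duplicate page.
     The href points at the version you want Google to treat as canonical.
     Both URLs below are hypothetical examples. -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```

This doesn’t remove the duplicate page, but it consolidates ranking signals onto the canonical URL instead of splitting them across copies.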
The Panda algorithm is typically “refreshed” about every month, with announcements made only when there’s a really big change. Each time Panda refreshes, Google reviews websites and determines whether they are quality sites with regard to the outlined criteria.
Has your site been adversely affected by Panda? If you’ve since removed thin, duplicate, or low quality content, when Panda refreshes, you should see an improvement. Sometimes, however, it might take a couple of refreshes to see the entire result of your efforts since it can take several months for Google to revisit all of your pages and register your changes.
Every so often, Google does a Panda update (like the 2014 Panda 4.0) instead of a refresh. This generally indicates that Google changed the criteria it uses to establish what is and is not high-quality content. These updates can bring significant changes and dramatic recoveries, so if you work hard to improve your website’s content, you should see better results.
The Penguin Algorithm
Penguin came out in 2012 and was established to reduce trust in sites that used unnatural backlinks to cheat their way to better Google results. While there can be other factors that affect sites in the eyes of Penguin, the algorithm’s main focus is primarily on unnatural backlinks.
The Importance of Good Links
When a respected site links to yours, it’s like getting a vote or recommendation for your site. Conversely, links from small or unknown sites won’t count as much as a link from an authoritative site. That said, getting a large number of small links can nevertheless make a difference. So, in the past, SEOs tried to obtain as many links as possible from any source available.
Because obtaining large quantities of links from low quality sites was relatively effective, SEOs created links from places such as directory listings and self made articles, as well as links from comments and forum posts. It was these types of low-quality, self-made links that Penguin was trying to detect. You might look at the Penguin algorithm as a sort of tool that Google uses to place a “trust factor” on your links.
Penguin was established as a site-wide algorithm, meaning if it determines that most of the links to your site aren’t trustworthy, Google’s trust in your entire site is reduced, leading to a dip in rankings.
If your site has been dinged by Penguin, here’s what you need to know:
“While the initial impact was the largest, both Panda and Penguin are in fact ongoing initiatives. Both updates receive occasional upgrades, changes that continue to affect webmasters to this day.”
Source: SEO Blog
Sites get reevaluated when the algorithm re-runs. Start now to identify any unnatural links to your site and remove them. If it’s not possible to remove them, you can request that Google not count them with the help of the disavow tool. For more information on Google’s disavow tool, see SEO Blog.
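The disavow tool accepts a plain-text file listing the links you want Google to ignore. A minimal sketch of the documented format, with placeholder domains and URLs standing in for real spammy sources:

```text
# Disavow file for example.com, uploaded via Google's disavow tool.
# Lines beginning with # are comments and are ignored.

# Disavow every link from an entire domain:
domain:spammy-directory.example.net

# Or disavow links from a single page:
https://low-quality-forum.example.org/thread/123
```

Disavowing at the domain level is usually safer for clearly spammy sources, since individual URLs on those sites tend to multiply.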
When Penguin refreshes or updates again, provided you’ve done a decent job of removing unnatural links from your site, you should regain Google’s trust. Sometimes sites will need to wait patiently through several refreshes to get the “all clear” from Penguin, because it can take six months for disavow files to be processed.
Not certain how to identify which links to your site are unnatural? Moz offers the following resources for you:
- What is an unnatural link – An in-depth look at the Google Quality Guidelines
- The link schemes section of the Google Quality Guidelines
Keep in mind that when sites recover from Penguin, they don’t typically shoot to the top of the rankings because the pre-Penguin high rankings were based on links that have been judged as unnatural. See Moz for information on what to expect when you have recovered from a link based penalty or algorithmic issue.
The Hummingbird Algorithm
Hummingbird, initially released in 2013, is very different from Penguin or Panda, in that it was a “complete overhaul of the entire Google algorithm”.
“Think of a car built in the 1950s. It might have a great engine, but it might also be an engine that lacks things like fuel injection or be unable to use unleaded fuel. When Google switched to Hummingbird, it’s as if it dropped the old engine out of a car and put in a new one.”
Source: Search Engine Land
Interestingly, Google told Search Engine Land that the algorithm was dubbed Hummingbird because it is “precise and fast,” in keeping with Google’s goal of producing better answers to user queries.
“Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.
In particular, Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than just particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.”
Source: Search Engine Land
Has your site been negatively affected by Hummingbird? If so, you’ll need to focus on creating high-quality content that provides useful information, while continuing to improve your SEO and increase the number of entry points to your site. Long-tail keywords are also back in style, so don’t forget to throw in a few of those, too.
For more on the Hummingbird, we suggest checking out the very thorough article on Kissmetrics, which includes helpful information on optimizing for this Google algorithm.
What Google’s Algorithm Changes Can Teach You About Your Content
Google algorithm changes like Panda, Penguin, and Hummingbird ultimately aim to encourage you, and other webmasters, to publish the highest level of content possible. After all, “Google’s goal is to deliver answers to people who are searching”.
As Kissmetrics points out, Panda wanted you to provide unique content that “offers a creative and interesting read for traffic and engages the reader to the point that they want to share it”. Hummingbird is now asking you to include content that is useful, too. You can still improve your ranking in the search engines by continually improving your SEO, but content is king!
By producing original and useful content for user queries as frequently as possible, and by avoiding tactics that may cause Google to lose trust in your site, you can keep those pesky algorithm changes from messing with your search rankings (and restore your affection for the adorable animals they’re named after).