A History of Major Google Algorithm Updates from 2000–Present. Each year, Google makes hundreds of changes to search. In 2018, they reported an incredible 3,234 updates — an average of almost 9 per day, and more than 8 times the number of updates in 2009.
Google’s algorithms are a complex system used to retrieve data from its search index and instantly deliver the best possible results for a query. The search engine uses a combination of algorithms and numerous ranking signals to deliver webpages ranked by relevance on its search engine results pages (SERPs).
Google announced the BERT Update, calling it the biggest change to search in the past five years. Google uses BERT models to better understand search queries. Google said this change impacted both search rankings and featured snippets, and that BERT (which stands for Bidirectional Encoder Representations from Transformers) would be used on 10 percent of U.S. English searches.
SEO is short for “Search Engine Optimization”, which is the on-going process of improving the visibility and ranking of a website or web page in the organic search engine results presented in Google and other search engines. A complex algorithm determines how search engines list the results for any given search term or phrase.
The last Google PageRank update was on 4 February 2013. Google PageRank scores a page on the basis of its backlinks and how authoritative those linking pages are, and a high PageRank can boost a site’s ranking and reputation.
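The backlink-counting idea behind PageRank can be illustrated with a minimal sketch. The link graph, iteration count, and damping factor d = 0.85 below are illustrative assumptions; Google’s production implementation is far more elaborate.

```python
# A minimal, illustrative PageRank sketch (hypothetical link graph;
# damping factor d = 0.85 is the value commonly cited in the literature).
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Sum the rank passed on by every page that links to p;
            # each page divides its rank evenly among its outgoing links.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new_rank[p] = (1 - d) / n + d * incoming
        rank = new_rank
    return rank

# Toy link graph: A and C both link to B, so B ends up with the highest rank.
graph = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # B
```

Note how the page with the most incoming links from other ranked pages accumulates the highest score, which is the intuition behind the tips below.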
A Few Tips to Increase Your PageRank
Guest posting
Article marketing or Directory Submission
Blog commenting
Forum Posting
Submit Feed to Feed directories
Submit blog to Social bookmarking sites
Proper SEO structure
Fred
The latest of Google’s confirmed updates, Fred targets websites that violate Google’s webmaster guidelines. The majority of affected sites are blogs with low-quality posts that appear to be created mostly for the purpose of generating ad revenue.
Google Fred is an algorithm update that targets black-hat tactics tied to aggressive monetization. This includes an overload on ads, low-value content, and little added user benefits.
Possum
The Possum update ensured that local results vary more depending on the searcher’s location: the closer you are to a business’s address, the more likely you are to see it among local results. Possum also resulted in greater variety among results ranking for very similar queries, like “dentist denver” and “dentist denver co.” Interestingly, Possum also gave a boost to businesses located outside the physical city area.
Rankbrain
RankBrain is a machine learning algorithm that works in conjunction with the Hummingbird Update to give better search results for user queries.
RankBrain is part of Google’s Hummingbird algorithm. It is a machine learning system that helps Google understand the meaning behind queries and serve the best-matching search results in response to those queries. Google calls RankBrain the third most important ranking factor. While we don’t know the ins and outs of RankBrain, the general opinion is that it identifies relevance features for web pages ranking for a given query, which are basically query-specific ranking factors.
RankBrain is a component of Google’s core algorithm which uses machine learning (the ability of machines to teach themselves from data inputs) to determine the most relevant results to search engine queries. Pre-RankBrain, Google utilized its basic algorithm to determine which results to show for a given query.
Mobile
Google’s Mobile Update (aka Mobilegeddon) ensures that mobile-friendly pages rank at the top of mobile search, while pages not optimized for mobile are filtered out from the SERPs or seriously down-ranked.
The update is solely focused on providing higher positions within the search rankings for websites that are optimised to run efficiently on a mobile device (Android and iOS).
The thinking behind this is simple: because more and more online traffic comes from mobile devices, Google needs to rank (and penalise) websites according to their effectiveness on that platform. And what better way to encourage websites to improve their mobile performance than by favouring sites that work well on mobile and penalising those with compatibility issues?
Pigeon
Pigeon affects those searches in which the user’s location plays an important part. The update created closer ties between the local algorithm and the core algorithm: traditional SEO factors are now used to rank local results.
Google Pigeon is the code name given to one of Google’s local search algorithm updates. This update was released on July 24, 2014. The update aimed to improve the ranking of local listings in search.
Launched on July 24, 2014, for U.S. English results, the “Pigeon Update” is a new algorithm to provide more useful, relevant and accurate local search results that are tied more closely to traditional web search ranking signals. Google stated that this new algorithm improves its distance and location ranking parameters.
The major aim of Google Pigeon is to boost local listings in search, which is especially beneficial for local business owners: it helps them raise awareness of their products and services among nearby searchers.
Hummingbird
Hummingbird helps Google better interpret search queries and provide results that match searcher intent (as opposed to the individual terms within the query). While keywords continue to be important, Hummingbird makes it possible for a page to rank for a query even if it doesn’t contain the exact words the searcher entered. This is achieved with the help of natural language processing that relies on latent semantic indexing, co-occurring terms and synonyms.
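The idea that a page can rank for a query without containing the exact query words can be sketched with a toy scorer. The synonym table and weights below are invented purely for illustration and bear no resemblance to Google’s actual natural language models.

```python
# Toy illustration of meaning-based matching: a page can score for a query
# even when it does not contain the exact query words, because the scorer
# also counts known synonyms. (Synonym table and weights are invented.)
SYNONYMS = {
    "car": {"automobile", "vehicle"},
    "buy": {"purchase"},
}

def relevance(query, page_text):
    page_words = set(page_text.lower().split())
    score = 0.0
    for word in query.lower().split():
        if word in page_words:
            score += 1.0                    # exact keyword match
        elif SYNONYMS.get(word, set()) & page_words:
            score += 0.5                    # synonym match
    return score

# "buy car" still matches a page that only says "purchase an automobile".
print(relevance("buy car", "purchase an automobile today"))  # 1.0
```

A real system would of course learn these associations from data rather than use a hand-written table, but the sketch shows why exact-match keywords stopped being strictly necessary after Hummingbird.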
The “Hummingbird” update was the first major update to Google’s search algorithm since the 2010 “Caffeine” search architecture upgrade, but even that was limited primarily to improving the indexing of information rather than sorting through information.
What Were the Goals of the Google Hummingbird Update? The Hummingbird update – which was announced September 26, 2013, but had actually rolled out the prior month – was the next evolutionary step after 2010’s Caffeine update and other significant changes that influenced how users respond and engage with search results.
Google page layout algorithm
The Google Page Layout algorithm was an effort by Google to reward websites that promote content and penalize websites that place advertisements in the first fold of their pages. Google first announced the Page Layout algorithm in January 2012, developed in response to complaints from searchers.
The arrival of a high-quality user experience and more sophisticated on-page SEO might have felt premature in the wake of the 2011 Panda updates, but Google made it official on January 19, 2012: the page layout algorithm was here. The page layout algorithm update targeted websites with too many static advertisements above the fold. These ads would force users to scroll down the page to see content.
Penguin
Google Penguin’s objective is to down-rank sites whose links it deems manipulative. Since late 2016, Penguin has been part of Google’s core algorithm; unlike Panda, it works in real time.
Google Penguin is a codename for a Google algorithm update that was first announced on April 24, 2012. The update was aimed at decreasing the search engine rankings of websites that violate Google’s Webmaster Guidelines by using black-hat SEO techniques to artificially inflate a page’s ranking through manipulation of the links pointing to it.
Panda update
Panda assigns a so-called “quality score” to web pages; this score is then used as a ranking factor. Initially, Panda was a filter rather than part of Google’s ranking algorithm, but in January 2016, it was officially incorporated into the core algorithm. Panda rollouts have become more frequent, so both penalties and recoveries now happen faster.
Google Panda is a change to Google’s search results ranking algorithm that was first released in February 2011. The change aimed to lower the rank of “low-quality sites” or “thin sites”, in particular “content farms”, and return higher-quality sites near the top of the search results.
Thus, the basic difference between Google Panda and Google Penguin is that Panda targets sites with low-quality or spammy content, while Penguin deals with manipulative linking and keyword stuffing. With Panda, the only way to recover is to update the website and do away with all the useless, low-quality pages.
The Panda algorithm is the official name of Google’s algorithm update. This update was developed to reduce the widespread presence of low quality, diminished content in search results, while elevating and rewarding unique, engaging content.
The Panda update was initially called the Farmer Update because its main target was content-farm websites, which simply gathered user-submitted content in exchange for backlinks to the user’s website. Most of that content was thin, meaning low in quality and providing little or no information to the reader. The Panda update also targeted other thin-content websites, such as blogs with low-informative content. An important thing to know about Panda is that it is a site-wide penalty: if Googlebot considers even one or two pages from a website to be low quality, the whole site will suffer a loss in the SERPs.
Panda update advantage
Which websites were placed at an advantage by the update? Even though the Panda update was supposed to be a leveling update, websites with high-quality, in-depth content gained the chance to rise in search engine results. This is another step by Google toward making the quality of content outweigh its quantity. However, soon after the first release of the update, many sites suffered from scrapers: websites that copy original content from other sources and then rank above the sites where the content was first posted. Google resolved a large part of this problem in subsequent Panda algorithm updates.