
Top 10 Google Algorithms

Updated: Sep 1, 2023

In previous articles and blogs, we discussed the different metrics a website holds and how they impact rankings in one way or another. We covered the importance of metrics like Page Authority and Domain Authority, but we never discussed how search engines like Google rank a website. Google uses several powerful algorithms to rate and rank a webpage.






We will highlight a few of these that matter most when it comes to ranking a webpage higher in SERPs (Search Engine Result Pages).


Panda

This was initially launched as a filter in 2011 and later became part of Google's core algorithm. Panda focuses mainly on content quality to keep spammy content from ranking high in the search results. It checks for low-quality content: duplication, plagiarism, thin content, keyword stuffing, and spam. If your page has any or all of these, expect a penalty from the algorithm. Conversely, good user experience, relevancy, and solid website design can lead to higher rankings under this algorithm.


Penguin

This was initially launched as a filter in 2012 and later became part of the core algorithm in 2016. Its main function is to identify websites with poor backlink profiles or unnatural backlinks (for a better understanding, read our article on backlinks and their types). Prior to the launch of this filter, website owners linked their pages to other sites merely for ranking purposes without adding value to the content. This led to spammy links and unnatural backlink profiles, and to curb the practice Google introduced Penguin.


Hummingbird

Initially launched in 2013, it led to a complete overhaul of the core algorithm. This was a real-time algorithm that checked for keyword stuffing in the content. Prior to its launch, content writers stuffed high-ranking keywords or LSI keywords without context, producing low-quality content that did not match the user's search intent. That meant poor search results, and this algorithm was introduced to present content relevant to the user's search query. It is also responsible for the Knowledge Graph display on the SERP and plays a vital role in Local Search, Voice Search, and Autocomplete.


RankBrain

This is the father of all Google algorithms as of now. Initially launched in 2015, it is backed by AI and is known to improve itself with every search. It keeps track of how users interact with the results and distinguishes the intent of the content in relation to the search query. It depends on two main factors: CTR (Click-Through Rate) and dwell time. CTR is the percentage of impressions that turn into clicks, while dwell time is the time a user stays on the page. If the title and description of the page are relevant and trigger user interest, the CTR increases; if the content is good and has enough material to keep the user on the page longer, the dwell time increases. Both in turn affect rankings on the search results page. The algorithm also considers returning users.
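
To make the two signals concrete, here is a minimal Python sketch of how CTR and average dwell time can be computed from raw counts. The function names and numbers are illustrative assumptions; Google does not publish how RankBrain actually weighs these signals.

```python
# Illustrative only: Google does not publish how RankBrain weighs these signals.

def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR is the percentage of impressions that turn into clicks."""
    if impressions == 0:
        return 0.0
    return 100.0 * clicks / impressions

def average_dwell_time(session_durations_seconds: list[float]) -> float:
    """Dwell time is how long a user stays on the page after clicking a result."""
    if not session_durations_seconds:
        return 0.0
    return sum(session_durations_seconds) / len(session_durations_seconds)

# Example: 120 clicks from 2,400 impressions -> 5% CTR
print(click_through_rate(120, 2400))        # 5.0
print(average_dwell_time([30, 95, 240]))    # ~121.7 seconds
```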


E.A.T

Initially introduced as part of the Search Quality Guidelines in 2014, it has been updated since. E.A.T stands for Expertise, Authoritativeness, and Trustworthiness. It is not a direct ranking factor, but it has a direct impact on rankings, which is why a webpage has to present itself as high-quality, relevant content. Easier said than done: a webpage needs to come from an expert source, link to authoritative resources, and prove itself trustworthy. This also means you need up-to-date contact details on the About/Author page, expert content backed by facts and figures from trusted sources, credible reviews of your content from your users, and, on top of it all, a brand presence that lends authority and reputation. All in all, off-page SEO techniques will come in handy in scoring high on E.A.T. This algorithm has the highest effect on YMYL (Your Money or Your Life) websites, meaning sites and pages dealing with finance, news, medicine, and anything else that has a high impact on people's lives.


Mobilegeddon

Initially launched in 2015, it was Google's first mobile-focused update. With more users moving toward mobile devices, it was necessary to render relevant results that met mobile user-experience criteria. It was a strategic move toward a mobile-first approach, backed by the introduction of AMP (Accelerated Mobile Pages). This update impacted mobile search results, ranking websites with a better mobile user experience higher.


Possum

Launched in 2016, it was an update related to Google My Business (GMB) results and impacted local search results. After the launch, businesses outside city limits, businesses located at the same address, and businesses owned by the same parent company were affected. For the second and third scenarios, the Possum update almost always filters out duplicate listings. This does not mean other listings for the same address never show up; one may have to tweak their GMB page to make it into the local pack listing. Also, searches for businesses just outside city limits previously failed to surface them, which was corrected by this update, pushing relevant results to a higher rank.


FRED

Launched in 2017, this algorithm was meant to address thin content on web pages that were more ad-centric than relevant. Pages with aggressive monetization were penalized. Aggressive monetization involved all sorts of Black Hat SEO techniques, including deceptive ads, low-quality links, irrelevant content, content covering every topic under the sun, a poor mobile experience, and others. The intent was to provide high-quality, relevant content to the user.


Pigeon

This is another algorithm related to local search results, launched in 2014. As part of the Pigeon update, the listing on local search results tied to a Google Maps location shows at most three entries (a "3-pack"), down from the earlier 7-pack. The local search radius was also narrowed, giving more weight to the searcher's distance from the location. This update was a step toward binding the local search algorithm to the web search algorithm; the Possum update appears to be a step further in the same direction.


Page Experience

Rolled out in 2021, this UX update measures page experience using three new metrics collectively called Core Web Vitals: Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). LCP measures the time taken for the largest visual block to load; FID measures the delay between a user's first interaction and the browser's response (the lower the value, the better); and CLS measures visual stability, meaning the layout must not shift unexpectedly while the user is still reading. Google was already measuring load time, HTTPS, and other vital components of user experience prior to this update.
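
For reference, Google publishes "good" thresholds for these metrics: LCP within 2.5 seconds, FID within 100 milliseconds, and CLS at 0.1 or below. The short Python sketch below simply checks measured values against those thresholds; the function name and the sample numbers are our own illustrations, and the actual measurement happens in the browser (for example via Chrome's field-data reports), not in a script like this.

```python
# Google's published "good" thresholds for Core Web Vitals:
#   LCP <= 2.5 s, FID <= 100 ms, CLS <= 0.1
GOOD_THRESHOLDS = {"lcp_seconds": 2.5, "fid_ms": 100.0, "cls": 0.1}

def assess_core_web_vitals(lcp_seconds: float, fid_ms: float, cls: float) -> dict[str, bool]:
    """Return a pass/fail verdict per metric against the 'good' thresholds."""
    return {
        "lcp": lcp_seconds <= GOOD_THRESHOLDS["lcp_seconds"],
        "fid": fid_ms <= GOOD_THRESHOLDS["fid_ms"],
        "cls": cls <= GOOD_THRESHOLDS["cls"],
    }

# Example: values you might pull from a field-data report (illustrative numbers)
print(assess_core_web_vitals(lcp_seconds=2.1, fid_ms=80, cls=0.25))
# {'lcp': True, 'fid': True, 'cls': False}
```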


While we have discussed the major algorithms Google uses to rank websites, there are a few other mentions that are either outside the core updates or yet to be integrated, but they are worth a look.


PageRank, BERT & MUM

The PageRank algorithm, first introduced by Google in 1998, is still a matter of consideration to date. With newer algorithms making their way in, it may be replaced by more powerful tools that calculate a site's relevancy and its power to rank higher. PageRank relied on anchor text, clicks, internal links, and backlinks. There are multiple score calculators on the market that work along the same lines, but Google's PageRank Toolbar has been discontinued.
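
To see the idea behind the original 1998 formulation, here is a simplified PageRank sketch in Python using the classic damping-factor iteration. This is a textbook approximation for illustration only, not Google's current implementation, and the tiny link graph at the end is made up.

```python
# Simplified PageRank (the 1998 formulation), not Google's current production system.
def pagerank(links: dict[str, list[str]], damping: float = 0.85, iterations: int = 50) -> dict[str, float]:
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:                           # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                      # share rank across outgoing links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Tiny link graph: A links to B and C, B links to C, C links back to A.
print(pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]}))
```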

Next comes BERT (Bidirectional Encoder Representations from Transformers), Google's neural-network-based natural language processing model launched in 2019, which the current search engine relies on. It complements RankBrain rather than replacing it. This model helps Google understand long queries better without ignoring prepositions in the sentence.
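
To get a feel for what "bidirectional" means, the publicly released BERT checkpoint can be tried through the open-source Hugging Face transformers library, which is separate from whatever Google runs internally in Search. The sketch below masks a preposition and lets the model fill it in using context from both sides; the example sentence is our own.

```python
# The open-source BERT checkpoint, used via the Hugging Face `transformers` library.
# This is NOT Google Search's internal pipeline; it only illustrates masked-language modelling.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the words on BOTH sides of the blank before predicting it.
for prediction in unmasker("You can park your car [MASK] the garage."):
    print(prediction["token_str"], round(prediction["score"], 3))
# Likely fills are prepositions such as "in" or "near" -- the preposition carries the meaning.
```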

MUM (Google's Multitask Unified Model) is still under development and yet to be fully rolled out. Discussions around it began in 2021, and MUM is said to be 1,000 times more powerful than BERT. It can reportedly understand 75 languages, giving it the ability to break language barriers. The intent is to return comprehensive information related to a user's search rather than only the data explicitly asked for in the query. MUM is set for release in the future.


Having understood some of the popular algorithms Google uses to run a search, it becomes easier for SEOs to tweak their pages at the right points to rank higher on Google. For more such details, check our other blogs.
