Thursday, 6 June 2019

Evolution of Google Search Engine


 

This blog will give you a glimpse of Google's evolution as a search engine.

 

Google actually started as a project of two Stanford University PhD students in September 1997. Larry Page and Sergey Brin proposed a search system better than conventional search engines: their engine determined a website's relevance by the number, and the importance, of the pages that linked back to it.


Initially known as “BackRub”, Page and Brin's search algorithm was officially launched as a company under the name “Google” in 1998.



In the early days, a Google search query would return its results within 24 hours, via an email address provided by the user. Soon Google matured into a search engine that gave immediate results.



 

How Google learnt from its mistakes



On September 11, 2001, the World Trade Center (WTC) in New York, United States was attacked by Al Qaeda, a terrorist group. It was a man-made catastrophe that shook the whole world. At the time, Google was still trying to evolve into a prominent search engine, and its search results on the WTC attack were not satisfactory for most of its users. Google's engineers got together to find the reasons behind the unsatisfactory results. The main reason that stood out was that most of the web pages available then were not 'crawlable' by Google: the pages were too heavy to be 'crawled'. And edits to a web page can be made only by its webmaster, not by Google.

A webmaster is the custodian of a website. In order to earn the trust of its users, Google decided to publish a document of 'crawlability' guidelines.


The “SEO Starter Guide” was the document Google released to earn users' trust, which was an utmost priority for its evolution into a prominent search engine.


 

Let us get to know the terms

How does a search engine like Google find, crawl and rank the trillions of web pages out there in response to a search query?




Getting the search results page involves 3 main steps:
  • Crawling: search engines have crawlers (spiders/robots) that “crawl” the Web to discover the pages relevant to a search query.
  • Indexing: indexing means Google has visited a particular website and has added it to its database.
  • Caching: caching means Google takes a snapshot of the website when it visits it.
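The three steps above can be sketched with a toy in-memory 'web'. This is a minimal sketch in Python; the page graph and contents are invented for illustration, not real sites:

```python
from collections import deque

# A hypothetical mini-web: URL -> (page text, outgoing links).
WEB = {
    "a.com": ("google search engine history", ["b.com", "c.com"]),
    "b.com": ("search engine optimization guide", ["c.com"]),
    "c.com": ("flag football rules", []),
}

def crawl_and_index(seed):
    """Crawl from a seed URL, building a word index and a page cache."""
    index, cache = {}, {}
    queue, seen = deque([seed]), {seed}
    while queue:
        url = queue.popleft()                 # 1. Crawling: visit the page
        text, links = WEB[url]
        cache[url] = text                     # 3. Caching: snapshot the page
        for word in text.split():             # 2. Indexing: word -> URLs
            index.setdefault(word, set()).add(url)
        for link in links:                    # follow links to new pages
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index, cache

index, cache = crawl_and_index("a.com")
print(sorted(index["search"]))  # pages containing the word "search"
```

A real crawler fetches pages over HTTP and obeys robots.txt; the queue-and-seen-set structure, however, is the core of any breadth-first crawl.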

 

Evolutionary Changes in SEO 

In 2002-2003 Google was content specific: it looked for specific keywords. As a result, web pages became competitive over who could have the most preferred keywords, and the quality of web pages came down.
As an improvement, Google became link specific: it began looking at the links to and from a website. As a result, users were in a mad rush to increase the number of links to and from their websites.

SEO Practices

Search Engine Optimization (SEO) practices are broadly classified into two categories:

  • White Hat SEO 

White Hat SEO refers to the most genuine practices for earning a higher page rank. White Hat practitioners strictly adhere to Google's SEO guidelines.



 

  • Black Hat SEO 

Black Hat SEOs look for loopholes in Google's algorithm in order to rank higher in the search engine results. E.g.: keyword stuffing, an unethical way of repeating keywords so that the website gets noticed.
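As a rough illustration, keyword stuffing can be spotted by measuring keyword density. This is a minimal sketch; the sample strings are invented and the function is not any published Google rule:

```python
def keyword_density(text, keyword):
    """Fraction of the words in `text` that are the given keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = "cheap shoes cheap shoes buy cheap shoes cheap"
natural = "we sell comfortable shoes at a fair price"
print(keyword_density(stuffed, "cheap"))   # half the words are the keyword
print(keyword_density(natural, "cheap"))   # the keyword never appears
```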


Next, Google became quality-link specific: 'PageRank' technology was introduced to assess the quality of sites.
PageRank is the Google Search Engine's algorithm for ranking web pages in the search results.

A PageRank of 10 out of 10 was the best a web page could get for its quality, and such a rank was very rare. E.g.: very few websites had a 10-on-10 PageRank (twitter.com, USgov.org, the Flash Player upgrade link). Sites with a 9-on-10 PageRank included google.com, facebook.com, wikipedia.org, etc.
A higher PageRank implies that the web page has a higher trust value on the World Wide Web.
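The intuition behind PageRank can be sketched as a small power iteration over a toy link graph. This is a simplification of the original PageRank formulation (damping factor 0.85), not Google's production algorithm, and the graph is invented:

```python
def pagerank(links, damping=0.85, iters=50):
    """links: page -> list of pages it links to. Returns page -> rank."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start with equal rank
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}  # random-jump share
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)      # split rank among outlinks
                for q in outs:
                    new[q] += damping * share
            else:                                # dangling page: spread evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Toy web: everyone links to "hub", so it earns the highest rank.
ranks = pagerank({"a": ["hub"], "b": ["hub"], "hub": ["a"]})
print(max(ranks, key=ranks.get))  # 'hub'
```

The key property: a page is important if important pages link to it, which is exactly the "trust value" idea described above.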

2000-2003 Updates: 

Google AdWords, AdSense and the Google Toolbar were launched.
  • Google AdWords: used by advertisers to promote their products.
  • Google Toolbar: helps people search without visiting the Google homepage.
  • Google AdSense: a means for website owners to earn revenue from Google.

2005 Updates: 


 

Google joined hands with Yahoo and MSN to introduce the “nofollow” attribute. A “nofollow” attribute is added to a link in an HTML page to inform Google that the link should not receive any PageRank value from the page. If such links are not marked nofollow and a page carries many of them, the page's own PageRank comes down. Links left in a webpage's comments will mostly be nofollow links.
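In the HTML, such a link looks like <a href="..." rel="nofollow">. A minimal sketch using Python's standard-library HTML parser to separate followed links from nofollow ones (the sample page and URLs are invented):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect <a href=...> links, separating rel="nofollow" ones."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rels = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rels else self.followed).append(href)

page = '''
<p><a href="http://trusted.example">Editorial link</a></p>
<p><a href="http://spam.example" rel="nofollow">Comment link</a></p>
'''
collector = LinkCollector()
collector.feed(page)
print(collector.followed)   # links that pass PageRank value
print(collector.nofollow)   # links that do not
```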

2009 Update: 

Google introduced Google Suggest and launched real-time results.

 2010 Update: 

Social media signals: when a website is shared via social media and there are interactions through the shared link, the ranking of the website improves.


Google Panda 2011 Update:


The main purpose of the Google Panda update was to minimize the number of low-quality websites in the search engine results. The initial rollout of the Panda update affected nearly 12% of English-language websites.

What did Google Panda deal with?

Content duplication: content that is copied from other web pages on the internet.

Thin pages: web pages with very little relevant material. Google compares the visible page content against the overall HTML of the page.

Content spinning: using and reusing the same sentences, copied from another website.

High ad-to-content ratio: websites with most of their pages filled with advertisements and very little proper content.

Low-quality content pages: pages with grammatical and spelling mistakes.
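The 'thin pages' idea above (comparing visible content against the overall HTML) can be sketched as a text-to-markup ratio. The 10% threshold and the sample pages below are arbitrary assumptions for illustration:

```python
import re

def text_to_html_ratio(html_source):
    """Ratio of visible text length to total HTML length (crude sketch)."""
    text = re.sub(r"<[^>]+>", "", html_source)  # strip tags naively
    text = " ".join(text.split())               # collapse whitespace
    return len(text) / max(len(html_source), 1)

thin = '<div class="wrap" style="margin:0;padding:0"><span class="x"></span><p>Hi</p></div>'
rich = "<p>plenty of real readable content here</p>"
print(text_to_html_ratio(thin) < 0.1)   # True: mostly markup, little text
print(text_to_html_ratio(rich) > 0.5)   # True: mostly readable text
```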



2014 Panda 4.0 Update: 

The Panda 4.0 update affected different languages to different degrees; it affected approximately 7.5% of all English-language queries. E.g.: eBay lost nearly 80% of its organic rankings overnight! As of the latest version of the update, Panda has become a permanent filter in Google's main algorithm.


What are ‘sitemap.xml’ and ‘Robots.txt’? 

Sitemap.xml is an XML file that lists all the URLs of a website. It acts like a roadmap of one's website for Google to crawl, cache and thereby index all of the site's web pages. It also carries information on when each post was last updated, which helps Google know that there is new content to crawl and index.
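A minimal sitemap.xml can be generated with Python's standard library; the URLs and dates below are invented examples:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: list of (url, last-modified date) tuples -> sitemap XML string."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("http://example.com/", "2019-06-06"),
    ("http://example.com/blog/evolution-of-google", "2019-06-01"),
])
print(sitemap_xml)
```

The lastmod field is what tells the crawler that a page has new content since its last visit.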




Robots.txt is a text file created by webmasters to instruct search engine robots how to crawl their web pages. Usually robots.txt tells which parts of the website should or should not be crawled. These crawl instructions are given by “allowing” or “disallowing” paths to the web-crawling software. E.g.: the line "Disallow: /search" tells a crawler not to crawl the search folder.
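Python's standard library can parse such rules directly; a minimal sketch (the robots.txt content below is a made-up example):

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /search
Allow: /
"""

rules = RobotFileParser()
rules.parse(robots_txt.splitlines())

# The /search folder is disallowed, everything else is allowed.
print(rules.can_fetch("*", "http://example.com/search/results"))  # False
print(rules.can_fetch("*", "http://example.com/blog/post"))       # True
```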

 

2012 Penguin Update 

 


It was Google's main update against link spamming, aimed at reducing the presence of websites involved in manipulative link schemes.

Link spamming involves Black Hat SEO practices like link exchanges, paid links, low-quality directory links, link farming and comment farming. With the release of Penguin 4.0 in September 2016, Penguin began operating in real time.

Disavow tool: this is Google's solution to help webmasters protect their web pages from bad links. Usually low-quality links are penalized by Penguin; the Disavow tool tells Google to neglect those backlinks.

Guest Blogging 
Guest blogging was considered essential to improve a website's PageRank. But slowly this option became business oriented, and too much traffic and link equity flowed from guest blogging. Such sites were taken down by Google.


What is a ‘Content Management System’?  


A CMS, or Content Management System, is software that helps in creating, editing, organizing and publishing content, and is used to maintain websites in a simpler and faster way. Otherwise, making corrections, adding new pages and changing themes across the many pages of a website are time-consuming jobs. Some examples of Content Management Systems are wordpress.com, magento.com, joomla.com, etc.

2014 Pigeon Update

This update concerns local listings / local SEO / local ranking. The Pigeon update affected Google Maps search results along with regular Google search results.



The following steps are necessary for one's website to have a local listing:

  1. Place-mark the website through Google My Business.
  2. One must have a full address, PIN code and phone number on the website.
  3. The website should be listed in local directories, e.g. JustDial, Sulekha, Yellow Pages.
  4. Geotagging has to be done on the website's About/Contact page. Geotagging means embedding a Google Map in one's website.
  5. Likes/interactions on your social media should come from local contacts.



Hummingbird Update 2013 


This Google update is about giving semantic, in-depth search results. The search result will be a stepwise or list-wise answer; e.g., a search for "how to play flag football" returns a step-by-step result.

Google gives this kind of search result based on user feedback, which is in turn processed with artificial intelligence.


2015 RankBrain Update:





This upgrade is called the RankBrain upgrade of Google. RankBrain is part of Google's Hummingbird algorithm.

With this update, the use of artificial intelligence increased a lot: Google began giving search results for a query just as an adult human being would respond to it.
E.g.: a senior citizen sees a cordless mouse and queries Google for 'mouse without wire', whereas a technical person would query for the same product as 'wi-fi mouse' or 'bluetooth mouse'. In such a case Google behaves like a knowledgeable person and gives both persons the relevant search results.



2015 Update- Mobilegeddon

This update of Google is a mobile-friendly update.


One's website should be mobile friendly and easily readable on mobile devices without tapping/zooming.
Unplayable content should not be present, i.e. there should not be any content that needs horizontal scrolling.

2014 Pirate Update:




This Google update was released to tackle the copying of multimedia files, and it made it harder for people to watch free movies and music torrents online.

The worst affected were PirateBay, torrentz.eu, etc.


2018 Medic Core Update:

 


 


This update is Google's health- and wellness-related update: it primarily affected medical, health and wellness websites.


Thank you for reading my blog. I value your comments and suggestions, please add them. Looking forward to meeting you in my next blog on similar topics.
 

