History and Evolution of SEO


What is SEO?


Can you imagine life without Google, one of the most-used search engines? The answer would be an absolute NO. Google and its services have become an intrinsic part of our day-to-day life. What enables Google to get us the results we are looking for? How does it filter search results from its vast data reservoir? Have you ever thought about it? A technique called Search Engine Optimization, or SEO, is what enables Google to respond to our queries with relevant results. Over the years, SEO has evolved a great deal from a 32-page starter guide. This article will help you understand the history and evolution of SEO.

What led to the birth of SEO, and how did it evolve?

Google began as an academic project of two young men, Larry Page and Sergey Brin, in the late 90s. Neither of them had ever imagined that their academic project would become so popular and make finding information online so much easier. Initially, Google ran as a subdomain of its creators’ university website, where search results were forwarded to the email address the user provided; completing this operation took more than 24 hours.

Internet usage had grown enormously by the year 2000. Following the Al-Qaeda attack on the World Trade Center on September 11, 2001, many people who had access to Google searched for news updates. But Google failed to provide relevant results, which came as a blow to the people behind it. Officials at Google met and discussed the issue. They found that Google was unable to crawl the available web pages fast enough to surface search results about the attack. [The word CRAWLING in this context means the process of scanning a web page by the search engine. It is one of the three processes Google performs to get us search results. The other two are CACHING (saving a snapshot of the web page) and INDEXING (organizing the crawled pages so they can be returned as results).]
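To make those three steps concrete, here is a minimal Python sketch of one crawl-cache-index cycle. It is purely illustrative and nothing like Google's real pipeline; the example URL and the naive word index are assumptions.

```python
import urllib.request
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url):
    """CRAWLING: fetch the raw HTML of a page."""
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")

def cache(url, html, snapshots):
    """CACHING: save a snapshot of the page exactly as fetched."""
    snapshots[url] = html

def index(url, html, inverted_index):
    """INDEXING: map each word to the pages containing it, so queries
    can be answered later without re-crawling (naive: splits raw HTML)."""
    for word in html.lower().split():
        inverted_index.setdefault(word, set()).add(url)

# One cycle over a hypothetical page.
snapshots, inverted_index = {}, {}
page = crawl("https://example.com")
cache("https://example.com", page, snapshots)
index("https://example.com", page, inverted_index)
extractor = LinkExtractor()
extractor.feed(page)  # outgoing links queue up the next crawl
print(len(extractor.links), "links found to crawl next")
```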

Google decided to increase its efficiency by crawling more web pages. To that end, it prepared a 32-page best-practice manual entitled ‘SEO Starter Guide’ for webmasters (the custodians of websites). As webmasters started following these guidelines, Google was able to crawl more websites and its service kept growing, and so did its challenges.

Evolution of SEO

Initially, Google was content-specific, or niche-specific. When a user searched with keywords, it returned the web pages that carried those keywords repeatedly. The more often a keyword appeared, the higher the chance of a top ranking. In other words, repeated use of a keyword on a website helped increase that website’s Google ranking.
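As a rough illustration of that keyword-frequency ranking, the sketch below scores pages purely by how often the query terms appear. It is a deliberate simplification with made-up pages, not Google's actual scoring:

```python
def keyword_score(page_text, query):
    """Score a page by counting occurrences of each query term."""
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

# Hypothetical pages: the one that repeats the keywords most wins.
pages = {
    "page_a": "cheap flights and hotel deals",
    "page_b": "cheap flights cheap flights cheap flights book now",
}
query = "cheap flights"
ranking = sorted(pages, key=lambda p: keyword_score(pages[p], query), reverse=True)
print(ranking)  # ['page_b', 'page_a'] -- repetition pays off
```

This is exactly the weakness that keyword stuffing exploited, as described next.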

When Google search became popular, more and more websites tried to gain visibility on Google and started stuffing their pages with keywords to reach the top rankings. This black-hat technique (an unethical practice) for inflating page rankings created a headache for Google when it realized that the quality of its search results was not as good as expected.

So, Google decided to change its algorithm and started giving link-specific rankings. When a top-ranked page mentions another website and links to it, that link is counted as a vote. A website that receives more links from other websites is likely to rank higher. Many websites managed to reach the top rankings under this algorithm. At the same time, many webmasters seized the opportunity and turned into link sellers, compromising the quality of their websites and services. This again worried Google.
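In its simplest form, the links-as-votes idea just counts inbound links. The sketch below does exactly that over a made-up link graph; it illustrates the concept, not the real algorithm:

```python
from collections import Counter

# Hypothetical link graph: page -> pages it links to.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "d.com": ["c.com", "b.com"],
}

# Every outbound link is a "vote" for the page it points to.
votes = Counter(target for targets in links.values() for target in targets)
print(votes.most_common())  # [('c.com', 3), ('b.com', 2)] -- most-voted page ranks first
```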

Aspects of SEO

Then, Google proposed a quality-link-specific technique, in which pages were scored out of 10; this score reflected the trust value of the website. Here, in addition to the links themselves, 200 additional parameters were taken into account. Even then, owners of high-ranking web pages began to sell links to webmasters.
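The intuition that a vote from a trusted page should count for more is the core of the PageRank algorithm. Below is a minimal textbook power-iteration sketch; the link graph, the iteration count, and the log mapping onto a 0-10 score are all assumptions for illustration, not Google's production scoring:

```python
import math

# Hypothetical link graph: page -> pages it links to.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
    "d.com": ["c.com"],
}
pages = list(links)
damping = 0.85
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, targets in links.items():
        share = damping * rank[page] / len(targets)
        for target in targets:
            new_rank[target] += share  # a vote weighted by the voter's own rank
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    toolbar = max(0, min(10, round(5 + 2 * math.log10(score))))  # assumed 0-10 scaling
    print(page, round(score, 3), toolbar)
```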

Passing the Juice

Google, therefore, devised a new strategy called "Passing the Juice." Webmasters with high page rankings would see a drop in their ratings if they linked to other websites. The value, or equity, conveyed from one page to another is referred to as link juice. Additionally, Google proposed a "nofollow" mechanism that halts this transfer of equity.
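In HTML, nofollow is a value of a link's rel attribute; a crawler that honors it simply declines to pass equity through such links. Here is a small Python sketch using the standard-library parser, with invented sample links:

```python
from html.parser import HTMLParser

class EquityLinkParser(HTMLParser):
    """Separates links that pass equity from rel="nofollow" links."""
    def __init__(self):
        super().__init__()
        self.follow, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.follow).append(href)

html = """
<a href="https://partner.example">editorial link</a>
<a href="https://paid.example" rel="nofollow">paid link</a>
"""
parser = EquityLinkParser()
parser.feed(html)
print(parser.follow)    # passes link juice: ['https://partner.example']
print(parser.nofollow)  # equity transfer halted: ['https://paid.example']
```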


Why did Google make repeated changes to its algorithm? And why was it so concerned?

As mentioned above, Google changed its algorithm whenever it felt that the quality of its search results was compromised. After introducing the "nofollow" attribute, Google tried to implement various changes in its algorithm, but it was highly concerned about doing so. What made it so concerned? The story is as follows.

In 2000, as Google soared in popularity, it started a program called Google AdWords, now known as Google Ads. It is an advertising program designed to help businesses find their customers in a targeted way. For example, a rent-a-car business in a particular locality can reach customers in that area by targeting its location; all the business needs is a Google account. AdWords was a huge success for Google and helped fuel the growth of major online businesses like Amazon, Flipkart, and eBay. It was a major source of Google's income in the early 2000s.

How does it work?

The Google AdWords service works much like a prepaid mobile service. When a business with a Google account decides to advertise its products or services through Google, it is asked to pay a certain amount to run the advertising campaign and to set a daily spending limit. For every click the advertisement gets, a certain amount, the Cost Per Click (CPC), is deducted from the daily limit, and Google stops showing the ads once the daily spending limit is reached. Here, the possibility of a lead converting into business is very high. CPC varies depending on various factors.
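A toy simulation of that prepaid model makes the mechanics clear; the budget, CPC, and click counts below are made up for illustration:

```python
def run_campaign(daily_limit, cpc, clicks):
    """Deduct the CPC for each click; stop serving ads once the daily limit is spent."""
    remaining = daily_limit
    billed = 0
    for _ in range(clicks):
        if remaining < cpc:
            break  # budget exhausted: ads stop showing for the rest of the day
        remaining -= cpc
        billed += 1
    return billed, remaining

billed, left = run_campaign(daily_limit=50.0, cpc=1.25, clicks=100)
print(f"{billed} clicks billed, ${left:.2f} of the daily budget left")
# 40 clicks billed, $0.00 of the daily budget left
```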

Following the success of AdWords, Google introduced Google AdSense in 2003. AdSense is a program through which website publishers in the Google Network of content sites serve text, image, video, or interactive media advertisements targeted to the site's content and audience. These advertisements are administered, sorted, and maintained by Google.

When a website applies for AdSense, Google checks its trustworthiness and assesses the character of the website. If it finds the website eligible, Google provides a JavaScript snippet through which the ads are inserted into the website; once the code is copied in, the ads display properly. If a visitor finds an ad interesting and clicks on it, Google gets a payment from the advertiser’s budget (Pay Per Click, aka PPC) set for the campaign, and the website owner earns a percentage as a commission.
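Per click, the money flow can be sketched as a simple split. The publisher share used below is a hypothetical number chosen for illustration, not an official AdSense rate:

```python
def settle_click(ppc, publisher_share=0.68):
    """Split one Pay-Per-Click payment between the publisher and Google.
    publisher_share is a hypothetical commission rate, not an official figure."""
    publisher_cut = ppc * publisher_share
    google_cut = ppc - publisher_cut
    return publisher_cut, google_cut

publisher, google = settle_click(ppc=0.50)
print(f"Publisher earns ${publisher:.2f}, Google keeps ${google:.2f} per click")
# Publisher earns $0.34, Google keeps $0.16 per click
```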

Publishers who get AdSense approval always strive to attract more visits, and thereby more revenue, by frequently adding quality content. If Google changed its algorithm at a time when a website was enjoying millions of visits and considerable revenue, and the site's ranking dropped because of the change, Google's own revenue from that website would suffer as well. That is the reason why Google hesitated to make algorithm changes in the period from 2003 to 2008.

From 2008 to 2010

By 2008, Google had brought in personalized search results: when a person searches while logged in to a Google account, Google displays results related to the searched words and informed by the person's search history. Search became more interactive and, in effect, felt more like a personal assistant who knows the person's interests.
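One way to picture personalization is as a re-ranking step that boosts results overlapping with past queries. The sketch below is deliberately naive; the scoring rule and the sample data are invented for illustration:

```python
def personalize(results, history):
    """Boost results whose words overlap the user's past queries."""
    history_words = {w for query in history for w in query.lower().split()}

    def score(result):
        base, title = result
        overlap = len(set(title.lower().split()) & history_words)
        return base + overlap  # each matching word adds a boost

    return sorted(results, key=score, reverse=True)

# (base relevance score, result title) -- hypothetical data.
results = [(3.0, "Python the snake"), (2.8, "Python programming tutorial")]
history = ["learn programming", "python tutorial for beginners"]
print(personalize(results, history))  # the tutorial now outranks the snake
```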

In 2009, websites that received higher traffic and user interaction topped Google's rankings. Google also began to track user behavior and store it in its data centers. This way, Google collected every piece of data about a user, from the moment a search started, through the pages visited and the time spent on each page, up to the moment of exit.

Google also started considering bounce rate as a metric for ranking pages. Bounce rate is the percentage of visitors who leave a site after viewing only one page, without interacting further. When the bounce rate is high, the ranking will be low.
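Computed from session logs, the metric is straightforward; the sample sessions below are invented for illustration:

```python
def bounce_rate(sessions):
    """Percentage of sessions that viewed exactly one page and left."""
    bounces = sum(1 for pages_viewed in sessions if pages_viewed == 1)
    return 100 * bounces / len(sessions)

# Pages viewed per visit -- hypothetical log data.
sessions = [1, 4, 1, 2, 1, 6, 1]
print(f"Bounce rate: {bounce_rate(sessions):.1f}%")  # Bounce rate: 57.1%
```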

Moving on to 2010, social media became hugely popular. In early 2010, Facebook, the world's largest social media platform, had roughly 400 million active users. Therefore, Google started considering social media signals as a metric for ranking websites.

Websites that got a higher number of social media shares and interactions topped the rankings. When two sites gained an equal ranking, Google considered influencer power as the tie-breaking metric.




