Search Engine Optimization Concepts and Techniques

SEO stands for Search Engine Optimization. The goal of this article is to cover the fundamentals of Search Engine Optimization so that anyone can build a strategy to bring more traffic to their website. Search Engine Optimization is all about getting your site listed higher in search results, which begins with getting your site indexed by a search engine. Before diving into the optimization concepts, let us take a brief look at how search engines work.

Significance of SEO

By studying how search engines behave and function, experts have identified ways to optimize sites so that they appear in search results.

Ranking higher in search engine results will always attract more visitors to your site, and those visitors in turn become potential customers for your business. In other words, appearing high in search results is a powerful form of advertising in which a business promotes its products and services through its website to attract more customers. Even many well-known brands still rely on search engines to drive visitors to their sites.

Building a website that showcases all of your products and services is not enough on its own. A website that does not show up in search results when somebody searches for it might as well not exist. As with any kind of advertising, factors such as geographic region also need to be considered to reach potential customers.

Search Engines

Over 93% of internet users rely on search engines to filter the information available online. There are many search engines, but the most popular are Google, Bing and Yahoo. Among these three, Google leads by holding roughly 70% of the total search engine market share. In this article, we will explore the common behavior of search engines along with some features specific to individual engines.

Search Listings and User Preferences

In Google, when a user searches for a particular product, the results can include thousands or even millions of websites matching the search keyword. These search listings fall into two categories. The first is paid advertisements, which appear at the top and in the right column of the search results.

The second is the listings that appear directly below the top advertisements, known as organic results. When a user searches for a keyword, hundreds or even millions of related websites are listed as organic results.

75% of users never scroll past the first page of search results, and 63% of users click only the first three results, often called the “Golden Triangle” of the listings.

In Google, these paid search listings are offered through the Google AdWords service. Other search engine providers operate similar services, all offered on a “pay per click” basis. Yahoo Search Marketing, for example, is a keyword-based pay-per-click internet advertising service provided by Yahoo.

Regardless of how well a site is optimized, paid search will always place those listings at the top. However, 70-80% of users ignore the paid ads and focus on the organic results.

Search Engine Types and Behavior

In general, search engines are of two types:

  • Crawler Based
  • Human Powered

Crawler Based

These search engines are powered by automated software agents that run without human intervention. These agents, called bots, spiders or crawlers, visit all the pages of a site. When a crawler lands on a page, it first looks for the page’s keyword, that is, the most frequently used or most prominent word on that page. Once the keyword is identified, the page is indexed in the search engine’s database under that keyword. This process is called web crawling. Crawlers can also follow all the links available on a page, and they revisit pages at regular intervals to check whether the information on them has changed.

Every search engine has its own bot to collect data. Googlebot is Google’s web crawling bot, Bingbot is the web crawler deployed by Microsoft to supply Bing, and Slurp is the crawler for Yahoo.

Human Powered

Online directories like the Yahoo Directory and the Open Directory rely on human editors to maintain their index. Unlike crawler-based engines, this index is updated only when a person manually updates it. Directories of this kind allow users to narrow their search.

In both cases the mechanism for producing search results is the same. When a user searches with a keyword, the entire web is not scanned to find relevant pages; instead, the search engine’s index is queried to locate matching sites. Apart from these two types, some search engines use a hybrid model that combines human-powered and crawler-based approaches.

Indexing

During indexing, the keyword and the URL where it was found are not the only values stored in the database. Other information is also recorded, such as:

  • Placement of keywords
  • Keyword count in a page
  • Whether the pages linked from this page also contain the keyword

Using all of this information, a weight is assigned to each entry in the index, and this weight determines the ranking. Every search engine uses its own algorithm to assign weights to keywords.

Querying the search engine index

This is simply what happens when a search is run. The search term supplied by a user may be a single keyword or a combination of keywords. In either case, the search engine index is queried with that term. When keywords are combined, Boolean operators are placed between them as needed to refine the result (for example, “salesforce AND phoenix” matches pages containing both terms).

Each search engine has its own algorithm and techniques for weighting each keyword placement.

Optimization Techniques

In general, search engine optimization techniques can be classified into on-page and off-page optimization techniques, and this classification applies to almost all search engines.

On Page Optimization

On-page optimization covers the steps that can be carried out on a page or site itself. These are the factors on a website or web page that influence its position in search listings. The following are some of the major on-page optimization factors.

Keyword Targeting

Keyword targeting means mapping the terms that users type into a search engine to the terms that appear on your site. Selecting the proper list of keywords plays a vital role in optimizing a site and bringing it up in organic search listings. A keyword should be relevant to:

  • The business
  • The content of the site
  • The potential customers being targeted

Before finalizing any keyword, it is essential to evaluate the following:

  • Traffic for the keyword – Do users really search for this keyword often? How frequently does it appear in searches?
  • Wealth of the keyword – Know your competitors; check which popular websites already rank for this keyword.

Once we arrive at the keywords we care about, placing them in the right locations on the site will help the site rank better for those keyword searches.

Keyword placements

Always ensure keyword placements in the following key areas of a website.

The Title tag of the page

This is what appears as the headline when your site is listed in search results, so it is important to include the keyword here.

<title>Salesforce Implementation Partner in Phoenix, Arizona | MST Solutions</title>

Meta Tag

  • Meta Description

This is the short description shown beneath a site’s listing in the search results, and it is what a user reads before clicking. A clear and relevant description will always attract more users to the site.

<meta name="description" content="MST Solutions is a leading Salesforce Silver Partner in Arizona and Integrate with Salesforce CRM, Configuration and Custom Development Services.">

  • Meta Revisit    

This tells the crawler when to revisit the site:

<meta name="revisit-after" content="30 days">

Image Alt Attribute

The alt attribute specifies alternate text for an image, shown if the image cannot be displayed. Placing the keyword in the alt attribute helps your images appear in image search, where the alt attribute value is matched against the search terms.
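
For example, a product image might carry the target keyword in its alt text. The file name and alt value below are illustrative, not taken from a real page:

<img src="/images/salesforce-implementation-team.jpg" alt="Salesforce implementation partner team in Phoenix, Arizona">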

Header Tags

The H1 tag improves page ranking. As the highest-level header tag, it tells both users and search engines what the page is about, and it is easily noticed by a user visiting the page. Crawlers check how relevant the header is to the rest of the content on the page. For these reasons, search engines give the H1 tag more weight than other tags.

To make full use of the H1 tag, make sure that:

  • There is only one H1 tag for the entire page
  • The keyword is placed in the H1 tag
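
A minimal illustration, reusing the hypothetical Phoenix-based page from the title tag example above:

<h1>Salesforce Implementation Services in Phoenix, Arizona</h1>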

Apart from these placements, keywords should also be part of the site’s domain name and page URLs.
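
For example, a page URL carrying the target keyword (the domain and path here are purely illustrative) might look like this:

http://www.example.com/salesforce-implementation-phoenix/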

Off Page Optimization

Domain Authority

Gaining the search engine’s trust – When many sites compete for a specific keyword, search engines use a set of trust metrics to judge the quality of each site’s content. These trust metrics include, but are not limited to:

  • The number of inbound links to the site
  • Backlinks from other sites – The number of times the site is referred to by other sites
  • Social media sharing – The number of times the site’s content is shared on social networks
  • Active involvement in relevant community conversations such as blogging
  • Adding links to directories – Listing the site in directories such as the Yahoo Directory and the Open Directory is one way of getting your links referenced in other places on the web

All of these act as endorsements of the site. In simple words, search engines look at how often other sites talk about your site directly.

Attractive and relevant content in the site

To grab the attention of more users, always make sure that the content of your site is relevant to what users are actually searching for. Make the content attractive and creative; this will naturally increase traffic to your site and encourage other sites to link to it.

Make your URL clean and ensure the quality of the content you post on the site. This remains the key optimization tip for Bing.

Sharing Icons

Enable social media sharing icons across the site’s pages so that visitors can share and like your content on social media.
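
A minimal sketch of such sharing links, assuming the commonly used Facebook and Twitter share endpoints (the page URL is illustrative):

<a href="https://www.facebook.com/sharer/sharer.php?u=http://www.example.com/blog/seo-basics">Share on Facebook</a>
<a href="https://twitter.com/intent/tweet?url=http://www.example.com/blog/seo-basics">Share on Twitter</a>

In practice the page URL passed in the query string should be URL-encoded.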

Technical SEO

Sitemap.xml

Sitemap.xml is an XML file that tells search engines which URLs on the site are available for crawling. A sitemap file is typically saved in the root directory of a site and is submitted to search engines for indexing. Each URL entry in the file has attributes that specify how frequently the page’s content changes, the date the page was last modified and the priority of the page. The maximum sitemap size is 50 MB or 50,000 URLs, and a site can have more than one sitemap file.

Google introduced Google Sitemaps to allow web developers to publish lists of links from across their sites. The motivation is that some sites have a large number of dynamic pages that are reachable only through forms and user input; the sitemap file lists the URLs of these pages so that web crawlers can find them. Bing, Google, Yahoo and Ask (another search engine provider) now jointly support the same Sitemaps protocol, so a sitemap lets all four of these search engines receive updated page information.

Attributes

  • URL location (loc)
  • The date the page was last modified (lastmod)
  • The frequency of change (changefreq)
  • The priority of the page (priority)

File Format

<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2014-01-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
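
If a site outgrows the 50,000-URL or 50 MB limit, its URLs can be split across several sitemap files that are referenced from a sitemap index file. A minimal sketch following the same protocol (the file names are illustrative):

<?xml version="1.0" encoding="utf-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap-products.xml</loc>
    <lastmod>2014-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>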

These files can also be generated dynamically by the site with the help of simple scripts.

Robots.txt

This is a simple text file placed in the root directory of a site that implements what is known as the Robots Exclusion Protocol. It contains the site’s instructions for crawlers, specifying which pages should and should not be indexed by a search engine. When a search engine visits the site, it looks for robots.txt and obeys the instructions placed in this file. If a search engine does not find a robots.txt, it assumes the site has no instructions for it and indexes everything it finds along the way, which results in all of the site’s content being indexed.

Apart from Google, Bing and Yahoo, other major search engines such as AOL, Baidu and Yandex also obey a robots.txt file in this format.

File Format:

User-agent:
Disallow:

Here, User-agent names the search engine’s crawler and Disallow lists the directories and pages that need to be excluded from indexing.

Below is a basic example of a robots.txt that disallows specific folders for Googlebot:

User-agent: Googlebot
Disallow: /assets/
Disallow: /tmp/
Disallow: /cgi-bin/

Below is a sample robots.txt file that prevents access to a specific file for all search engine robots:

User-agent: *
Disallow: /~pub/bar.html

Here * indicates all bots of all search engines. The location of a sitemap can also be specified in robots.txt.
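
For instance, a single line such as the following (with an illustrative URL) tells crawlers where to find the sitemap:

Sitemap: http://www.example.com/sitemap.xml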

SEO Don’ts

Black Hat SEO – This is the practice of using unethical techniques or tricks that break the search engines’ rules and focus only on getting listed. Below are some of the techniques that should not be used while optimizing a site.

  • Keyword stuffing – Flooding a site with repetitive keywords. This technique is used to increase a site’s keyword count in an attempt to raise its ranking, but search engines detect it easily and may ban the site temporarily or permanently.
  • Cloaking – A technique in which the content presented to the search engine spider is different from that presented to the user’s browser. The purpose of cloaking is to deceive search engines into displaying the page when it would not otherwise be shown.
  • Duplicate content – The same or very similar content published on more than one site.
  • Incorrect robots.txt – A robots.txt file must be written carefully; an incorrect rule can cause unwanted pages to be indexed, or nothing to be indexed at all.
  • Hidden text – Similar to keyword stuffing, but the keywords are hidden on the site, for example as hidden keyword text or hidden H1 tags.
  • Title stacking – Adding more than one title tag, each targeting different keywords, instead of a single title tag.

These days, almost all search engines are smart enough to identify these tricks and will penalize or refuse to index such sites.

About MST

At MST Solutions, our cornerstone is to adapt, engage and create solutions that guarantee the success of our clients. The talent of our team and our experience across varied business verticals give us an advantage over our competitors.
