Wednesday, 25 July 2018

What is Google Analytics?


Google Analytics is an application that lets you track your site's real-time traffic, progress, events, and conversions. After you publish a post, it helps you monitor and evaluate your reach. 

Benefits of using Google Analytics



Cost Free

The main attraction is definitely that it is free of cost. Google provides it to webmasters so they can better manage their websites.

Find New Keywords

It helps you discover the keywords visitors use to reach your site, so you can optimise your content around them.

Identify Pages with more Clicks

It helps you identify the pages that get the most views, ensuring that your campaigns are directed at the right pages and the right audiences.

Segmenting Audience

When you start a new campaign, you can segment new visitors from returning visitors. This way you know how many people your campaign attracted.

Customer Satisfaction

You learn what your customers really want and what they enjoy most on your site. This helps you produce more quality content and less unproductive content, improving customer relations and increasing satisfaction.

How to track your website in Google Analytics?



There are two types of websites:

Content Management System(CMS): 

On a CMS website, you can paste the tracking ID provided by Google Analytics into the site settings. 

HTML based websites:

On an HTML website, you must open the HTML source and add the tracking code provided by Google Analytics just before the closing </head> tag.
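For an HTML site, the placement looks roughly like this. This is a sketch of the standard gtag.js snippet as it appeared around 2018; copy the exact snippet from your own Analytics property, and note that UA-XXXXXXX-X below is a placeholder tracking ID:

```html
<head>
  <!-- Google Analytics global site tag (gtag.js); UA-XXXXXXX-X is a placeholder ID -->
  <script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-X"></script>
  <script>
    window.dataLayer = window.dataLayer || [];
    function gtag(){dataLayer.push(arguments);}
    gtag('js', new Date());
    gtag('config', 'UA-XXXXXXX-X');
  </script>
  <!-- rest of the head section follows -->
</head>
```

Placing the snippet inside the head ensures the tracker loads on every page view before the body renders.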

Types of Visits/ Web Traffic


  1. Direct Traffic: Visitors are considered direct when they type the website address directly into the browser. 
  2. Organic Traffic: Visitors are considered organic when they come from search results and not from any paid or referral links.
  3. Social Media Traffic: These are visitors who come from posts shared on social media sites like Facebook, Twitter, YouTube, etc.
  4. Referral Traffic: These are visitors who come from other websites that link to our site.
  5. Ad Traffic: This is a paid form of traffic: visitors who come by clicking on ads on Google, Facebook, or YouTube.


Conclusion


Whether you run a small, medium, or large business, this application can be a most useful tool, tailored to suit whatever kind of business you are engaged in. If you have not tried Google Analytics, you should definitely try it out. The best part is that it is free, which makes it accessible to everyone from startups to multinational companies. 







Tuesday, 24 July 2018

Web Master Tools- A Useful Guide to SEO


Webmaster Tools (also called Search Console) is a channel of communication between Google and its webmasters. It is an application that shows how the search engine views your website. By adding your website and its related links to Webmaster Tools, you can track all SEO-related information such as crawl errors, indexing, links, and search keywords.

Features of Webmaster Tools


  • It helps to monitor any errors or broken links on the website.
  • It verifies that the site is correctly configured for search engines.
  • It confirms that your website is being crawled.
  • An indexing request can be made via 'Fetch as Google' under the Crawl option. It is a way of asking Google to index your website's links.
  • It shows the keywords users typed to land on your site.


Some Useful Elements in Webmaster Dashboard




Search Queries
This section shows the keywords users searched to reach the site. It helps the SEO analyst find the most-used keywords and start optimising for them.

This section also shows the number of clicks, impressions, and CTR (Click-Through Rate) that the keywords receive. 

Links
This section shows the internal links and back-links which you have from other websites. This helps to monitor the websites that provide you with links.

Crawl Errors
Here Google provides information about any errors its bots encountered while crawling pages. It also reports the number of pages crawled each day, which can be helpful.

Keywords
This section also shows keywords, like the Search Queries section, but the keywords shown here are Google's recommendations, that is, the keywords Google considers relevant.

Sitemaps
Here you can see the sitemaps Google has found for your site. If one is not right, you can submit a new sitemap for your website.

Saturday, 21 July 2018

Tips on Writing and Optimising an Article

In this post you will get acquainted with what a snippet is and various techniques which you can adopt to optimise your content.

A snippet is a single organic result on a search results page. It consists of three components:
Title: It should contain the focus keyword. It is the first thing the user reads.
URL: It is the first portion crawled by Google bots.
Description: It displays the meta description provided in the head section of the HTML. It must be unique and informative. It is a strong factor in getting users to select our link. 
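In HTML terms, the three snippet components map to the head section roughly like this (the title and description text below are illustrative):

```html
<head>
  <!-- Title: becomes the snippet headline; include the focus keyword -->
  <title>Tips on Writing and Optimising an Article</title>
  <!-- Meta description: shown under the URL in the snippet; keep it unique and informative -->
  <meta name="description" content="What a search snippet is, and simple techniques to optimise your title, headings and content.">
</head>
```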

Outline of an Article


Title

The title is the head section of the post and the primary identification mark of the article. It must be:

  • Unique. No two titles should be the same on your website; duplicates lead to cannibalisation and your website will lose in the game.
  • Meaningful.
  • Not in full caps or all lowercase. The first letter of every word should be capitalised, e.g. Tips on Optimising Title of Head Section.
  • Free of grammatical errors.
  • Not too long (55-60 characters, about 512 pixels wide).
  • A place where the focus keyword can be included.


Body

Body portion has the content in detail. It consists of:

H1
It is the most significant portion of the body section. 
Only one H1 is required, to avoid confusion. If there are two contradictory H1s, the page may be cached and added to the sandbox.
It must be simple and easy to understand (7-8 words).
The focus keyword can be placed here. 
Ensure there are no spelling or grammatical errors.

H2
It is similar to the H1. If the H1 is not relevant, the H2 will be taken instead.
It serves as a subheading. 
It must be relevant to the H1.
Avoid lengthy H2s on a page. 
One H2 is ideal in a post.

H3
You can use multiple H3s.
Related keywords can be placed in them.
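A sketch of the heading hierarchy described above (the heading text is illustrative):

```html
<body>
  <!-- Only one H1 per page; place the focus keyword here -->
  <h1>Tips on Writing and Optimising an Article</h1>

  <!-- A single H2 as the subheading, relevant to the H1 -->
  <h2>Outline of an Article</h2>

  <!-- Multiple H3s are fine; related keywords can be placed in them -->
  <h3>Title</h3>
  <h3>Body</h3>
</body>
```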

Various Components of Optimisation


Optimisation tips

Anchor text Optimisation


These are the hyperlinks given from our website to other sites (external links) and to pages on our own site (internal links).
They make text easier to comprehend.
Never use the focus keyword itself as anchor text, because the link will be seen as a recommendation.
Give links to your own pages and to trusted sites.
Anchor text boosts crawling because it acts as fuel for the crawling bots.
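For illustration, internal and external anchor text might look like this (the URLs below are placeholders):

```html
<!-- Internal link: descriptive anchor text rather than the bare focus keyword -->
<p>Read our <a href="/2018/07/webmaster-tools-guide.html">guide to Webmaster Tools</a> for setup details.</p>

<!-- External link to a trusted site -->
<p>See the official <a href="https://support.google.com/analytics/">Google Analytics Help</a> pages.</p>
```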


Content Optimisation


Use simple, short sentences.
When selecting a keyword, never consider the search volume. 
Keyword density must be considered: use short keywords 3-4 times and long keywords only 1-2 times.
Try to include the focus keyword in the first sentence or first paragraph. This helps the user understand the topic right at the beginning.
Place the focus keyword before a comma or full stop.
Optimise the page not just for Google bots but for users as well, by making it readable and easy to understand.
Using bold for some words in a long paragraph helps the user read easily.
Content is given a score out of 100, known as the reading-ease score.

Image Optimisation


The alt name, also known as alternate text, is the first identity of an image. Give the image the right alt name.
The file name you save the image with is its second identity.

You must also ensure that the content surrounding the image is relevant to it.
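Both identities can be seen in a simple image tag (the file name and alt text below are illustrative):

```html
<!-- The alt text is the image's first identity; the file name is its second -->
<img src="google-analytics-dashboard.png"
     alt="Google Analytics dashboard showing real-time traffic" />
```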

Authorship Optimisation


This optimisation helps you gain editorial authority. It is a way of telling Google that you are the author of a certain article or of the website on which you write.

How to authorise your content?

Go to your Google+ account, open the About page, enable the site link, and paste your blog's URL and title.
Then copy your Google+ URL, go to the blog post, and add a link on your name pointing to that Google+ URL. In the HTML, add rel="author" to the hyperlink's href tag.
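The final step above amounts to a byline link like this (the Google+ profile URL is a placeholder):

```html
<!-- Byline linking the author's name to a Google+ profile with rel="author" -->
<p>Written by <a href="https://plus.google.com/+PlaceholderProfile" rel="author">Simna Nahas</a></p>
```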

Make sure you incorporate these tips when writing content; they help you optimise it for Google bots and users alike.

                 Simna Nahas


Thursday, 19 July 2018

The Rise of SEO Practice Around the World

Introduction of SEO to Businesses


SEO gained prominence at the beginning of the 21st century. After the World Trade Center attack, every media outlet was flooded with the devastating news, but Google was unable to provide its users with any data relevant to the event. Further analysis gave Google insight into the problem; it was a wake-up call. That is when they decided to publish guidelines (the SEO Starter Guide) for webmasters. This was the introduction of SEO to businesses. These guidelines helped companies optimise their content and gain visibility on the internet. It was a win-win situation: companies got more reach, and Google surfaced quality content, which in turn helped retain user loyalty.

Process of Selecting Data by Google Bots


  1. Crawling- Bots scan the content and take snapshots of pages that contain certain keywords. 
  2. Caching- These snapshots are saved to the right category or folder. 
  3. Indexing- The saved content is displayed when a user searches for it.


Timeline of Algorithm Updates adopted by Google 


1. Niche Specific

This is also known as content-specific ranking. It worked on the basis of matching the maximum number of keywords: your company got listed if you managed to get the maximum number of keywords onto your webpage. This resulted in keyword stuffing (using keywords unnecessarily to gain visibility), which produced poor-quality content and hurt Google. Google later dropped this approach, and keyword stuffing came to be considered a black-hat (unethical) practice.

2. Link Specific

Under this, a website got a good ranking on the basis of the links it obtained from other websites. These were known as backlinks, and they allowed users to reach your website from the host site. This led to websites buying and selling links, so poor-quality websites also started getting good ranks, which affected Google directly.

3. Quality Link Specific

This led Google to introduce quality-link-specific ranking. A new method, PageRank, was introduced. Websites were ranked on the basis of about 200 different signals, and a quality score out of 10 was awarded. Content quality and trust value were given more weight. Some of the websites that earned 10/10 were the US government site, the Flash Player download page, etc.
Under this scheme, websites were listed when they got backlinks from websites with good PageRank. This too was misused: websites with good PageRank started selling links.

4. Passing the Juice

This led to the introduction of "passing the juice". As the name implies, a website gave out a part of its equity along with the links it provided. This put a stop to websites selling links, because doing so hurt their own equity value.

Along with this amendment, Google put forth a new link attribute, rel="nofollow", which lets webmasters provide links relevant to their content without losing any equity. This way they avoid unnecessary links but can still link to pages that help users better understand the content on their website.
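In markup, a nofollow link is simply an anchor with the rel attribute set (the URL is a placeholder):

```html
<!-- rel="nofollow" asks Google not to pass equity ("juice") through this link -->
<a href="https://example.com/some-page" rel="nofollow">a page we reference but do not endorse</a>
```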

5. 2003-2008 

Many amendments were witnessed during this time. Google adopted them with great apprehension, because with every upgrade it was incurring huge losses. Some of the biggest changes to Google's financial position happened during this period. 

Few of the updates during this period are:

Google AdWords- this later became a major income source. It is a prepaid, pay-per-click program: when users click on an ad, the cost is deducted from the advertiser's prepaid balance.
Google AdSense- an affiliate program in which Google places ads on blogs and websites. 
Google Suggest- a tool that offers suggestions in the search box as we type. 
All of these helped Google become more of a personal assistant to its users, from fetching information to taking voice commands.

6. Ranking per Interaction (2009)

Google started ranking as per user interaction. Using cookies and its data centres, Google could determine the amount of time and effort users spent on each website.
It calculated the bounce rate, i.e. the percentage of users who exit a webpage almost immediately. This helps identify the level of activity a website has.

7. Social Media Signal (2010)

With the advent of different social media platforms, Google started to rank websites that shared their content on social media and added links to their Facebook and Twitter pages on their websites. 

This trend was short-lived, because everybody with an internet connection could share and add their pages to websites. Only the early birds reaped the benefits; those who followed suit got nothing.

Social media authority value was another ranking factor: posts that got likes, comments, and shares from high-authority accounts received more value.

8. Panda Update (2011)

In 2011 a web-spam team was set up to maintain quality. The update targeted content spamming and helped remove spam from search results.

Types of spamming 

  • Content duplication- copying and pasting content from other websites is considered plagiarism.
  • Low-quality content- content with spelling and grammatical errors; content that lacks information is referred to as garbage content.
  • Content spinning- webmasters putting up the same content in different forms to fill up pages.
  • Thin pages- the text and HTML are not in proportion, i.e. there is too little text.
In 2014, update 4.0 largely took out content duplication and added it to the algorithm as a permanent filter. This helped remove spam at crawl time itself.
Many variations of the Panda update followed in the coming years.


9. Penguin Update (2012)


This was introduced against link spamming, to combat the issue of manipulative links. 

Types of link spamming are:
  • Link exchange- webmasters getting links in exchange for providing links.
  • Selling and buying links for money.
  • Link farming- programmers created software that could manipulate links.
  • Link referral- a portal that lists suitable websites that can provide you with links.
  • Comment spamming- commenting on blogs along with your site's URL.
  • Wiki spamming- volunteering on Wikipedia and placing your website's URL in irrelevant areas.
  • Guest blogging- this is not a black-hat practice if done the right way: you can publish posts on other websites, which gives mileage to you as well as to that website. 
Many variations of this update followed suit. One of them is:

 Real-time update: This is the Penguin 4.0 update, in which Google applies the filter right at the time of crawling and indexing.


10. Park a Domain Update (2012)

This update removed all inactive parked domains: domains that were bought but never developed into websites.

11. Exact match domain (EMD) Update (2012)

Just as the name suggests, this update targeted websites whose domain name exactly matched their activity, e.g. an SEO company in India with the domain name 'seocompanyindia.com'. Websites with an EMD that had not been updated for a very long time were removed from search results; those that were regularly updated were exempted.

12. Pirate Update (2012)

This update allowed webmasters to directly file a complaint under the US DMCA against websites that duplicated their content, images, etc.

After inquiry, Google took action against those websites by removing them from search results and informing their hosting provider. The host would suspend the website shortly after receiving the notice.

13. Hummingbird Update (2013)

This was the introduction of semantic search results and the beginning of Google's use of artificial intelligence. It enabled Google to provide users with an extensive summary of results on the topics they searched. This summary was based roughly 30% on quality content, with the rest based on feedback received from the vast majority at forums.


14. Pigeon Update (2014)


This update was added to enhance the user experience. It was targeted at helping businesses flourish locally.

To gain ground locally, SEO analysts were expected to accomplish 5 tasks:

  1. Create a Gmail account for the company. Go to "My Business" and place a marker at the exact location of the business on Google Maps. A verification process will follow to test authenticity.
  2. Add the address, phone number, and pincode of the place to the website. 
  3. Submit the details to local directories like Just Dial, Sulekha.com, etc.
  4. Embed the location map in the About section of the website. This is known as Geo Tagging.
  5. Ensure that interactions on social media profiles are local, in order to gain better visibility locally.
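Step 4 (Geo Tagging) typically uses the embed code that Google Maps generates for a location; a trimmed sketch, with the embed URL left as a placeholder:

```html
<!-- Google Maps embed of the business location, placed on the About page -->
<iframe src="https://www.google.com/maps/embed?pb=PLACEHOLDER"
        width="600" height="450" style="border:0" allowfullscreen></iframe>
```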


15. Mobilegeddon Update (2015)

Under this update, Google asked webmasters to make their websites mobile friendly. Mobile-friendly websites get better rankings in listings. This has gained tremendous importance in recent years.

16. RankBrain Update (2015)

This is an advanced version of the Hummingbird update. With it, Google acts like an intelligent human being, giving responses appropriate to different age groups, classes, etc. Here Google is trying to be your friend.