Google Webmaster Guidelines

Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the "Quality Guidelines," which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise penalized. If a site has been penalized, it may no longer show up in results on Google.com or on any of Google's partner sites.

Design and content guidelines:

# Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

# Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.

# Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

# Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

# Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images.

# Make sure that your <title> elements and alt attributes are descriptive and accurate (see the example after this list).

# Check for broken links and correct HTML.

# If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

# Keep the links on a given page to a reasonable number (fewer than 100).
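As an illustration of the points above about text links, <title> elements, and alt attributes, here is a minimal sketch of a page that a text-only crawler can read. The page title, file names, and link targets are hypothetical placeholders:

    <html>
      <head>
        <!-- Descriptive, accurate title rather than a generic "Home" -->
        <title>Organic Coffee Beans - Roasting Guides and Online Store</title>
      </head>
      <body>
        <!-- Important names and links as plain text, not images -->
        <a href="/roasting-guide.html">Coffee roasting guide</a>
        <a href="/sitemap.html">Site map</a>

        <!-- Descriptive alt attribute on the image -->
        <img src="/images/arabica-beans.jpg" alt="Freshly roasted arabica coffee beans">
      </body>
    </html>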


Technical guidelines:


# Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

# Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

# Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead. A simple way to check this is sketched after this list.

# Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://www.robotstxt.org/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools. A sample robots.txt file is sketched after this list.

# If your company buys a content management system, make sure that the system can export your content so that search engine spiders can crawl your site.

# Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.

# Test your site to make sure that it appears correctly in different browsers.
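To illustrate the If-Modified-Since point above, here is a minimal sketch in Python of how you might check whether your own server honours the header. The URL and date are hypothetical; a server that supports the header should answer 304 Not Modified when the page has not changed since the given date:

    import urllib.error
    import urllib.request

    # Hypothetical page on your own site.
    url = "http://www.example.com/index.html"

    request = urllib.request.Request(url)
    # Ask for the page only if it has changed since this date.
    request.add_header("If-Modified-Since", "Sat, 01 Jan 2022 00:00:00 GMT")

    try:
        response = urllib.request.urlopen(request)
        print(response.status)   # 200: the full page was returned
    except urllib.error.HTTPError as err:
        print(err.code)          # 304: not modified, no body was sent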
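To illustrate the robots.txt points above, here is a minimal sketch of a robots.txt file. The directory names are hypothetical placeholders; the idea is simply to block auto-generated search results pages while leaving the rest of the site open to crawlers:

    # Hypothetical robots.txt - adjust the paths to your own site.
    User-agent: *
    Disallow: /search/    # auto-generated search results pages
    Disallow: /cgi-bin/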


Quality guidelines:

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

If you believe that another site is abusing Google's quality guidelines, please report that site at https://www.google.com/webmasters/tools/spamreport. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts.


Quality guidelines - basic principles:


* Make pages primarily for users, not for search engines. Don't deceive your users or present different content to search engines than you display to users, which is commonly referred to as "cloaking."

* Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"

* Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.

* Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.


Types of SEO Reports:

The SEO report provides a comprehensive analysis of your website, which includes a website analysis and a top 10 competitor analysis in addition to the following:

• Title Tag Report
• Keyword Density Report
• Keyword Meta Tag Report
• Description Meta Tag Report
• Body Text Report
• In-Page Link Report
• Link Popularity Report
• Outbound Link Report
• IMG ALT Attribute Report
• Top 10 Inbound Link Optimizer Report
• Page Rank Report
• Anchor Text Report
• Comprehensive Summary
• Suggestions for Optimization


The complete report consists of about 75-100 pages and covers every aspect of search engine optimization for your site.

Title Tag Report:
The title tag report analyses how well your existing title tag is optimized for the given keywords. The report also suggests the most appropriate title tag for maximum optimization.


Keyword Density Report:
The keyword density report provides the density of each of your keywords. It also suggests ways to improve your keyword density.
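Keyword density is usually computed as the number of occurrences of a keyword divided by the total number of words on the page. As a rough illustration only (exact formulas vary between tools, and the page text and keyword below are hypothetical), a single-word density could be computed like this in Python:

    import re

    def keyword_density(text, keyword):
        """Share of all words on the page taken up by the keyword, in percent."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        hits = sum(1 for word in words if word == keyword.lower())
        return 100.0 * hits / len(words)

    # Hypothetical page text and keyword.
    page_text = "Fresh organic coffee beans. Buy organic coffee online."
    print(round(keyword_density(page_text, "coffee"), 1))   # -> 25.0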


Keyword Meta Tag Report:
The keyword meta tag report analyses your existing keyword meta tag and provides suggestions for improvement.


Description Meta Tag Report:
The description meta tag report analyses your description meta tag and suggests ways to optimize it.

Body Text Report:
The body text report provides a comprehensive analysis of the main content of the given page.

In-Page Link Report:
In-page links are the links on a page that point to other pages within your own site. The report analyses these internal links and provides suggestions for optimizing them.

Link Popularity Report:
Link popularity refers to the number of incoming links pointing to your website. The report provides a comprehensive analysis of your link popularity and suggests ways to optimize these links.

Outbound Link Report:
Outbound links are the links from your website to other websites. Google and other search engines take outbound links into account when evaluating a web page. The report contains a comprehensive outbound link analysis and suggestions for optimizing these links.


IMG ALT Attribute Report:
Alt attributes are very important for SEO. The report lists your existing alt attributes and provides suggestions for optimizing the alt attributes of your images.

Top 10 Competitor Report:
The report offers an analysis of your top 10 competitors against various criteria.


We generate the SEO reports from a variety of sources, perform manual analysis, compile the results, and deliver them to you in an easy-to-read format.