After the June 2009 update, the next Google PageRank update was widely expected in September 2009, but there is now news that Google has started the process. Google normally updates PageRank at intervals of roughly three months, but this time it is taking longer. Google released Google Wave and has also been busy with Google Caffeine, while the Webmaster team has been upgrading Webmaster Tools. Even so, many websites are confirming that Google started the process in July and will release the update on 30 October 2009.


The last update was on 23 June. So far only a few sites have been affected by the PR update. On 30 October 2009 we will see clearly what the effects of this October PR update are.

Although PageRank doesn't determine your position in search results, it reflects your site's status and authority. A site with PR 7 has more back links and more weight; when such a site publishes an article, it quickly reaches the first page of Google Search, even if the article itself has no back links.

There are many ways to improve your site's ranking in search engines; unfortunately, not all of them are good. Some people employ methods of acquiring a high page rank that are deceitful in the sense that they are designed to trick the search engines. One of these methods is duplicating web content.

What is duplicate content?

Duplicate content in SEO is any web content that is substantially similar to content on another site. Search engines have implemented filters specifically to monitor these deceitful attempts to improve a site's rankings. Many people think that by creating multiple similar replicas of their web pages or content they can improve their site's rankings, since they will get multiple listings for their site. Because search engines now monitor this kind of trickery, sites using duplicate content can end up banned from search engine indexes instead of improving their ranking.

What are considered as duplicate content?

There are a few types of duplicate content in widespread use. Each is employed a little differently, but all serve the same purpose: tricking search engines into granting better rankings.

One way of producing duplicate content is to run very similar websites, or identical web pages on different sub-domains or domains, that offer basically the same content. This can include landing or doorway pages as well as the main content, so avoid this practice if you don't want your site to trip the search engines' duplicate content filters.

Another method of creating duplicate content is simply taking content from another website or page and reorganizing it so it appears different from its original form, even though it is essentially the same.

Product descriptions from eCommerce sites are also reused widely: many sites simply copy the manufacturer's product description used by their competitors. Add to that the product name and the name of the artist, manufacturer, writer, or creator, and a significant amount of identical content shows up on your page. Although this is much harder to spot, it is still considered duplicate content, or spam.

Distribution of copied articles by sites other than the one that published the original can also be considered duplicate content. Unfortunately, while some search engines still treat the site where the original article appeared as the relevant source, some do not.

How do search engines filter duplicate content?

Search engines filter for duplicate content using the same means they use for analyzing and indexing sites: crawlers, or robots. These crawlers go through different websites and catalogue them by reading and saving information to a database. The search engine then analyzes and compares the information taken from one website with that of all the others it has visited, using algorithms to determine whether the site's content is relevant and whether it should be treated as duplicate content or spam.
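To make the comparison step concrete, near-duplicate detection is often explained in terms of word "shingles" (overlapping word sequences) and set overlap. The sketch below is a simplified Python illustration of that general technique, not Google's actual algorithm; the two sample pages are made up:

```python
def shingles(text, k=3):
    """Break text into overlapping k-word sequences ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(text_a, text_b, k=3):
    """Jaccard overlap of the two shingle sets: 1.0 = identical, 0.0 = disjoint."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_one = "the quick brown fox jumps over the lazy dog near the river"
page_two = "the quick brown fox jumps over the lazy dog near the bridge"
print(round(similarity(page_one, page_two), 2))  # 0.82 - flagged as near-duplicate
print(round(similarity(page_one, "a completely different article about pagerank"), 2))  # 0.0
```

A filter would compare such scores against a threshold: pages whose overlap is very high get treated as copies of one another rather than as independent results.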

How to avoid duplicate content?

Although you may have no intention of deceiving search engines to improve your site's ranking, your site might still get flagged for duplicate content. One way to prevent this is to check for duplicates of your pages yourself. Make sure you avoid too much similarity with another page's content, since this can still look like duplicate content to some filters even if it isn't spam.

Did you know that the basic idea of shared online bookmarks originated around April 1996 with the official launch of itList, whose features included public and private bookmarks? As time passed and Web 2.0 was ready to take off, a series of online bookmarking services launched: Blink, Backflip, Clip2, ClickMarks, HotLinks, and many others. They provided folders for organizing bookmarks, and some services automatically sorted bookmarks into folders.

What are the benefits of using social bookmarking?

If you are a webmaster, blog owner, or blogger, then you should know the benefits of using social bookmarking services. Some of the important reasons are as follows:

1. To get unique traffic from various zones of social networks
2. To create social buzz around your products and brands
3. To get your website indexed faster by search engines such as Google, Yahoo, and MSN Live Search
4. To gain higher Google PageRank from the back links that bookmarking sites provide
5. To improve your Alexa rank through targeted traffic coming to your website from diverse sources

What if you don't have time?

If you don't have time to create buzz around your brand, you can ask your friends to do it, or, better, hire a social bookmarking service that delivers the five benefits above (buzz, traffic, and faster Google indexing). If you want, you can also use a link building service to squeeze more PageRank juice from your back links.

Ad Format:

The third way to improve ad performance is to choose another ad format. AdSense provides you with many different ad formats.

In general, wide formats seem to perform better than narrow ones, because people can read more words at a time without having to skip to a new line. In my experience, the 336x280 large rectangle, the 250x250 square, and the 160x600 wide skyscraper have done best. These are quite big formats, of course, so they need plenty of space, but when placed well they usually generate more income than smaller, narrower formats.

Another good format is the wide 728x90 leaderboard, which in my experience does best under the editorial content of a web page or directly under the page header. The 468x60 banner does not convert as well, but it can still be placed where little space is available, e.g. directly in the page header (next to the logo) or within articles.
I hope these tips help raise your AdSense CTR by at least 100%, if not more.


Best Google Adsense Link Colors to get higher CTR?

• Match the colors of your ads with the color scheme of your site. Blending with your site's color profile helps visitors see them not as ads but as links similar to those on your site. The more the AdSense units look like part of your site, the higher your CTR will be. You can also match the AdSense fonts to your website's fonts for better results.

• Blend ads with your page: removing the borders and using a background color similar to your page's helps the ads appear to be part of your site. Do not blend the ad text or the 'Ads by Gooooogle' label with your background color, as that is against Google's TOS (Google does not like hidden text!). Such blending may not always work, however, because of banner blindness: visitors neither see the ads nor click on them. So sometimes a bold, contrasting ad may work better, depending on your website design.

• Experiment with the colors and backgrounds of your advertisements. You have to find out what works best for your site, not for others. You can also rotate AdSense colors to reduce ad blindness.

Google Webmaster Guidelines

Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the "Quality Guidelines," which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise penalized. If a site has been penalized, it may no longer show up in results on Google.com or on any of Google's partner sites.

Design and content guidelines:

# Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

# Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.

# Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

# Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

# Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images.

# Make sure that your <title> elements and alt attributes are descriptive and accurate.

# Check for broken links and correct HTML.

# If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

# Keep the links on a given page to a reasonable number (fewer than 100).
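The "check for broken links" guideline above is easy to automate. Here is a minimal Python sketch of a link checker using only the standard library; the list of URLs to check would come from your own pages, and a real crawler would of course be far more robust (retries, rate limiting, redirect handling):

```python
import urllib.error
import urllib.request

def check_links(urls):
    """Return (url, problem) pairs for every link that fails to load."""
    broken = []
    for url in urls:
        # HEAD asks for headers only, so we don't download full pages.
        request = urllib.request.Request(url, method="HEAD")
        try:
            urllib.request.urlopen(request, timeout=10)
        except urllib.error.HTTPError as error:
            broken.append((url, error.code))         # e.g. 404 Not Found
        except urllib.error.URLError as error:
            broken.append((url, str(error.reason)))  # DNS failure, refused, ...
    return broken
```

Feed it the links extracted from a page and anything it returns is a candidate for fixing or removal.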


Technical guidelines:


# Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

# Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

# Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.

# Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://www.robotstxt.org/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.

# If your company buys a content management system, make sure that the system can export your content so that search engine spiders can crawl your site.

# Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.

# Test your site to make sure that it appears correctly in different browsers.
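The If-Modified-Since guideline above is easy to verify for your own server. The following self-contained Python sketch sends a conditional GET and reports whether a 304 Not Modified comes back; the toy local server here merely stands in for your site, purely so the example runs on its own:

```python
import http.server
import threading
import urllib.error
import urllib.request

class ConditionalHandler(http.server.BaseHTTPRequestHandler):
    """Toy stand-in for a well-behaved web server: it answers 304
    Not Modified whenever the request carries If-Modified-Since."""
    def do_GET(self):
        if "If-Modified-Since" in self.headers:
            self.send_response(304)  # nothing changed since that date
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"page body")
    def log_message(self, *args):
        pass  # keep the demo quiet

def supports_if_modified_since(url):
    """Send a conditional GET; True means the server honours the header."""
    request = urllib.request.Request(
        url, headers={"If-Modified-Since": "Tue, 23 Jun 2009 00:00:00 GMT"})
    try:
        urllib.request.urlopen(request, timeout=10)
    except urllib.error.HTTPError as error:
        return error.code == 304  # 304 = content unchanged, body skipped
    return False  # a full 200 response: the header was ignored

server = http.server.HTTPServer(("127.0.0.1", 0), ConditionalHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
result = supports_if_modified_since("http://127.0.0.1:%d/" % server.server_port)
print(result)
server.shutdown()
```

Point `supports_if_modified_since` at one of your own URLs instead: if it returns False, your server is sending the full page on every crawl and wasting the bandwidth the guideline mentions.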
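You can also sanity-check a robots.txt file with Python's standard-library parser before deploying it, to make sure you aren't accidentally blocking Googlebot. In this sketch the robots.txt content and the example.com URLs are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt: keep crawlers out of internal search results
# and scripts, but leave the rest of the (made-up) site open.
robots_txt = """\
User-agent: *
Disallow: /search/
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Regular content pages stay crawlable...
print(parser.can_fetch("Googlebot", "http://example.com/articles/seo-tips.html"))  # True
# ...while auto-generated search result pages are blocked.
print(parser.can_fetch("Googlebot", "http://example.com/search/?q=pagerank"))      # False
```

Running checks like these against the pages you care about catches an over-broad Disallow rule before a crawler ever sees it.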


Quality guidelines:

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

If you believe that another site is abusing Google's quality guidelines, please report that site at https://www.google.com/webmasters/tools/spamreport. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts.


Quality guidelines - basic principles:


* Make pages primarily for users, not for search engines. Don't deceive your users or present different content to search engines than you display to users, which is commonly referred to as "cloaking."

* Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"

* Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.

* Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.


Types of SEO Reports :

An SEO report provides a comprehensive analysis of your website, including website analysis and top 10 competitor analysis, as well as the following:

• Title Tag Report
• Keyword Density Report
• Keyword Meta Tag Report
• Description Meta Tag Report
• Body Text Report
• In-page Link Report
• Link Popularity Report
• Outbound Link Report
• IMG ALT Attribute Report
• Top 10 Inbound Link Optimizer Report
• Page Rank Report
• Anchor Text Report
• Comprehensive Summary
• Suggestions for Optimization


The complete report runs to about 75-100 pages and covers every aspect of search engine optimization.

Title Tag Report:
The title tag report analyses how well your existing title tag is optimized for the given keywords. It also suggests the most appropriate title tag for maximum optimization.


Keyword Density Report:
The keyword density report provides the density of each keyword and suggests ways to improve it.
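A keyword density figure like the one in this report is simple to compute: it is just the keyword's share of all words on the page. The sketch below is an illustrative Python version, not the vendor's actual tool, and the sample sentence is made up:

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Occurrences of a single-word keyword as a percentage of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * Counter(words)[keyword.lower()] / len(words)

sample = "SEO reports analyse keyword density; density matters for SEO."
print(round(keyword_density(sample, "SEO"), 1))  # 22.2 - 2 of the 9 words
```

Densities far above a few percent are usually a sign of keyword stuffing rather than good optimization.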


Keyword Meta tag Report:
Keyword meta tag report analyses existing keywords and provides suggestions for improvement.


Description Meta tag Report:
Description meta tag report provides analysis and ways to optimize it.

Body Text Report:
Body text report generates a comprehensive report for your main content of the given page.

In-page Link Report:
In-page links are the links within a given page. The report analyses these links and suggests improvements.

Link Popularity Report:
Link popularity refers to the number of incoming links to your website. The report contains a comprehensive analysis of link popularity and also provides suggestions for optimizing those links.

Outbound Link Report:
Outbound links are the links from your website to other websites. Google and other search engines take outbound links into account when calculating the page rank of a web page. The report contains a comprehensive outbound link analysis and suggestions for optimizing those links.


IMG ALT Attribute Report:
Alt tags are very important for SEO. The report lists your images' Alt tags, and we also provide suggestions for optimized Alt text.

Top 10 Competitor Report:
The report offers an analysis of your top 10 competitors against various criteria.



We generate the SEO reports from a variety of sources, perform manual analysis, compile everything together, and deliver it to you in an easy, readable format.


Today we're launching the Google Website Optimizer YouTube Channel, which will be the home for videos about Website Optimizer and website testing. You'll find helpful videos like the Always Be Testing webinar series with Bryan Eisenberg, as well as simple instructional ones like Setting up an A/B Experiment in 5 minutes. We've also put up all the video case studies of the Website Workout winners.

We'll continue adding more videos, both from Google and from our partners. If you want to keep up with all the new videos you can subscribe to the Website Optimizer Channel or add an iGoogle gadget to your homepage.

You can now update your Google Affiliate Network sign-in information to a Google Account. This lets you access your Affiliate Network account and all Google products with a single sign-in. After you update to a Google Account, you'll be able to use the same sign-in for Google Affiliate Network and other Google products, including AdWords, AdSense, Gmail, Google Docs, Google Calendar, Personalized Search, and much more.

Switch among Google Affiliate Network and other Google applications without having to sign in again.

The following image provides an overview of the process to update to a Google Account.

Please note: We cannot update the Google Affiliate Network sign-in pages until all users have updated. Therefore, we recommend that you bookmark the Google Account sign-in page for Google Affiliate Network.

Users who have updated to Google Accounts can bookmark their new sign-in page, or select the link marked "Already Updated? Sign in with my Google Account" from the legacy sign-in page. In addition, if you are already signed in to your Google Account (via another product) and navigate to either of these pages, you will be taken straight to your Affiliate Network dashboard, with no need to enter your login information again.

Users who choose not to update to a Google Account at this time can continue to log in with their existing credentials from the legacy sign-in page.


Okay, first off I have to give props to Jeff Call, one of my coworkers, whom I asked to create a simple depiction of SEO not dying; he came back to me with this masterpiece in no time:

Incredible! I wish I had that kind of talent. Now onto the post:


There is constant talk about SEO dying. In years past, many people predicted that SEO would be dead by now, and many others think SEO is dying out as I write this post. I really hope these people didn't quit their SEO jobs for a career that has anything to do with forecasting. At this point in time, SEO is as alive and well as it has ever been. With the downturn in the economy, companies are flocking to SEO agencies, desperately seeking a medium that brings results for a fraction of the price of other marketing mediums…and they're finding it. But this post isn't about what's going on now. It's about the future of SEO and why it will never die…that's right…never.


In order to explain this, it is imperative that the value of search engines be understood. Thanks to the development of the internet and other technologies, it is extremely easy to communicate and share information globally. So easy, in fact, that our society is experiencing information overload. A quick example: over 10 million new books are written every year. If you think of the time it takes to write a book (I would guess an average of 1-5 years), that's pretty astounding. Now think of how many new websites are created every year…new web pages!…new blog posts!


If you were looking to buy the book Vita di Alberto Pisani by Carlo Dossi in Italian, and you didn't have the internet, how would you find and purchase it? Perhaps I'm naive, but I think accomplishing this task would be quite difficult and would take days, if not weeks or months. Now think of how you would accomplish the same task with the internet as a tool. How would you do it? Amazon? Google? Ebay? Regardless of how you find it, I can almost g.