Below is part of a recent report from the excellent Search Engine Watch, written up from their attendance at the Search Engine Strategies Conference in New York earlier this year. Danny Sullivan, Editor of Search Engine Watch, is possibly the UK’s greatest search engine import (a born-and-bred Californian now living near Salisbury in Wiltshire), and a subscription to the members’ area is remarkable value for money. With that blatant plug over, I hope they’ll forgive my almost verbatim use of part of that report below.

I have used the section that is from-the-horse’s-mouth Google-speak, confirming the importance of building web sites for the Google crawler, Googlebot. Readers who know me will recall that I often point out that Google approves of ethical search engine optimisation, because it helps Google present relevant information effectively to its visitors.

Craig Nevill-Manning, Senior Research Scientist at Google, apparently provided an overview of Google’s ranking process and elaborated on a few points to help webmasters. From here on it gets pretty verbatim, so just remember that subscription!

Nevill-Manning stated that the PageRank of a page depends on the aggregate importance of all the pages pointing to it. He said that this is one significant factor in the ranking, “so that for the same query, the different pages with essentially the same content we chose the one that has the best reputation in terms of the best reputation of others linking to it.”
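
Nevill-Manning did not give any maths, but for readers who want a little more behind “aggregate importance”, the formula in Brin and Page’s original PageRank paper (background only; the live system is doubtless more elaborate) is roughly:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where T1 to Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, usually quoted as 0.85. In plain English: each link passes on a share of the linking page’s own reputation, divided among everything that page links to.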

To many of you, including my clients, this may not be news. The important difference is that here it is being confirmed by the source.

Nevill-Manning went on to explain that the other side of the ranking function is text analysis. “Google looks at the words on the page, the links, the text of the links pointing to that page, and various other items on the page like proximity of adjacent words and so on”. Nevill-Manning stated that there are about 100 additional factors considered, and that those factors are constantly being tweaked to make the ranking more relevant.

Like Yahoo, Google updates its index frequently. Google looks for content that has changed recently or that changes regularly over time. For news updates, Google has developed a separate news crawl that can update on a minute-by-minute basis.

Nevill-Manning devoted a large portion of his briefing to providing tips specifically oriented toward webmasters. Here’s a top-level summary:

1. Create good content. Nevill-Manning encouraged webmasters to create sites with appropriate and relevant content. The Google linking theory maintains that if other people like your site, they will link to you organically and you will attract just the kind of page rank that Google values.

2. Get links from relevant sources. Nevill-Manning discouraged webmasters from getting everyone to link to their site (random links) and instead encouraged webmasters to get related sites to link to their site. He said the focus should be on the user and how useful to them these links will be.

3. Get proper directory representation. Nevill-Manning recommended that webmasters submit their site to web directories and to make sure their site is represented in appropriate places within the web directories.

4. Use 301 redirects when moving content. Nevill-Manning emphasized that if your pages are moving, or your whole site is moving, use a 301 (permanent) redirect rather than a 302. According to Nevill-Manning, Google will interpret the 301 more appropriately (there is a minimal sketch of this after the list).

5. Protect your bandwidth. Nevill-Manning mentioned that if you find Googlebot is using too much of your server’s bandwidth, you can tell Google to back off. He recommended supporting the standard HTTP If-Modified-Since header, so that when Googlebot asks whether a page has changed since its last visit, the server can say it hasn’t without re-sending the whole page (again, a sketch follows the list).

6. Isolate “no index” pages. Webmasters often have pages they don’t want indexed by Google (these might include CGI scripts, web logs, or pages that duplicate existing site pages). Keeping them in their own directories makes them easy to exclude from crawling (see the last sketch after this list).
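
On tip 4, here is a minimal sketch of what a permanent redirect looks like in practice, written with Python’s standard http.server module purely for illustration. The old and new URLs are made up, and on a real site you would normally configure this in the web server itself rather than in application code.

from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of moved pages to their new homes.
MOVED = {
    "/old-page.html": "http://www.example.com/new-page.html",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED:
            # 301 tells Googlebot the move is permanent, so the old
            # URL's standing can be carried over to the new one.
            self.send_response(301)
            self.send_header("Location", MOVED[self.path])
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()

The only part that matters to Google is the 301 status code plus the Location header; a 302 in the same place would tell Googlebot the move is only temporary.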
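
On tip 5, the bandwidth saving comes from answering Googlebot’s conditional requests properly: when the crawler sends an If-Modified-Since header and the page has not changed, the server can reply 304 Not Modified with no body at all. A sketch along the same lines as the previous one (the document root and port are made up, and there is no security hardening here):

import os
from datetime import datetime, timezone
from email.utils import formatdate, parsedate_to_datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

DOCROOT = "/var/www/html"  # hypothetical document root; adjust to suit

class ConditionalGetHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Illustration only: no path sanitisation or content-type detection.
        path = os.path.join(DOCROOT, self.path.lstrip("/"))
        if not os.path.isfile(path):
            self.send_response(404)
            self.end_headers()
            return
        # Truncate to whole seconds to match the resolution of HTTP dates.
        mtime = datetime.fromtimestamp(
            os.path.getmtime(path), timezone.utc
        ).replace(microsecond=0)
        since = self.headers.get("If-Modified-Since")
        if since:
            try:
                if mtime <= parsedate_to_datetime(since):
                    # Unchanged since Googlebot last fetched it: send no body.
                    self.send_response(304)
                    self.end_headers()
                    return
            except (TypeError, ValueError):
                pass  # unparseable date: fall through and send the page
        self.send_response(200)
        self.send_header("Last-Modified", formatdate(mtime.timestamp(), usegmt=True))
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        with open(path, "rb") as f:
            self.wfile.write(f.read())

if __name__ == "__main__":
    HTTPServer(("", 8000), ConditionalGetHandler).serve_forever()

Most web servers already do this for plain static files, so a sketch like this mainly matters for dynamically generated pages.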
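
On tip 6, the report doesn’t spell out the mechanism, but the standard way to keep crawlers out of isolated areas is a robots.txt file at the site root. The paths below are invented examples; the snippet is in Python only so it sits alongside the other sketches, and you could just as easily create the file by hand.

# Writes a minimal robots.txt that asks well-behaved crawlers such as
# Googlebot to stay out of areas you do not want indexed. Paths are
# hypothetical examples.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /logs/
Disallow: /print-versions/
"""

with open("robots.txt", "w") as f:
    f.write(ROBOTS_TXT)

robots.txt keeps pages from being crawled at all; for a page that must stay reachable but should not appear in results, the meta robots noindex tag is the other standard option.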

Nevill-Manning also warned webmasters of a few practices not to follow.

1. Don’t cloak. Nevill-Manning described cloaking as giving Google different information than you’d give a web surfer.

2. Don’t run automated queries. Another Google no-no is using programs that automate queries against the search engine. Google considers such programs a violation of its terms of service and has devised methods to block offenders.

3. Don’t hide text or links on a page. Nevill-Manning reminded webmasters not to use any method that subverts the way Google ranks pages, such as hiding text or links on a page.

Google offers more helpful webmaster information on its site.

As I said at the beginning of this piece, much of this is effectively common knowledge within the search engine optimisation community. The reason for publishing it like this is that there is nothing quite like the ‘Creator’ confirming what has previously been established through experiment.

Filed in Blog, July 22nd, 2004.