A Firefox extension called “Fasterfox” pre-fetches links on a web page so that if you click one, it loads much faster because it has already been downloaded. This is great for the surfer, but it is a major problem for webmasters with limited bandwidth: Fasterfox can overload a server much faster than a visitor viewing the same content without it.
The latest version of Fasterfox, v1.0.3, checks for a robots.txt file on the site the visitor is browsing to decide whether or not it should pre-fetch. This new feature lets webmasters add the following text to their robots.txt file to prevent Fasterfox from pre-fetching links (at the cost of slightly slower browsing for Fasterfox users). Text To Add To “robots.txt”:
User-agent: Fasterfox
Disallow: /
Adding these two lines anywhere in your robots.txt file and placing the file in your site’s root folder (i.e. yourwebsite.com/robots.txt) will prevent Fasterfox from pre-fetching links anywhere on your site. Webmasters can also modify the text so that Fasterfox is only blocked from pre-fetching in specific directories.
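For example, to block pre-fetching only under certain directories (the directory names below are just placeholders; substitute your own), list them individually instead of disallowing the whole site:

User-agent: Fasterfox
Disallow: /images/
Disallow: /downloads/

With rules like these, Fasterfox would still pre-fetch links elsewhere on the site but skip anything under the listed paths.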
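If you want to verify that your rules do what you expect, Python’s standard urllib.robotparser module applies the same kind of user-agent matching. This is only a sketch of the idea, not Fasterfox’s actual code, and the site URL and page path are placeholders:

```python
from urllib.robotparser import RobotFileParser

# The same two lines recommended above for blocking Fasterfox site-wide.
robots_txt = """User-agent: Fasterfox
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Fasterfox is disallowed everywhere; other user agents are unaffected.
print(parser.can_fetch("Fasterfox", "http://example.com/page.html"))  # False
print(parser.can_fetch("Mozilla", "http://example.com/page.html"))    # True
```

A well-behaved pre-fetcher performs essentially this check before downloading a link in the background.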