MovableBlog: Grub Distributed Search Engine
March 19, 2003
"By having websites crawl their own content, and having volunteers donate their bandwidth and clock cycle resources, it decreases bandwidth consumption across the Internet dramatically, allows for pre-processing on the resulting data, and ultimately improves search results sent to end users."
Emphasis mine. Somehow I don't think bandwidth will be decreased dramatically, but rather merely redistributed. It also apparently respects robots.txt. Mark Pilgrim has yet to pronounce that it is an unwanted robot from hell and/or include it in his robots.txt. [first link via Scripting News]
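For the curious: "respecting robots.txt" just means the crawler checks a site's exclusion rules before fetching pages. Here's a minimal sketch in Python of how a site could block such a crawler, using the standard library's robots.txt parser. The user-agent string "grub-client" is an assumption for illustration; check the crawler's documentation for the actual token it honors.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a site might serve to turn away the
# distributed crawler while leaving other robots unaffected.
robots_txt = """\
User-agent: grub-client
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# The blocked agent is refused; agents with no matching rule are allowed.
print(rp.can_fetch("grub-client", "http://example.com/page.html"))  # False
print(rp.can_fetch("SomeOtherBot", "http://example.com/page.html"))  # True
```

A crawler that respects robots.txt runs exactly this kind of check before every fetch; one that doesn't earns its place on the "robots from hell" list.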