Grefix
01-14-2004, 06:58 AM
This may seem like an odd question, but when crawling a site (for instance http://www.xxx.com), is it possible to stop the spider after it has consumed a certain amount of the site's bandwidth?
I ask this because my spider crawls sites hosted by a free webhost with a limited amount of bandwidth. A few days ago the spider got hung on one of these and used about 56MB of its bandwidth. You can imagine the owner of that site wasn't very happy with that.
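One way to do this is to keep a running byte counter per site and stop fetching once it crosses a cap. Here is a minimal sketch in Python; `fetch_page`, `crawl_site`, and the 5 MB limit are all assumptions for illustration, not part of any particular spider's API:

```python
# Minimal sketch of a per-site bandwidth cap for a spider.
# fetch_page is a stand-in for your real HTTP download routine;
# the function names and the 5 MB limit below are assumptions.

MAX_BYTES_PER_SITE = 5 * 1024 * 1024  # hypothetical cap: ~5 MB per host

def crawl_site(urls, fetch_page, budget=MAX_BYTES_PER_SITE):
    """Crawl URLs from one site, halting once the byte budget is spent.

    Returns the pages fetched so far and the total bytes used.
    """
    used = 0
    pages = []
    for url in urls:
        body = fetch_page(url)        # download one page
        used += len(body)             # charge its size against the budget
        pages.append((url, body))
        if used >= budget:
            break                     # cap reached: stop spidering this site
    return pages, used
```

A stricter variant would also abort mid-download (e.g. by reading the response in chunks and checking the counter per chunk), so a single huge file can't blow past the cap the way your 56MB hang did.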