Stop spidering site after using an amount of bandwidth
This may seem like an odd question, but when crawling a site (for instance http://www.xxx.com), is it possible to stop the spider after it has consumed a certain amount of the site's bandwidth?
I ask this because my site spiders sites hosted by a free webhost with a limited amount of bandwidth. A few days ago the spider got hung on one of these and used about 56MB of its bandwidth. You can imagine the owner of that site wasn't very happy with that.
Hi. The code below is untested, but you might try making the following changes in the spider.php file. Of course, another alternative is to avoid crawling such sites, or to use a search depth of zero or one.
PHP Code:
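A rough, untested sketch of the idea — keep a running byte count and bail out of the crawl once it passes a cap. Note that `SPIDER_MAX_BYTES`, `$bytesDownloaded`, and `checkBandwidth()` are illustrative names for this sketch, not variables or functions in the stock spider.php:

```php
<?php
// Hypothetical per-site cap in bytes (not an actual spider.php setting).
define('SPIDER_MAX_BYTES', 5 * 1024 * 1024); // stop after roughly 5 MB

// Call this inside the spider's fetch loop, once per downloaded page.
// Returns false when the cap is reached so the caller can stop crawling.
function checkBandwidth($pageContent, &$bytesDownloaded)
{
    // strlen() on the raw response body approximates bytes transferred
    // (headers and overhead are not counted in this sketch).
    $bytesDownloaded += strlen($pageContent);

    if ($bytesDownloaded >= SPIDER_MAX_BYTES) {
        echo "Bandwidth limit reached; stopping spider for this site.\n";
        return false;
    }
    return true;
}

// Example of how the fetch loop might use it:
$bytesDownloaded = 0;
foreach ($urlsToCrawl as $url) {
    $pageContent = file_get_contents($url); // or however spider.php fetches pages
    if ($pageContent === false) {
        continue; // fetch failed; skip this URL
    }
    // ... index the page here ...
    if (!checkBandwidth($pageContent, $bytesDownloaded)) {
        break; // cap hit: abandon the rest of this site's URLs
    }
}
?>
```

You would reset `$bytesDownloaded` to zero each time the spider moves on to a new site, so the cap applies per host rather than per run.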
Powered by vBulletin® Version 3.7.3
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright © 2001 - 2005, ThinkDing LLC. All Rights Reserved.