#1
Green Mole
Join Date: Jan 2004
Posts: 1
Stop spidering site after using an amount of bandwidth
This may seem like an odd question, but when crawling a site (for instance http://www.xxx.com), is it possible to stop the spider after it has consumed a certain amount of the site's bandwidth?
I ask because my spider crawls sites hosted by a free webhost with a limited amount of bandwidth. A few days ago the spider got hung on one of these sites and used about 56MB of its bandwidth. You can imagine the owner of that site wasn't very happy about that.
#2
Head Mole
Join Date: May 2003
Posts: 2,539
Hi. The below is untested, but you might try making the following changes in the spider.php file. Of course, another alternative is to avoid crawling such sites or use a search depth of zero or one.
PHP Code:
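(The original PHP attachment did not survive in this archive. As a rough sketch of the idea being suggested, one could wrap the spider's page fetch in a running byte counter and skip further fetches once a cap is reached. The constant `SPIDER_MAX_BYTES` and the function `fetch_page_limited` below are illustrative names, not part of PhpDig's actual code.)

```php
<?php
// Hypothetical sketch: cap how many bytes the spider downloads per run.
// SPIDER_MAX_BYTES and fetch_page_limited are illustrative names only.
define('SPIDER_MAX_BYTES', 10 * 1024 * 1024); // stop after roughly 10 MB

$bytes_fetched = 0; // running total of bytes downloaded so far

// Fetch a page, but return false once the bandwidth budget is spent,
// so the crawl loop can skip the remaining URLs for this site.
function fetch_page_limited($url)
{
    global $bytes_fetched;

    if ($bytes_fetched >= SPIDER_MAX_BYTES) {
        return false; // budget exhausted; do not fetch anything else
    }

    $content = @file_get_contents($url);
    if ($content === false) {
        return false; // fetch failed; nothing counted
    }

    $bytes_fetched += strlen($content);
    return $content;
}
```

In spider.php the crawl loop would then call this wrapper instead of fetching pages directly, and treat a `false` return as "stop crawling this site". Note this counts only page bodies, not HTTP headers, so the real bandwidth used is slightly higher than the counter shows.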
__________________
Responses are offered on a voluntary, as-time-is-available basis; no guarantees. Double posting or bumping threads will not get your question answered any faster. No support via PM or email. Thank you for your understanding.
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post
Bandwidth requirements? | new2dev | How-to Forum | 1 | 02-17-2005 05:36 AM |
phpdig blocked when spidering any site | heli | Troubleshooting | 3 | 09-30-2004 10:42 AM |
Fixing spider.php, protecting from locking site after timeout or users stop | Konstantine | Mod Submissions | 3 | 04-09-2004 12:37 PM |
Spidering issue with my site | pager | Troubleshooting | 5 | 01-19-2004 10:05 AM |
Problems spidering dynamic site | Ph0nK | Troubleshooting | 1 | 01-13-2004 03:39 PM |