Multiple spiders
Hi,
Is it just a matter of opening another shell or browser tab to run another instance of the spider, or is there more to it? TIA for a response on this... |
I have installed the spider several times, in different directories, all using one database.
I think this is the fastest way to run more than one spider. I have spidered more than 19000 URLs in 3 weeks, using about 1 GB of traffic. Too many spiders make the server slow. Running searches while spidering with more than one spider at the same time can also make the results come back slowly. In my experience, with more than 5 spiders running at once the script hangs. I could save the db by copying it to another db, but then I had to reinstall the script and delete the old db. Marten :) |
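For reference, a minimal sketch of running two such installations from separate shells. It assumes the usual phpdig layout where each install has its own admin/spider.php that accepts a start URL on the command line; the paths and URLs below are only placeholders:

  # start each installation's spider in the background, one per shell or job
  (cd /var/www/phpdig1/admin && php -f spider.php http://www.example-one.com/) &
  (cd /var/www/phpdig2/admin && php -f spider.php http://www.example-two.com/) &

Because both installs point at the same database, the pages indexed by each process end up in the same search index.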
Thank you marb...
By the way... did you figure out how the wrapper.php works? I read through the comments in the script, but I'm still not sure how to implement it for multiple spiders and url.txt files.
|
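One way to give each installation its own URL list, without relying on wrapper.php internals, is a hedged sketch assuming spider.php also accepts a plain text file of URLs (one per line); the crontab times and paths here are placeholders:

  # crontab: each installation spiders its own URL list at a different hour
  0 2 * * * cd /var/www/phpdig1/admin && php -f spider.php /var/www/phpdig1/urls1.txt
  0 4 * * * cd /var/www/phpdig2/admin && php -f spider.php /var/www/phpdig2/urls2.txt

Staggering the start times keeps the runs from loading the server at the same moment, which matches the observation above that too many simultaneous spiders slow things down.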
tryangle wrote:
Quote:
The results I get from multiple spidering all come from one DB, no matter how many times or on which URLs I install the script. I do not use the text_content file; I have changed this in the config file. Quote:
http://phpdig.net/showthread.php?s=&threadid=796 and that works well. Marten :) |