
View Full Version : Multiple spiders


tryangle
04-21-2004, 04:26 AM
Hi,

Is it just a matter of opening another shell, or browser tab to run another instance of the spider... or is there more to it?

Tia for a response on this...

marb
04-22-2004, 10:09 AM
I have installed the spider several times, in different directories, all using one database.
I think this is the fastest way to run more than one spider.
I have spidered more than 19,000 URLs in 3 weeks, which took 1 GB of traffic.
Too many spiders make the server slow.
Searching while spidering with more than one spider running can also make results come back slowly.
In my experience, using more than 5 spiders at the same time makes the script hang.
I could save the db by copying it to another db, but then I had to reinstall the script and delete the old db.
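A minimal sketch of that setup, assuming two standard PhpDig installs that share one database. The directory paths, the URL-list filenames, and invoking admin/spider.php with a list file from the CLI are all assumptions, not a confirmed PhpDig invocation; check your own install's admin docs:

```shell
#!/bin/sh
# Hypothetical layout: two separate PhpDig installs, one shared DB.
# Each install crawls its own URL list; both run in parallel.
for dir in /var/www/phpdig1 /var/www/phpdig2; do
    # Run each spider in a subshell, in the background, logging per install.
    ( cd "$dir/admin" && php -f spider.php urls.txt > spider.log 2>&1 ) &
done
wait  # block until every spider process has finished
```

As marb notes above, keep the number of parallel spiders low (he saw hangs above 5), since they all compete for the same database and server resources.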

Marten :)

tryangle
04-23-2004, 06:35 PM
By the way... did you figure out how the wrapper.php (http://www.phpdig.net/showthread.php?threadid=662) works? I read through the comments in the script, but I'm still not sure how to implement it for multiple spiders and url.txt files.

marb
04-24-2004, 02:43 AM
tryangle wrote:

By the way... did you figure out how the wrapper.php works? I read through the comments in the script, but I'm still not sure how to implement it for multiple spiders and url.txt files.
No, I don't work with wrapper.php yet, but I want to try it out.
The results I get from multiple spidering come from one DB,
no matter how many times or at which URLs I install the script.
I don't use the text_content files; I have changed this in the config file:

define('SUMMARY_LENGTH',50000); //length of results summary

define('TEXT_CONTENT_PATH','text_content/'); //Text content files path
define('CONTENT_TEXT',0); //Activates/deactivates the
//storage of text content.
It was an idea of René Haentjes, see:
http://phpdig.net/showthread.php?s=&threadid=796
and that works well.

Marten :)