View Full Version : Spidering a directory - timeout after 10 documents

03-15-2004, 04:03 AM
Hi here is my config:

+ PhpDig 1.8.0
+ PHP Version 4.2.3
+ safe_mode: off
+ mysql 3.23.49
+ Server API: CGI
+ System: Linux infong 2.4.21 #1 SMP Wed Jul 30 09:58:54 CEST 2003 i686 unknown

+ Directories admin/temp, includes & text_content permissions set to 777.
+ Database holds the following tables: engine, excludes, keywords, logs, sites, spider & tempspider.
+ not behind any firewall

define('SPIDER_MAX_LIMIT',300); //max recurse levels in spider
define('SPIDER_DEFAULT_LIMIT',300); //default value
define('RESPIDER_LIMIT',300); //recurse limit for update


I am trying to spider a directory called "glossar" that contains 24 documents.
Beginning to spider with Search depth: 1

After about one minute the process stopped, and only 10 documents had been spidered.

What can I do?

Thanks for your support

tams (hamburg - germany)

03-15-2004, 06:47 AM
It sounds like your settings are all correct. The first time I spidered my site, the process seemed to hang in the same way. However, I cleared all the tables and started over, and the second time the spidering worked fine.
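
If you want to clear the tables directly in MySQL rather than through the admin pages, something like the following should work. This is only a sketch: it assumes the default table names from the list in the first post (no table prefix), and my comments on what each table holds are educated guesses, so back up the database first.

```sql
-- Empty the PhpDig index tables so the next spider run starts fresh.
-- Adjust the names if your installation uses a table prefix.
DELETE FROM tempspider;  -- queued/in-progress URLs left over from the aborted run
DELETE FROM spider;      -- indexed pages
DELETE FROM keywords;    -- extracted keywords
DELETE FROM engine;      -- keyword-to-page index entries
DELETE FROM logs;        -- spidering logs
-- Leave `sites` and `excludes` alone if you want to keep your
-- configured sites and exclusion rules.
```

After that, start the spider again from the admin panel.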

Try doing that, and let us know how it goes. :)

03-15-2004, 10:31 AM
I respidered and everything is OK.

Thank you