Hi. Some users have run spider.php on different (sub)domains at the same time against the same database tables without incident. However, PhpDig does not specifically account for multithreading (concurrent crawl) issues, so there is no guarantee against conflicts.
If you want to try running PhpDig in a distributed fashion on the same domain, you might set the following in the config.php file, where X is one or two:
PHP Code:
define('SPIDER_MAX_LIMIT',X); //max recurse levels in spider
define('SPIDER_DEFAULT_LIMIT',X); //default value
define('RESPIDER_LIMIT',X); //recurse limit for update
define('LIMIT_DAYS',0); //default days before reindexing a page
and try entering the site at different spots, for example:
Code:
prompt> php -f spider.php http://www.domain.com/dir1/ &
prompt> php -f spider.php http://www.domain.com/dir2/ &
The trailing & backgrounds each process and returns you to the shell prompt, so both crawls run at the same time.
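If you launch the crawls from a script (say, a cron job), you may want the script to pause until every backgrounded spider has finished before doing anything else. A minimal sketch of those mechanics is below; it uses sleep as a stand-in for spider.php, since an actual run requires a working PhpDig install, and the URLs in the comments are just the example entry points from above.

```shell
#!/bin/sh
# Launch two background jobs, as the spider commands above would be launched.
sleep 1 &              # stands in for: php -f spider.php http://www.domain.com/dir1/ &
pid1=$!                # $! holds the PID of the most recent background job
sleep 1 &              # stands in for: php -f spider.php http://www.domain.com/dir2/ &
pid2=$!
wait "$pid1" "$pid2"   # block until both background crawls have finished
done_msg="both crawls finished"
echo "$done_msg"
```

Using wait this way lets a follow-up step (cleanup, log mailing, etc.) run only after all the parallel crawls complete, instead of while they are still indexing.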