#1
Green Mole
Join Date: Jan 2006
Posts: 1
Incomplete spidering
I have not been able to find any other posts that answer my question.
The spidering stops with no error message after anywhere from roughly 25 seconds to 4 minutes. Each time I spider, I usually end up clicking 'stop spider', and the number of pages in the database has not gone up. I am indexing via a browser (Firefox) and have set:

Code:
network.http.keep-alive.timeout = 600

Code:
max_execution_time = 600
max_input_time = 600

I entered 1 link.

Code:
search depth: 1
links per: 0

Some other settings:

Code:
define('SPIDER_MAX_LIMIT',20);      //max recurse levels in spider
define('RESPIDER_LIMIT',5);         //recurse respider limit for update
define('LINKS_MAX_LIMIT',20);       //max links per each level
define('RELINKS_LIMIT',5);          //recurse links limit for an update
//for limit to directory, URL format must either have file at end or ending slash at end
//e.g., http://www.domain.com/dirs/ (WITH ending slash) or http://www.domain.com/dirs/dirs/index.php
define('LIMIT_TO_DIRECTORY',false); //limit index to given (sub)directory, no sub dirs of dirs are indexed
define('LIMIT_DAYS',0);             //default days before reindex a page
define('SMALL_WORDS_SIZE',2);       //words to not index - must be 2 or more
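One thing that may help diagnose this: if the script is being killed silently on the server side (for example by a hard execution limit the host enforces despite the raised values above), turning on PHP error logging will usually surface the reason for the stop. A minimal php.ini sketch, assuming you can edit php.ini on the server; the log path is a placeholder, not from the original post:

Code:
; hypothetical diagnostic settings - adjust the log path for your server
log_errors = On
error_log = /path/to/php_errors.log
; display_errors can also be turned on temporarily while debugging
display_errors = On

After a run that stops early, checking the log for a "Maximum execution time exceeded" or "Allowed memory size exhausted" fatal error would distinguish a PHP limit from a network timeout.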