Limiting the number of pages indexed per site
Hi people.
When I index a web site, I'd like to limit the maximum number of pages indexed per site. For example, I would index only 20 pages on site A, 100 on site B, and so on. This can be useful to limit the indexing of huge web sites. Do you agree? Best regards, JÿGius³
Sorry, I don't agree with this. What for? If a user searches for a word and that word is on page 21, it won't be in the index.
Why would you index only part of a site by limiting the number of pages? -Roland-
Hi. I haven't tested the code below, but what it should do is limit the number of links found per page to a maximum of 20, so each indexed page will only have at most 20 links to follow. This is a per-page rather than per-site adjustment, so if you want a maximum of 100 links for one site, you'll need to adjust the added line below and/or set a different search depth level.
In spider.php, find the following: PHP Code:
PHP Code:
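Roughly, the idea is something like this (an untested sketch; $page_links and $max_links_per_page are placeholder names, not necessarily the identifiers spider.php actually uses):

PHP Code:
<?php
// Untested sketch: cap how many links are followed from a single page.
// $page_links stands in for whatever array the spider builds after parsing
// a page; it is not necessarily the variable name used in spider.php.
$page_links = array('http://example.com/a', 'http://example.com/b' /* ... */);
$max_links_per_page = 20;

if (count($page_links) > $max_links_per_page) {
    // Keep only the first 20 links; anything past the cap is never queued.
    $page_links = array_slice($page_links, 0, $max_links_per_page);
}

Combined with the search depth setting, this indirectly limits how many pages per site get indexed (roughly links-per-page to the power of the depth), which is why changing either value changes the effective per-site limit.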
Another thing that seems to be working for me, and which limits the total number of linked pages written to the database, is to find the line:
PHP Code:
PHP Code:
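As a rough sketch of that total cap, assuming the spider appends each discovered link to the $links_found array mentioned below ($url, the setup, and the 250 limit placement are illustrative only):

PHP Code:
<?php
// Rough sketch: stop collecting links once a global total is reached.
// $links_found is the array named in this thread; $url and the setup
// around it are illustrative only.
$links_found = array();
$max_total_links = 250;
$url = 'http://example.com/page.html';

if (count($links_found) < $max_total_links) {
    $links_found[] = $url;   // below the cap: keep the link for crawling
}
// Once the cap is reached, further links are simply skipped.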
Whoops, this works for finding 250 links total, but if you want 250 links per site, you have to reset the $links_found array. So, after this:
PHP Code:
PHP Code:
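In other words, clear $links_found each time the spider moves on to the next site. A rough sketch, assuming spider.php loops over the sites to index somewhere ($sites and the crawl call are placeholders; only the $links_found reset is the actual change):

PHP Code:
<?php
// Rough sketch: clear the link list before each new site so the 250-link
// cap applies per site rather than per indexing run.
$sites = array('http://site-a.example/', 'http://site-b.example/');

foreach ($sites as $site) {
    $links_found = array();              // reset the list for this site
    // crawl_site($site, $links_found);  // placeholder for the actual crawl call
}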