Crawler speed improvement (also affects the page limit)
I had a problem where phpdigExplore() returned too many duplicate links. This caused the spider to check hundreds of duplicate URLs, which slowed it down, and the 1000-page limit was hit quite fast.
To fix this, I added de-duplication code at the end of phpdigExplore(). PHP Code:
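A minimal sketch of that kind of de-duplication step, assuming the links collected by phpdigExplore() are held in an array named $urls (a hypothetical name) just before the function returns; the exact snippet may differ:

// At the end of phpdigExplore(): drop duplicate URLs so the spider
// does not queue and re-check the same link hundreds of times.
// $urls is a hypothetical name for the array of collected links.
$urls = array_values(array_unique($urls));
return $urls;

array_unique() keeps only the first occurrence of each URL, and array_values() re-indexes the array so any later code that expects sequential keys still works.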