404 error via shell... no pages indexed
When spidering via the web interface I had no problems: the spider found links and indexed the pages. But when I run it via SSH:
--------------------------------------------------------------------------
HTTP/1.1 404 Not Found - http://www.chile-empresas.cl/robots.txt
See http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html for explanation.
404s are either dead links or something looked like a link to PhpDig so PhpDig tried to crawl it.
--------------------------------------------------------------------------

This is part of my spider.log. I have a cronlist.txt file containing a list of URLs, but this message appears for every site and 0 pages are indexed. My spidering settings are: Search Depth: 5, Links Per: 10, Days After: 7, Reindex Depth: 5.
That is just a warning message, and nothing to be concerned about. Check out this thread. :)
Oh! Thanks...
But when viewing the shell console, I see messages like this: PHP Warning: eregi() REG_EMPTY in /home/.../robot_functions.php on line 1252. What does this mean? I get a lot of these messages.