acidelic
04-27-2004, 09:58 AM
I'm encountering a problem getting the spider to work. Creating a robots.txt file did not seem to help.
Here is my robots.txt:
User-agent: *
Disallow: /include/
Telling it to spider localhost or the hostname of the local server (depth=5) always results in this:
Spidering in progress...
--------------------------------------------------------------------------------
SITE : http://localhost/
Exclude paths :
- include/
No CPU cycles are being used and the page is fully loaded. What else can I check? How will I know that the spider process is actually doing something?
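For what it's worth, the robots.txt above looks valid — it only blocks /include/, not the site root. A quick way to sanity-check the rules is Python's standard urllib.robotparser (a sketch; the host and paths are just the ones from this post):

```python
import urllib.robotparser

# Parse the same rules as the robots.txt quoted above.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /include/",
])

# A well-behaved spider should still fetch the root,
# and only skip paths under /include/.
print(rp.can_fetch("*", "http://localhost/"))          # True
print(rp.can_fetch("*", "http://localhost/include/"))  # False
```

So the exclusion rules themselves shouldn't stop the spider from crawling the rest of the site.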
Thanks