#1
Green Mole
Join Date: Feb 2006
Posts: 3
Problem with robot.txt file
Hi,

I have a robot.txt file at the root of my website that I use to exclude some paths from being indexed. Let's say I want to exclude the folder /en/ and all its content from being indexed. This is what I'm using in the robot.txt:

User-agent: *
Disallow: /en/

My problem is that PHPdig is still indexing all the files under /en/. Is there anything I could do?
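The Disallow rule above is syntactically fine; a quick sanity check is possible with Python's standard-library robots.txt parser. A minimal sketch (the example.com URLs are hypothetical placeholders):

```python
from urllib import robotparser

# The same rules quoted in the post above.
rules = """User-agent: *
Disallow: /en/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A path under /en/ should be blocked for every user agent...
print(rp.can_fetch("*", "http://example.com/en/page.html"))  # False
# ...while other paths remain crawlable.
print(rp.can_fetch("*", "http://example.com/index.html"))    # True
```

Since the rules themselves check out, the problem must lie elsewhere, e.g. in how (or from where) the crawler reads the file.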
#2
Green Mole
Join Date: Feb 2006
Posts: 3
I just found my problem...
I was naming the file robot.txt instead of robots.txt.