PhpDig.net

PhpDig.net (http://www.phpdig.net/forum/index.php)
-   How-to Forum (http://www.phpdig.net/forum/forumdisplay.php?f=33)
-   -   Exclude filenames with certain attributes? (http://www.phpdig.net/forum/showthread.php?t=1684)

the_hut2 12-30-2004 07:32 AM

Exclude filenames with certain attributes?
 
All installed and spidering nicely. However, the directory in which the content resides contains a mail archive of 8000 messages, each of which has its own .html file. It also contains an index for every 15 messages, and each index likewise has its own .html file.

If the files are in the form:

http://www.mydomain.com/message001.html
http://www.mydomain.com/message002.html
http://www.mydomain.com/message003.html
etc

and

http://www.mydomain.com/index001.html
http://www.mydomain.com/index002.html

is it possible to restrict the spider so that any file whose name contains the characters "index" is ignored or, alternatively, to restrict the spider so that it only indexes files whose names start with the characters "message"?

I had hoped that robots.txt would be the answer, but you cannot use wildcards to specify exclusions. Typing in the path of every file beginning with "index" is not an option (there are over 150 of them), and in any case the content is updated every day....

Any help MUCH appreciated

rAdoN 12-30-2004 01:29 PM

In the config file - see

http://www.phpdig.net/forum/showthread.php?t=1659

the_hut2 12-30-2004 04:00 PM

Thanks for your help, which is much appreciated, but I am still failing. Here is what I did:

I commented out the old FORBIDDEN_EXTENSIONS line and replaced it with

PHP Code:

define('FORBIDDEN_EXTENSIONS','(*index*|guestbook|\.(html|cgi|php|asp|pl|rm|ico|cab|swf|  css|gz|z|tar|zip|tgz|msi|arj|zoo|rar|r[0-9]+|exe|bin|pkg|rpm|deb|bz2)$)'); 

The difference between this and the elements in your post in this thread (http://www.phpdig.net/forum/showthread.php?t=1659) is that I added *index* before the \ and html after it. I also deleted the space.

Then, in the robot_functions.php file, I changed the 3 instances of $regs[5] around the !eregi(BANNED,$regs[5]) piece to $regs[2].

However, the spider still continues to index files named index123.html

Where am I going wrong?
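A likely culprit worth noting: phpDig matches FORBIDDEN_EXTENSIONS with eregi(), which treats it as a POSIX regular expression, so the shell-style wildcard *index* is not valid regex syntax at all. A quick sketch (in Python standing in for PHP, since the regex semantics are the same) shows the wildcard-style pattern fails to compile, while a bare substring is all a regex needs, because regex matching is unanchored by default:

```python
import re

# In a regex, * means "repeat the previous token", so a leading shell-style
# wildcard like *index* is a syntax error, not "anything, index, anything".
try:
    re.compile(r'(*index*|guestbook)')
    print('compiled')
except re.error as err:
    print('invalid pattern:', err)

# A bare substring is enough: regex search is unanchored by default.
print(bool(re.search(r'index', 'http://www.mydomain.com/index001.html')))
```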

rAdoN 12-30-2004 04:22 PM

like this

PHP Code:

define('FORBIDDEN_EXTENSIONS','(index[0-9]+\.html|guestbook|\.(cgi|php|asp|pl|rm|ico|cab|swf|css|gz|z|tar|zip|tgz|msi|arj|zoo|rar|r[0-9]+|exe|bin|pkg|rpm|deb|bz2)$)'); 

Remove the space.

Remove cgi|php|asp|pl| if you want to index links that end with those extensions.

You need to go to the admin panel: update, delete the links you don't want, run the cleans, edit FORBIDDEN_EXTENSIONS, and then index.
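The corrected pattern can be sanity-checked outside phpDig. A minimal sketch in Python (using re.IGNORECASE to approximate phpDig's case-insensitive eregi()), run against the message/index URLs from the first post plus a hypothetical script.cgi:

```python
import re

# The corrected FORBIDDEN_EXTENSIONS pattern from the post above.
pattern = re.compile(
    r'(index[0-9]+\.html|guestbook|\.(cgi|php|asp|pl|rm|ico|cab|swf|css'
    r'|gz|z|tar|zip|tgz|msi|arj|zoo|rar|r[0-9]+|exe|bin|pkg|rpm|deb|bz2)$)',
    re.IGNORECASE,
)

urls = [
    'http://www.mydomain.com/message001.html',  # should be crawled
    'http://www.mydomain.com/index001.html',    # should be skipped
    'http://www.mydomain.com/script.cgi',       # should be skipped
]

for url in urls:
    verdict = 'skip' if pattern.search(url) else 'crawl'
    print(url, '->', verdict)
```

Note that .html is deliberately absent from the extension alternation, so message001.html survives, while index001.html is caught by the index[0-9]+\.html branch.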

rAdoN 12-30-2004 04:37 PM

Quote:

Originally Posted by the_hut2
Then, in the robot_functions.php file, I changed the 3 instances of $regs[5] around the !eregi(BANNED,$regs[5]) piece to $regs[2].

PS - not all three, just !eregi(BANNED,$regs[5]) to !eregi(BANNED,$regs[2]) - that change only matters for the ban words in BANNED, applying them to the whole link instead of just the domain.
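A sketch of the distinction described here, assuming, as the replies imply, that $regs[5] holds only the domain while $regs[2] holds the full link. The ban word and URLs below are hypothetical, and the code is illustrative Python rather than phpDig's actual PHP:

```python
import re

# Hypothetical ban word; in phpDig, BANNED holds words that disqualify a link.
BANNED = r'index[0-9]+'

domain_only = 'www.mydomain.com'                     # what $regs[5] would hold
full_link = 'http://www.mydomain.com/index001.html'  # what $regs[2] would hold

# Matching against the domain alone never sees the filename...
print(bool(re.search(BANNED, domain_only, re.IGNORECASE)))
# ...matching against the full link does.
print(bool(re.search(BANNED, full_link, re.IGNORECASE)))
```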

the_hut2 12-31-2004 02:17 AM

Great! That works! :banana:

Thanks for your help.



Powered by vBulletin® Version 3.7.3
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
Copyright © 2001 - 2005, ThinkDing LLC. All Rights Reserved.