03-22-2004, 04:18 AM   #1
peterpeter
Max exec time of 30 seconds exceeded

Hi,

Recently I installed PhpDig 1.8.0. However, like others in this forum, I receive the following message:

Fatal error: Maximum execution time of 30 seconds exceeded in /xx/.../phpdig/admin/robot_functions.php on line nnn.

My host's configuration is Linux/Apache/PHP 4.3.2 with safe_mode = on. I can't change that myself, and I'm afraid the host won't either.
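For what it's worth, I checked the values PHP actually runs with using a small script; ini_get() reports the live settings:

    <?php
    // Print the two settings relevant to this error, as PHP sees them.
    echo 'safe_mode: ' . ini_get('safe_mode') . "\n";
    echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
    ?>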

Spidering seems to go well for most of the files, except for some large (450 kB) ones; in those cases the message appears. I have already tried spidering the files separately, but to no avail.

I've read the other threads on this execution-time subject, but they didn't clear things up for me.

Any suggestions on what may be the cause and, more importantly, how to resolve this?

Thanks, Peter.
03-23-2004, 12:30 AM   #2
Charter
Hi. The following is from php.net:

set_time_limit() has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.

For PhpDig to function fully, safe_mode should be set to off. However, you might find this thread useful.
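If the host is willing to make the change, the relevant php.ini lines would look like this (the 300 is just an example value):

    safe_mode = Off
    max_execution_time = 300   ; in seconds; the default is 30

With Apache's mod_php, a per-directory override in a .htaccess file (php_value max_execution_time 300) may also be worth asking about, if the host allows overrides.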
03-23-2004, 03:04 AM   #3
peterpeter
Hi Charter,

Thanks for the reply. I found the thread interesting, but it didn't help. Some additional information:

The largest file correctly spidered is 550 kB. The file giving me trouble is not the largest one (420 kB), but it does contain a lot of HTML tags, in particular internal links (links to named anchors). The fatal error always points to robot_functions.php line 141, where the tags are cleaned.

Any suggestions on how this process can be sped up, or on what else I can do?

Peter
03-23-2004, 05:17 AM   #4
Konstantine
Maybe this thread will interest you.
03-23-2004, 07:11 AM   #5
peterpeter
Thanks Konstantine, but that didn't help either.

However, I found a quick-and-dirty solution:

1. I temporarily removed all <A HREF ...> and <A NAME ...> tags from the file.
2. Replaced the original file with the adapted version.
3. Spidered the adapted version, which went great!
4. Replaced the updated file with the original one.

It does work for me, but even though the files on my site change only every few weeks, I'm not looking forward to doing this trick every time.

So if anyone has an idea for a clean solution...
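In the meantime, steps 1 and 2 could at least be scripted. A rough sketch (the file names are placeholders for the real paths) that strips the anchor tags but keeps the text between them:

    <?php
    // Placeholder file names; adjust to the real paths.
    $html = file_get_contents('page_original.html');

    // Remove opening <a href="..."> / <a name="..."> tags and closing </a>
    // tags, leaving the text between them intact.
    $html = preg_replace('/<\/?a(\s[^>]*)?>/i', '', $html);

    // PHP 4 has no file_put_contents(), so write the stripped copy the old way.
    $fp = fopen('page_for_spider.html', 'w');
    fwrite($fp, $html);
    fclose($fp);
    ?>

Point the spider at the stripped copy, then swap the original back, which is what I do by hand now.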

Peter