Shell mode not working


bloodjelly
03-17-2004, 03:32 PM
Hi -

I've been running this command fine for a while, so I know it worked at one point:



PHP:
--------------------------------------------------------------------------------

exec("/usr/bin/php -f /path/to/spider.php $site >> /dev/null &");

--------------------------------------------------------------------------------


where $site = "http://www.mysite.com/"

This worked great until I recently upgraded to a newer version of PHP, and now the command doesn't produce any results. I tried exec('whoami'), which worked, so I know exec() still functions properly. The path to PHP is correct as well. Any ideas?
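
One way to see what the newer PHP binary is actually complaining about (a debugging sketch only, reusing the paths from above) is to temporarily drop the background "&" and the /dev/null redirect so exec() can capture the CLI's own output and exit code:

PHP:
--------------------------------------------------------------------------------

<?php
// Debugging sketch: run the same command in the foreground and capture
// stdout+stderr plus the exit code, instead of discarding them.
$site = "http://www.mysite.com/";
$cmd  = "/usr/bin/php -f /path/to/spider.php " . escapeshellarg($site) . " 2>&1";

exec($cmd, $output, $return);
echo "exit code: $return\n";
echo implode("\n", $output) . "\n"; // any CLI error (bad path, missing library) shows up here
?>

--------------------------------------------------------------------------------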

bloodjelly
03-21-2004, 03:35 PM
I figured out that the problem comes from this code in the configuration file:

if ((isset($relative_script_path)) && ($relative_script_path != ".") && ($relative_script_path != "..")) {
    exit();
}

Without this check, the script runs fine. I put the exec() script in the search root directory and set $relative_script_path = '.', but it didn't work. Hmm...

Charter
03-21-2004, 03:50 PM
>> exec("/usr/bin/php -f /path/to/spider.php $site >> /dev/null &");

Hi. If you run spider.php from the admin directory, it should be okay, but if you run spider.php from another directory, try adding ../path/to as another allowed value in that check in the configuration file.
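
For example, the check might be loosened along these lines (a sketch only; "../path/to" is a placeholder for wherever spider.php actually lives relative to the calling script, not a literal value):

PHP:
--------------------------------------------------------------------------------

// Sketch of the adjusted configuration check: also allow the relative
// path used when spider.php is exec()'d from outside the admin directory.
if ((isset($relative_script_path))
    && ($relative_script_path != ".")
    && ($relative_script_path != "..")
    && ($relative_script_path != "../path/to")) {
    exit();
}

--------------------------------------------------------------------------------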

bloodjelly
03-21-2004, 04:40 PM
OK, it works in the admin directory. Thanks, Charter. :)

bloodjelly
03-21-2004, 09:57 PM
Spoke too soon -

It's having weird problems now. It only spiders one site at a time, even when I run multiple execs. Sometimes the processes keep running after I close the browser and sometimes they don't, even though all of them are told to run in the background with the command I gave.

I'm going insane:bang:
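
For reference, "running multiple execs" here means roughly this (a sketch with placeholder URLs, built from the command in the first post):

PHP:
--------------------------------------------------------------------------------

<?php
// Sketch: fire one background spider per site, as described above.
// The URLs are placeholders; each command ends in "&" so each spider
// should keep running on its own, yet only one site gets spidered.
$sites = array("http://www.site-one.com/", "http://www.site-two.com/");

foreach ($sites as $site) {
    exec("/usr/bin/php -f /path/to/spider.php " . escapeshellarg($site) . " >> /dev/null &");
}
?>

--------------------------------------------------------------------------------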

Charter
03-23-2004, 12:19 AM
Hi. Closing the browser does not necessarily stop a process. To make sure a process stops, perhaps use kill -9 PID, where PID is the process ID number. Also, when you run multiple execs, try watching the tempspider table to see whether all of the site URLs show up in it.
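
For instance, after firing the exec() calls, something like this could show whether all of the URLs were actually queued (a sketch only; the connection details and database name are placeholders for your own PhpDig settings):

PHP:
--------------------------------------------------------------------------------

<?php
// Sketch: peek at PhpDig's tempspider table after launching the spiders.
// Connection details and database name below are placeholders.
$link = mysql_connect("localhost", "db_user", "db_pass");
mysql_select_db("phpdig_db", $link);

$result = mysql_query("SELECT * FROM tempspider", $link);
echo "rows queued: " . mysql_num_rows($result) . "\n";
while ($row = mysql_fetch_assoc($result)) {
    print_r($row); // each queued URL should appear here
}

// To stop a runaway spider, find its PID with `ps aux | grep spider.php`
// from the shell and then run kill -9 PID on that process.
?>

--------------------------------------------------------------------------------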

bloodjelly
03-23-2004, 10:09 AM
I've sent the question to Globalservers to see what they can make of it. Redirecting the errors last night gave me this message when I tried running the exec command twice with two different URLs:

"/usr/bin/php: error while loading shared libraries: libstdc++-libc6.2-2.so.3: cannot open shared object file: Error 23"

After reloading the page a few more times, I got this:

"PHP Warning: Unable to load dynamic library '/usr/lib/php4/ldap.so' - libldap.so.2: cannot open shared object file: Too many open files in system in Unknown on line 0"

Still, I can only get one site to spider at a time. Maybe it has something to do with the redirection?
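
For anyone hitting the same errors, the redirection mentioned above was roughly this (a sketch; the log location is arbitrary): send the CLI's output and errors to a file instead of /dev/null so messages like the ones quoted actually get recorded:

PHP:
--------------------------------------------------------------------------------

<?php
// Sketch: log the spider's stdout and stderr instead of discarding them,
// so shared-library errors like the ones above end up in the log file.
$site = "http://www.mysite.com/";
exec("/usr/bin/php -f /path/to/spider.php " . escapeshellarg($site)
     . " >> /tmp/spider.log 2>&1 &");
?>

--------------------------------------------------------------------------------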

Charter
03-24-2004, 09:13 AM
Hi. There was another "too many open files" error reported in this (http://www.phpdig.net/showthread.php?threadid=673) thread. Maybe your errors have the same cause. The wrapper in this (http://www.phpdig.net/showthread.php?threadid=662) thread may also be of interest.

bloodjelly
04-08-2004, 04:43 PM
Thanks for the help, Charter. The problem was with Globalservers, but as they tend to do, they fixed it without telling me what it was. I might try that wrapper anyway, though. Thanks again :D