I sent this to the php-general mailing list but nobody seemed to be
able to help me there.
I run a shared webserver with a few hundred vhost containers in
Apache's config. Recently I added enough vhosts to cause a problem with
the curl functions in PHP. Basically, when PHP tries to make the curl
call to fetch a webpage, the HTTP request to my webserver just hangs.
If I remove a few vhost containers, the problem stops; if I add them
back, the problem starts again.
I've tried turning up several ulimit values in the apache init script,
but none of those fixed the problem. I know that it's not a problem with
the number of file handles that Apache can open, because I also recently
had to add
ulimit -n 32768
to the apache init script in order to allow more logfiles to be opened.
Anyone have any ideas as to what might be causing the curl problem?
This is running on:
Apache 2.0.51
PHP 4.3.10
Linux 2.6.13
/proc/sys/fs/file-max is set to 524288
lsof | wc -l returns 33497
Thanks,
Mark
P.S. I am well aware that I need to upgrade my software, so there is no
need to tell me to do that. I'm working on it.
--
Mark S. Krenz
IT Director
Suso Technology Services, Inc.
http://suso.org/
As you intend to upgrade, would it be worth waiting until after the upgrade?
Apache 2.0.59 (not too far behind)
PHP 4.4.4 (maybe a little too far behind)
Linux 2.6.18 (again not too far behind)
Now, there may well be some SIGNIFICANT changes in there.
You say that adding / removing a few vhosts makes a difference. Is the
file that you are editing (or that ends up being edited, if you are
using a web-based tool) significant in size?
Say 64K or some other boundary? It may be that the filesize has maxed
out (some developer somewhere thought no one would EVER need more than
1MB of vhost containers!).
--
Richard Quadling
Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498&r=213474731
"Standing on the shoulders of some very clever giants!"
I really can't upgrade right now. Basically, I'd be risking breaking
the machine, because I'd have to upgrade to a newer supported version of
Fedora that might have some incompatibilities with some custom packages
I've set up, and I'd end up having way too much downtime, even if it was
at 3 in the morning. It's easier to just set up a new machine and
migrate people to it, but I'm not at that point yet. So I'm trying to
buy some time and figure out why I can't add any more vhosts.
When you say filesize, do you mean the size of the vhost.conf file that
holds the vhost config? If so, that doesn't seem to be the case. The
vhost.conf file is currently 198K in size and has 315 vhosts. It stops
working if I have 319 or 320 vhosts.
I'm going to try setting this up on a newer machine with newer
versions and see if I run into the same problem with the curl functions
under PHP.
If this helps, this is the curl code that I'm using to test whether
curl works or not.
<?php
// Fetch a page with curl and write the response body straight to a file.
$ch = curl_init("http://www.cnn.com/");
$fp = fopen("example_homepage.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);   // write the body to $fp
curl_setopt($ch, CURLOPT_HEADER, 0);   // don't include the HTTP headers
curl_exec($ch);
curl_close($ch);
fclose($fp);
print "Well, it seemed to work because this is the last function call<BR>\n";
?>
When it fails to work due to having too many vhosts turned on, nothing
is written to the file and my browser just sits there waiting for the
page to complete. Interestingly enough, things like max_execution_time
don't seem to stop the request after the allotted time. It can sit
there for 10+ minutes and never stop.
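In case it helps narrow things down, here is a variant of the test with
libcurl's verbose output and curl's error reporting turned on. I'm
assuming CURLOPT_VERBOSE, CURLOPT_STDERR, curl_errno() and curl_error()
are all available in this PHP version; treat it as a sketch rather than
something I've already run on the affected box.
<?php
// Same test as above, but log libcurl's progress so we can see whether
// it's the DNS lookup, the connect or the transfer that hangs.
$ch = curl_init("http://www.cnn.com/");
$fp = fopen("example_homepage.txt", "w");
$log = fopen("curl_debug.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_VERBOSE, 1);     // chatty libcurl output...
curl_setopt($ch, CURLOPT_STDERR, $log);   // ...written to curl_debug.txt
if (curl_exec($ch) === false) {
    print "curl failed: (" . curl_errno($ch) . ") " . curl_error($ch) . "<BR>\n";
}
curl_close($ch);
fclose($fp);
fclose($log);
?>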
Mark
--
Mark S. Krenz
IT Director
Suso Technology Services, Inc.
http://suso.org/
To follow up on this, it might be a problem that was fixed in a more
recent version of PHP, or at least in the curl code.
I ran a test on a recent Gentoo Linux machine running Apache 2.0.54,
PHP 5.1.4 and Linux 2.6.15 with 10,000 vhosts in one vhost.conf file,
and the curl test ran fine. A little slow, but it works.
So I'd still appreciate any insight into how I can fix the problem with
my current versions, because that's where the problem is.
From looking through the last few years of bug reports related to curl,
I haven't found anything related to what I am experiencing.
Mark
--
Mark S. Krenz
IT Director
Suso Technology Services, Inc.
http://suso.org/
If you have compiled curl as a shared module, you could try compiling
in a more current version. It may not compile as cleanly and may need
some source-line fixes, but it might be worth a try.
Can you rule out the curl library itself as the source of the problem?
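For instance, something along these lines run from the command line
would at least show which libcurl your PHP is linked against and whether
a fetch outside of Apache behaves any differently. curl_version() and
CURLOPT_RETURNTRANSFER should both exist in your PHP version, but treat
this as an untested sketch.
<?php
// Run with the CLI/CGI binary (e.g. php -q curltest.php) rather than
// through Apache, to separate "libcurl is broken" from "something in
// the Apache/vhost setup breaks it".
print_r(curl_version());   // may be a string or an array, depending on PHP version

$ch = curl_init("http://www.cnn.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);   // return the body instead of printing it
$page = curl_exec($ch);
if ($page === false) {
    echo "curl failed: " . curl_error($ch) . "\n";
} else {
    echo "fetched " . strlen($page) . " bytes\n";
}
curl_close($ch);
?>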
HTH
<?php
$ch = curl_init("http://www.cnn.com/");
$fp = fopen("example_homepage.txt", "w");
In the realm of Voodoo Programming, but worth a shot nonetheless...
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.cnn.com');
I vaguely recall a problem back in the day where under some
circumstances the arg to curl_init was messing me up. This could
EASILY have been that I had a typo in my own arg, of course. It's a
Swiss-cheese memory...
When it fails to work due to having too many vhosts turned on, nothing
is written to the file and my browser just sits there waiting for the
page to complete. Interestingly enough, things like max_execution_time
don't seem to stop the request after the allotted time. It can sit
there for 10+ minutes and never stop.
No surprise.
PHP only "counts" CPU time that PHP itself is spending, not CPU time
that curl or MySQL or exec() or extension xyz is spending.
You should be able to set a curl timeout though, so you can at least
kill off this script when it is taking too long...
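Something like this should do it. CURLOPT_CONNECTTIMEOUT and
CURLOPT_TIMEOUT are standard curl options, though I haven't tested this
exact snippet against your setup, so treat it as a sketch.
<?php
$ch = curl_init("http://www.cnn.com/");
$fp = fopen("example_homepage.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);  // give up if no connection within 10 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 30);         // give up on the whole transfer after 30 seconds
if (!curl_exec($ch)) {
    print "curl timed out or failed: " . curl_error($ch) . "<BR>\n";
}
curl_close($ch);
fclose($fp);
?>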
Any followup on this bit should move back to PHP-General, I should think.
--
Like Music?
http://l-i-e.com/artists.htm