Newsgroups: php.internals
Date: Mon, 23 Jan 2006 03:10:50 +0000
To: internals@lists.php.net
Message-ID: <20060123031050.GD23755@arvo.suso.org>
Subject: Fwd: [PHP] proc_open and buffer limit?
From: mark@suso.org (Mark Krenz)

  I asked this last night on the general mailing list and have also asked around about it. Nobody seems to know. I normally wouldn't ask a support question on a developers' mailing list, but since nobody else seems to know, I thought that perhaps a developer would.

  Why would I be running into this STDIN data size limit when using proc_open? I've tried setting the limits for Apache to unlimited in /etc/security/limits.conf just to see if it's a system limit.

----- Forwarded message from Mark Krenz -----

Date: Sun, 22 Jan 2006 00:25:33 +0000
From: Mark Krenz
To: php-general@lists.php.net
Subject: [PHP] proc_open and buffer limit?

  I'm using PHP 5.1.1 on Apache 2.0.54 on Gentoo Linux.
  I've been trying to write a program that passes data to another program using proc_open; however, when I do, it only passes the first 65536 bytes of the stream and cuts off the rest. To make sure it's not the program I'm sending to, I tried using /bin/cat instead and got the same problem. Below I've included the code I'm using, which is for the most part from the proc_open documentation page. For testing, I'm reading from a word dictionary which is over 2MB in size. Is there something I'm missing about using proc_open?

-------------------------------------------------------------------------

$program = "/bin/cat";
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "/tmp/error-output.txt", "a")
);
$cwd = '/var/www';
$env = array('HOME' => '/var/www');

$process = proc_open($program, $descriptorspec, $pipes, $cwd, $env);

if (is_resource($process)) {
    stream_set_blocking($pipes[0], FALSE);
    stream_set_blocking($pipes[1], FALSE);

    $input = '';
    $handle = fopen("/usr/share/dict/words", "r");
    while (!feof($handle)) {
        $input .= fread($handle, 8192);
    }
    fclose($handle);

    fwrite($pipes[0], $input);
    fclose($pipes[0]);

    $output = '';
    while (!feof($pipes[1])) {
        $output .= fgets($pipes[1], 8192);
    }
    fclose($pipes[1]);

    print "$output\n";

    $return_value = proc_close($process);
    echo "command returned $return_value\n";
}

-------------------------------------------------------------------------

-- 
Mark S. Krenz
IT Director
Suso Technology Services, Inc.
http://suso.org/

----- End forwarded message -----
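[Editor's note: the 65536-byte cutoff matches the default Linux pipe buffer size. With $pipes[0] set non-blocking, fwrite() can return after writing only as much as fits in the pipe, and the code above discards its return value, so the rest of the input is silently dropped. A minimal sketch of one workaround, interleaving chunked writes with reads and advancing by fwrite()'s return value; the 256 KiB str_repeat payload is just a stand-in for the dictionary file:]

```php
<?php
// Sketch: push a payload larger than the 64 KiB pipe buffer through
// /bin/cat by writing in chunks and draining the child's stdout as we
// go, so neither pipe fills up and stalls the pipeline.
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "/tmp/error-output.txt", "a")
);
$process = proc_open("/bin/cat", $descriptorspec, $pipes);

$input  = str_repeat("0123456789abcdef", 16384); // 256 KiB test payload
$output = '';

if (is_resource($process)) {
    // Only the read side is non-blocking, so draining never stalls us.
    stream_set_blocking($pipes[1], false);

    $pos = 0;
    $len = strlen($input);
    while ($pos < $len) {
        // fwrite() may accept fewer bytes than requested; advance by
        // its return value instead of assuming the whole chunk went out.
        $written = fwrite($pipes[0], substr($input, $pos, 8192));
        if ($written === false) {
            break;
        }
        $pos += $written;

        // Drain whatever the child has produced so far so its stdout
        // pipe never fills up and blocks it from reading more input.
        while (($chunk = fread($pipes[1], 8192)) !== false && $chunk !== '') {
            $output .= $chunk;
        }
    }
    fclose($pipes[0]); // EOF for the child

    // Collect the remaining output in blocking mode until EOF.
    stream_set_blocking($pipes[1], true);
    while (!feof($pipes[1])) {
        $output .= fread($pipes[1], 8192);
    }
    fclose($pipes[1]);
    proc_close($process);
}

echo strlen($output), "\n"; // should equal strlen($input)
```

The same interleaving would apply with the dictionary file: read a chunk, write a chunk, drain, rather than slurping the whole file into $input first.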