I've been using PHP for linux command-line applications. Some are quite
large. I've built the code to combine the mainline plus everything it
calls into a single file to avoid portability issues with include
libraries. I've built the code to compress the resulting file using
gzdeflate after optionally stripping comments and excess whitespace.
As a result, I have the uncompressed code in a variable after using
gzinflate. Executing it cleanly has become an issue, and I'm looking
for a solution. I see the following possible solutions:
1. Build the mainline as a function, write the decompressed code to a
temp file, include the temp file, delete the temp file, then invoke the
mainline function. This works reasonably well, with the exception that
magic constants like __FILE__ are set while the include file is parsed.
The result is that, for example, __FILE__ contains the name of the temp
file, which causes behavior that differs from the original. I know of no
way to change __FILE__ once it has been set, and if the application
relaunches itself using __FILE__ it ends up trying to invoke the
now-missing temp file.
2. Build the mainline as it was originally coded, write the decompressed
code to a temp file, include the temp file. The problem with this
approach is that if the application issues an exit() the temp file will
be left lying around. Additional issues may exist, but this one is imo
a show-stopper.
3. Pass the decompressed code to eval(). This approach is rather a joke
due to the well-intentioned efforts of whoever chose to consider eval()
a security exposure and modified echo to tell the user it is eval'ed code.
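Mechanically, (3) is a one-liner ($code here is just an illustrative name
for the variable holding the gzinflate() output; the leading '?>' drops out
of PHP mode so the <?php tag at the start of the decompressed code parses):

    eval('?>' . $code);

but the behavior described above is what makes it a non-starter for me.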
Approach (1) seems the most promising but using it will require that the
target applications be specially coded with regard to __FILE__ and
possibly other magic constants. I really don't want to place special
requirements on the coding of the target application.
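For reference, the skeleton of (1) looks roughly like this (names are only
illustrative; $code holds the gzinflate() output, which defines a mainline
function I'll call app_main() here):

    <?php
    $tmp = tempnam(sys_get_temp_dir(), 'app');
    file_put_contents($tmp, $code);   // $code begins with <?php and defines app_main()

    include $tmp;                     // __FILE__ inside the included code is now $tmp
    unlink($tmp);                     // parsing is done, the temp file can go

    app_main($argv);                  // any later use of __FILE__ still names the
                                      // already-deleted temp file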
Suggestions would be appreciated, as I don't want to have to modify the
interpreter at this point. Thanks in advance.
Hi,
You may use register_shutdown_function()
to clean things up after exit()
http://jp2.php.net/manual/en/function.register-shutdown-function.php
So simply extract the files to a tmp dir and delete everything after execution.
I guess this is what you need.
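Something like this (untested sketch; the temp file naming is just an
example, $code is the decompressed source):

    <?php
    $tmp = tempnam(sys_get_temp_dir(), 'app');
    file_put_contents($tmp, $code);

    // runs when the script ends, including after exit()
    register_shutdown_function(function () use ($tmp) {
        if (is_file($tmp)) {
            unlink($tmp);
        }
    });

    include $tmp;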
BTW, I don't think eval() is evil as long as programmers know what they
are doing.
Regards,
--
Yasuo Ohgaki
yohgaki@ohgaki.net
Hi,
You may use
register_shutdown_function()
to clean things up after exit()
http://jp2.php.net/manual/en/function.register-shutdown-function.php
So simply extract the files to a tmp dir and delete everything after execution.
I guess this is what you need.
I would expect that if the target application also registers a shutdown
function there could be problems, but its potential nesting is something
to look into.
I suppose it would also be possible to fork the application and delete
the temp file once the "child" fork had completed, which /might/ be less
prone to being overridden by an arbitrary application.
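Roughly what I have in mind (just a sketch, assuming the pcntl extension
is available; names are illustrative):

    <?php
    $tmp = tempnam(sys_get_temp_dir(), 'app');
    file_put_contents($tmp, $code);

    $pid = pcntl_fork();
    if ($pid === -1) {
        exit(1);                      // fork failed
    }
    if ($pid === 0) {
        include $tmp;                 // child: run the application;
        exit(0);                      // an exit() in it only ends the child
    }

    pcntl_waitpid($pid, $status);     // parent: wait for the child, then
    unlink($tmp);                     // clean up no matter how it exited
    exit(pcntl_wexitstatus($status));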
BTW, I don't think eval() is evil as long as programmers know what they
are doing.
Neither do I, but that in itself doesn't change the way it works. <g>
Thanks for your suggestion.
crankypuss wrote:
... I don't want to have to modify the interpreter at this point...
Sorry, but this list is for just this purpose, so your post doesn't belong
on the DL.
Regards Terry
PS. read up on PHAR extensions and use of streams. There's nothing
stopping you specifying a phar or even a compress.zlib stream on the
commandline.
crankypuss wrote:
... I don't want to have to modify the interpreter at this point...
Sorry, but this list is for just this purpose, so your post doesn't belong
on the DL.
Everyone loves a list-nanny <g> Not to worry too much, I expect to get
around to it shortly, or maybe just say fork-it, or shamelessly grab
some of the code as part of a base.
Regards Terry
PS. read up on PHAR extensions and use of streams. There's nothing
stopping you specifying a phar or even a compress.zlib stream on the
commandline.
I see from a quick google that this might be at least a partial answer,
but I'll need to learn more about it; the objective is to reduce
external dependencies rather than to add one.
I've been using PHP for linux command-line applications. Some are quite large. I've built the code to combine the mainline plus everything it calls into a single file to avoid portability issues with include libraries. I've built the code to compress the resulting file using gzdeflate after optionally stripping comments and excess whitespace.
didn't you just reinvent the PHAR?
http://docs.php.net/manual/en/intro.phar.php
Phar archives are best characterized as a convenient way to group several files into a single file.
As such, a phar archive provides a way to distribute a complete PHP application in a single file
and run it from that file without the need to extract it to disk. Additionally, phar archives can
be executed by PHP as easily as any other file, both on the commandline and from a web server.
The Phar extension is built into PHP as of PHP version 5.3.0 so you don't need to explicitly install it.
It's already present in PHP
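Creating one is only a few lines (file and entry-point names here are just
examples; phar.readonly has to be off while building):

    <?php
    // build.php -- run as: php -d phar.readonly=0 build.php
    $phar = new Phar('myapp.phar');
    $phar->buildFromDirectory(__DIR__ . '/src');           // everything the app needs
    $phar->setStub($phar->createDefaultStub('main.php'));  // entry script inside the archive
    $phar->compressFiles(Phar::GZ);                        // gz-compress the contents

after which the whole application runs with: php myapp.phar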
--
Alexey Zakhlestin
CTO at Grids.by/you
https://github.com/indeyets
PGP key: http://indeyets.ru/alexey.zakhlestin.pgp.asc
I've been using PHP for linux command-line applications. Some are quite large. I've built the code to combine the mainline plus everything it calls into a single file to avoid portability issues with include libraries. I've built the code to compress the resulting file using gzdeflate after optionally stripping comments and excess whitespace.
didn't you just reinvent the PHAR?
I think not, though on the surface there are similarities.
Rather than conveniently bundling an entire directory full of code that
is mostly unused, I'm including only the code that is actually used. The
program that does this also produces a call-tree, lists references, and
so forth; collecting all the actually-used code into a single file is
only a small part of the whole.
And strange as it may seem, this whole project is only an interim
measure on the way to something that might be quite different.
Thanks for mentioning Phar though, as Terry mentioned it might be useful
in compressing/executing the single-file application, though I haven't
yet figured it out to that extent.
check out http://us1.php.net/phar and
http://www.php.net/manual/en/wrappers.phar.php
currently this is the preferred method for shipping an application in a
single file, as it allows you to work with the files and directories in the
phar via most php functions as if they were normal files/directories on
the disk, so stuff like __FILE__ would point to a valid path.
for (2), you could use shutdown functions, but with phar:// you wouldn't need
to extract the files, hence no need for the cleanup.
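e.g. inside code running from a phar, the ordinary filesystem functions
keep working on phar:// paths (paths here are only examples):

    include 'phar://myapp.phar/lib/helpers.php';
    $tpl   = file_get_contents('phar://myapp.phar/templates/default.tpl');
    $files = scandir('phar://myapp.phar/templates');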
ps: beware, if you try to pass these paths to external libs/application,
they won't be able to work with the phar:// files of course.
--
Ferenc Kovács
@Tyr43l - http://tyrael.hu
check out http://us1.php.net/phar and
http://www.php.net/manual/en/wrappers.phar.php
currently this is the preferred method
Preferred by whom, thank you very much? Who is it that has set
himself/itself up as The Authority on this subject?
for shipping an application in a
single file, as it allows you to work with the files and directories in the
phar via most php functions as if they were normal files/directories on
the disk, so stuff like __FILE__ would point to a valid path.
The objective is not to perpetuate the idiocy of includes but to obviate it.
for (2), you could use shutdown functions, but with phar:// you wouldn't need
to extract the files, hence no need for the cleanup.
ps: beware, if you try to pass these paths to external libs/application,
they won't be able to work with the phar:// files of course.
Thank you.
check out http://us1.php.net/phar and
http://www.php.net/manual/en/wrappers.phar.php
currently this is the preferred method
Preferred by whom, thank you very much? Who is it that has set
himself/itself up as The Authority on this subject?
the pear installer is shipped as a phar file, phpunit also provides phar
releases, composer is also provided as a phar file, phpdocumentor is also
available as a phar file, etc.
I'm only talking about my personal experience, but it seems to me that
currently phar is the preferred method for shipping single-file php
applications out there.
for shipping an application in a
single file, as it allows you to work with the files and directories in the
phar via most php functions as if they were normal files/directories on
the disk, so stuff like __FILE__ would point to a valid path.
The objective is not to perpetuate the idiocy of includes but to obviate
it.
I fail to see how includes are an idiocy in general. Similarly, there is no
clear winner between dynamic and static linking; it depends on your
use case.
From your original comment I missed that your main reason for bundling
everything into one file was portability. Depending on your userbase, I
would say that some dependency management tool (composer comes to mind)
would serve that goal better; see:
http://www.phptherightway.com/#dependency_management
--
Ferenc Kovács
@Tyr43l - http://tyrael.hu
check out http://us1.php.net/phar and
http://www.php.net/manual/en/wrappers.phar.php
currently this is the preferred method
Preferred by whom, thank you very much? Who is it that has set
himself/itself up as The Authority on this subject?
the pear installer is shipped as a phar file, phpunit also provides phar
releases, composer is also provided as a phar file, phpdocumentor is also
available as a phar file, etc.
I'm only talking about my personal experience, but it seems to me that
currently phar is the preferred method for shipping single-file php
applications out there.
So by "preferred" what you really meant was "most widely used". Okay, I
can accept that. Of course what is widely used is often simply what is
available, rather than what one would really prefer if other options
were available.
for shipping an application in a
single file, as it allows you to work with the files and directories in the
phar via most php functions as if they were normal files/directories on
the disk, so stuff like __FILE__ would point to a valid path.
The objective is not to perpetuate the idiocy of includes but to obviate
it.
I fail to see how includes are an idiocy in general. Similarly, there is no
clear winner between dynamic and static linking; it depends on your
use case.
The world of the conceptual is a very large place, and those of us who
work in the software arena are often confused by the sheer multiplicity
of options and approaches while being simultaneously frustrated that
most of them amount to workarounds that create as many problems as they
solve. More below.
From your original comment I missed that your main reason for bundling
everything into one file was portability. Depending on your userbase, I
would say that some dependency management tool (composer comes to mind)
would serve that goal better; see:
http://www.phptherightway.com/#dependency_management
Thank you for the link. I really am not interested in a package manager
that will conveniently download even more packages and add to the
ever-increasing clutter, but thank you for the consideration.
I started this thread with what I thought was a fairly simple technical
question about PHP, under the impression that others might know of some
functionality that is even more poorly documented than the usual and
thus mostly unknown to the general user community. Terry Ellison
pointed out that this list is specifically for discussion of PHP
language modifications, and as such my initial question (and of course
all subsequent discussion) was misplaced.
As such it would probably be best not to extend the discussion
indefinitely. I will leave you with a few comments:
- If the purpose of an application is to restore the operating system
from a backup, or install an operating system onto a virgin machine, or
any of a host of other similar functions, external dependencies are
deadly because if any is imperfectly "installed" the whole falls apart
and the user has nothing.
- The concept of include files was a bad one from the start, but in
those days (mostly the 1960s and early 1970s) there were no better
alternatives and not much available to build them with. Interpreters
were toys because of hardware limitations, and includes were
compile-time artifacts which quickly became even more detrimental than
they were to begin with once conditional compilation was added to the
picture. Every time a new condition was added in an include file, the
number of regression tests required to validate the resulting
executables was multiplied. Since full regression testing is a drag on
profits, the approach of stopping when the few most common
configurations had been tested became the "preferred" approach within
most of the industry.
Perhaps that will provide some perspective to the approach that I am
taking here. Any further discussion should not appear in this mailing
list, and FFS not in my personal email; perhaps the newsgroup
"comp.lang.misc" would be a reasonable venue if anyone cares to pursue
the general topic.
Thank you all for your patience, farewell.