hi.
i run a website which i want to harden against hacking by 3rd parties.
i wrote this website back in 2002-2010, and then built apps on top of the
base code.
now i want to upgrade the entire thing to the latest css3 standards and
also include anti-hacking measures, because at one point i got kicked off
the internet by my ISP because they detected the thing had indeed been
hacked, and someone installed phishing software on my site.
i want to employ cron jobs that run regularly, to do checksum testing of
vital parts of my operating system.
ideally, i could have a script run indefinitely or every 2 seconds, as
root, from cron, to test for changes to my filesystem (well, the part that
is governed by Directory section in
/etc/apache2/sites-enabled/001-localhost.conf) and vital OS config files.
but i do wonder if this is going to wear out the SSD that the OS and
webserver files are stored on.
and i wonder if i should be writing this script as some sort of shell
script (bash? /bin/sh? i dunno (i run ubuntu 20.04)), or if i could be
using the convenient php for it.
and i would like to know if as far as exploits go, it's better to stay
(currently) on php7.4, or move my entire setup to php8.
thanks for your attention and any help you might provide me. :)
The most secure setup possible is to use a static site generator and upload
its output to a static server with no server-side parsing enabled. In
my opinion Hugo is the best of these; it's written in Go, and that's
its largest drawback for me - written in a language I'm not too familiar
with. Jigsaw is a PHP implementation of the same concept, but I haven't had
a chance to try it out. There are a lot of sites out there running WordPress
and Drupal which are so small and so infrequently updated that, frankly,
the owners could do themselves a huge favor by switching over.
If your problem scope still requires server-side scripting, you'd be better
served leaving server security to the experts. Look into AWS and
Microsoft's Azure as a start; there are also more PHP-centric providers
like Acquia or Pantheon. Owning and managing the silicon directly isn't
advised anymore and hasn't been common practice for at least a decade.
For the most recent security fixes, always run the latest version of a currently supported version of PHP:
https://www.php.net/supported-versions.php
Currently supported versions are 7.3, 7.4, and 8.0, so you should run either 7.3.26, 7.4.14, or 8.0.1.
Many Linux distributions back-port security fixes to earlier versions of PHP, so if you’ve installed PHP using a package manager, check with the maintainers to ensure your PHP version has the latest security updates.
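To expand on that, here is a quick way to check what you're actually running (this assumes PHP was installed via apt on Ubuntu; the package name is a guess for a 7.4 install - substitute your own):

```
# Show the PHP version the CLI is using
php -v

# Show the installed and candidate package versions
apt-cache policy php7.4

# Ubuntu backports security fixes, so check the package changelog
# rather than relying on the upstream version number alone
apt-get changelog php7.4 | head -n 20
```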
Cheers,
Ben
Hi Rene,
I'm not sure the PHP Internals mailing list is the best place to ask for
this kind of help.
And what you're talking about covers a lot of different things.
As to your specific questions...
Considering you are using Ubuntu 20.04, which is the current LTS version,
make sure the software patches are being installed automatically, every
night (via apt), and restart your server every now and again to make sure
it's using the latest Linux kernel.
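On 20.04 that automatic patching can be set up with the unattended-upgrades package (a sketch; these are the stock Ubuntu package and timer names):

```
# Install and enable nightly security updates
sudo apt install unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades

# Confirm the systemd timers that drive it are active
systemctl list-timers apt-daily.timer apt-daily-upgrade.timer
```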
Don't compile PHP yourself, unless you're prepared to re-compile as soon as
a new version of PHP is available.
But that's just about fixing the security issues in the OS and PHP itself,
which are (fortunately) fairly rarely the way attackers get in.
The PHP scripts (and other things you install/write) on your server are
often the main issue...
3rd party code, like frameworks, libraries, systems like WordPress, and
plugins; these need to be kept up to date as well, because they have a lot
of people looking for and exploiting issues in them (e.g. if someone finds
an issue with WordPress, it's fairly trivial to use a service like Shodan
to quickly find all websites running that bit of software, and start
exploiting them).
The scripts you have written yourself are something completely different,
and I'd suggest getting someone to check over your code, as there are many
mistakes that can be made (see the OWASP Top 10 for an introduction)... or
at the least, try to use a tool that can check for common security
issues/mistakes (there are many vulnerability scanning tools out there).
As to looking for file changes on your system, I'd suggest this isn't a
good use of your time: things change frequently, and a system that gives
you useful reports (i.e. not filled with hundreds of perfectly legitimate
changes) is difficult to get right.
That said, software has already been written to do this, for example
Tripwire. This is a basic guide that should still work:
https://computingforgeeks.com/how-to-install-and-configure-tripwire-on-ubuntu-18-04/
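If you want something lighter than Tripwire, the cron-driven checksum idea from your mail can be sketched in a few lines of POSIX shell. The paths here are assumptions - point WATCH_DIR at the directory your Apache Directory section covers - and running "verify" every few minutes from cron is plenty (checksumming is almost all reads, which don't meaningfully wear an SSD, but every 2 seconds is still wasted work):

```shell
#!/bin/sh
# Minimal integrity-check sketch. WATCH_DIR and BASELINE are assumptions;
# adjust them to your own layout. Build the baseline once (as root, after
# a known-good deploy), then verify from cron.
WATCH_DIR="${WATCH_DIR:-/var/www/html}"
BASELINE="${BASELINE:-/var/lib/integrity/baseline.sha256}"

make_baseline() {
    mkdir -p "$(dirname "$BASELINE")"
    # Record a checksum for every file under WATCH_DIR
    find "$WATCH_DIR" -type f -print0 | xargs -0 sha256sum > "$BASELINE"
}

verify_baseline() {
    # Prints the name of any file whose checksum changed and exits
    # non-zero; --quiet suppresses the per-file "OK" lines
    sha256sum --quiet -c "$BASELINE"
}
```

Store the baseline (or a signed copy of it) somewhere the webserver account cannot write, otherwise an attacker can simply regenerate it after tampering.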
One simple technique for mitigating the risk of malicious files is user
permissions. If your webserver is running under the 'www-data' account,
make sure that account cannot create/edit/delete any files on your system
(at the very least, not in the DocumentRoot). Yes, that can be tricky if
people are legitimately uploading files (e.g. images), but the more
restrictions you can apply, the better (because you, like everyone, will
make mistakes).
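As a sketch of that idea (WEB_ROOT and the modes are assumptions; this presumes the files are owned by a separate deploy user, with www-data only needing group read access):

```shell
#!/bin/sh
# Lock down a web root so the webserver account can read but not write it.
# WEB_ROOT is an assumption; point it at your Apache DocumentRoot.
WEB_ROOT="${WEB_ROOT:-/var/www/html}"

lock_down() {
    # Directories: owner full access, group may enter and list, others nothing
    find "$WEB_ROOT" -type d -exec chmod 750 {} +
    # Files: owner read/write, group read-only, others nothing
    find "$WEB_ROOT" -type f -exec chmod 640 {} +
}
```

Run it as the owning user after each deploy; if an uploads directory genuinely must be writable, open up only that one subtree rather than the whole web root.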
And I hate to say this, but these are just the basics; server admin is a
big subject, and so is the whole area of security... and while I'd
encourage people to get involved (as it's good fun), it's often much easier
to get a 3rd party to host your website, or (if you can) use things like a
static site generator (fewer things to go wrong).
Hope that helps,
Craig