This will create a dated, compressed tar backup of everything in /usr/local/apache once a week and place it in the /backup/ directory.
This may well be useful considering the number of worms floating around at the moment (like Santy, which overwrites all php and html files with rubbish).
** this assumes you have your websites in /usr/local/apache as per the apache/mysql/php compilation howto in this forum; obviously, if they are not in that path, edit the path below to point to your websites/code **
(thanks Kobras for advising me here)
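For reference, anything dropped into /etc/cron.weekly gets run once a week by cron via run-parts. On Red Hat style systems the schedule lives in /etc/crontab and typically looks something like this (the exact times vary by distribution, so treat this as an illustration only):
Code:
# typical run-parts entries in a Red Hat style /etc/crontab
01 * * * * root run-parts /etc/cron.hourly
02 4 * * * root run-parts /etc/cron.daily
22 4 * * 0 root run-parts /etc/cron.weekly
42 4 1 * * root run-parts /etc/cron.monthly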
Simply do as follows:
Code:
vi /etc/cron.weekly/apache.cron
ok, now paste the following into this blank file:
Code:
#!/bin/sh
# weekly backup of /usr/local/apache, compressed with bzip2
# produces a dated file such as /backup/apache_backup_Dec_29_2004.tar.bz2
/bin/tar -jcf /backup/apache_backup_`date +%b_%d_%Y`.tar.bz2 /usr/local/apache/ > /dev/null 2>&1
save the file and make it executable
Code:
chmod +x /etc/cron.weekly/apache.cron
ok, create the /backup directory
Code:
mkdir /backup
and test the script (to see that it works)
Code:
/etc/cron.weekly/apache.cron
if all goes well you should see a file like this in your /backup directory
Quote:
[root@www root]# ls -alsrxh /backup
total 140M
140M apache_backup_Dec_29_2004.tar.bz2  4.0K ..  4.0K .
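If you want to double check what actually went into the archive (the filename here is just the example from above), you can list its contents without extracting anything:
Code:
# list the contents of the compressed archive without extracting it
tar -jtf /backup/apache_backup_Dec_29_2004.tar.bz2 | less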
ok that's it! all done.
now you may want to write another script to auto-delete backups older than two weeks, otherwise your hard disc may soon fill up with backups.
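As a rough sketch of that cleanup script (assuming your backups live in /backup and follow the apache_backup_* naming used above), something like this dropped into /etc/cron.weekly would do it:
Code:
#!/bin/sh
# remove apache backups in /backup that are more than 14 days old
/usr/bin/find /backup -name 'apache_backup_*.tar.bz2' -type f -mtime +14 -exec rm -f {} \;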
cheers
anyweb