Linux-Noob Forums

Full Version: simple automated backup of apache (your websites)

This will create a dated, compressed tar backup of everything in /usr/local/apache once a week, placed in the /backup/ directory.

This can certainly be useful considering the number of worms floating around at the moment (like Santy, which overwrites all php and html files with rubbish).

** this assumes you have your websites in /usr/local/apache as per the apache/mysql/php compilation howto in this forum; obviously, if they are not in that path, edit the path to point to your websites/code **

(thanks Kobras for advising me here)

 

Simply do as follows:

Code:
vi /etc/cron.weekly/apache.cron

ok, now paste the following into this blank file:

Code:
#!/bin/sh
# create a dated, bzip2-compressed backup of the apache tree in /backup
/bin/tar -jcf /backup/apache_backup_`date +%b_%d_%Y`.tar.bz2 /usr/local/apache/ >/dev/null 2>&1

save the file and now make it executable:

Code:
chmod +x /etc/cron.weekly/apache.cron
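
if you're curious when the weekly jobs actually run, most distros of this era drive /etc/cron.weekly via a run-parts entry in /etc/crontab, which you can check with:

Code:
grep cron.weekly /etc/crontab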

ok, create the /backup directory:

Code:
mkdir /backup

and test the script (to see that it works):

Code:
/etc/cron.weekly/apache.cron

if all goes well you should see a file like this in your /backup directory:

Quote:[root@www root]# ls -alsrxh /backup
total 140M
140M apache_backup_Dec_29_2004.tar.bz2  4.0K ..  4.0K .
 

ok that's it! all done.

now you may want to write another script to 'auto delete' backups older than two weeks, otherwise your hard disc may soon fill up with them.
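
for example, something like this find one-liner would do it (an untested sketch; -mtime +14 matches files last modified more than 14 days ago, adjust to taste):

Code:
#!/bin/sh
# remove apache backups older than two weeks
find /backup -name 'apache_backup_*.tar.bz2' -mtime +14 -exec rm -f {} \;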

 

cheers

 

anyweb


nice anyweb :)

i'm going to write an "auto delete" tool but i need your help.

how should this tool delete the backups?

once every two weeks?

or?


the tool should check /backup, and if three or more files exist, delete the oldest ones so that only two remain

 

for example

 

let's imagine the /backup dir has the following files

 

Quote:[root@www root]# ls -alsrxh /backup
total 144M
144M apache_backup_Dec_29_2004.tar.bz2
144M apache_backup_Dec_22_2004.tar.bz2
144M apache_backup_Dec_15_2004.tar.bz2
  4.0K ..  4.0K .
 

then it should 'see' that there are three files, work out that apache_backup_Dec_15_2004.tar.bz2 is the oldest, and automatically delete that one

This means you should always have at least TWO weeks' worth of backups for your websites.

 

ok?

 

cheers

 

anyweb


Try this:

Code:
#!/bin/sh

# How many files would you like to keep?
KEEP=2
# Where are the files stored?
BACKUPDIR=/backup


# DO NOT CHANGE ANYTHING BELOW
if [ `ls -1 $BACKUPDIR|wc -l` -gt $KEEP ]; then
       i=1
       # ls -1t sorts newest first, so files beyond the first $KEEP are the oldest
       for each in `ls -1t $BACKUPDIR`; do
               if [ $i -gt $KEEP ]; then
                       rm -f $BACKUPDIR/$each
               fi
               i=`expr $i + 1`
       done
fi




 

Note:

This keeps the latest $KEEP files by modification time (that is what ls -1t sorts on), not by filename! This should suit your needs anyway.
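
you can see the difference between the two orderings for yourself (just an illustration):

Code:
ls -1t /backup   # newest first, by modification time (what the script uses)
ls -1 /backup    # plain alphabetical order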


thanks z0ny

 

my script now looks like this

Code:
#!/bin/sh
# create the dated backup
/bin/tar -jcf /backup/apache_backup_`date +%b_%d_%Y`.tar.bz2 /usr/local/apache/ >/dev/null 2>&1

# How many files would you like to keep?
KEEP=2
# Where are the files stored?
BACKUPDIR=/backup


# DO NOT CHANGE ANYTHING BELOW
if [ `ls -1 $BACKUPDIR|wc -l` -gt $KEEP ]; then
      i=1
      # ls -1t sorts newest first, so files beyond the first $KEEP are the oldest
      for each in `ls -1t $BACKUPDIR`; do
              if [ $i -gt $KEEP ]; then
                      rm -f $BACKUPDIR/$each
              fi
              i=`expr $i + 1`
      done
fi

cheers

 

anyweb


the mailing script :)

 

Code:
#!/bin/sh

# where the temporary mail body lives
tmp=/tmp

# name of today's backup file, its size, and the date
file=apache_backup_`date +%b_%d_%Y`.tar.bz2
space=`du -sh /backup/$file | awk '{print $1}'`
date=`date`

echo "Hi anyweb, i am the backup tool from linux-noob.com, i got a new backup for you" >$tmp/apachebackup.txt
echo "the new file is $file and is $space in size" >>$tmp/apachebackup.txt
echo "the date is $date" >>$tmp/apachebackup.txt

mail -s 'new backup' anyweb@linux-noob.com <$tmp/apachebackup.txt

rm -f $tmp/apachebackup.txt

echo "Enjoy"

and of course the best backup is placing that tar file on another computer right?? :)

Guest

if you use Debian, take a look at the rsnapshot package. it's a marvellous backup solution.
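
for reference, a minimal rsnapshot setup might look something like this (a sketch only: the paths and retention count are assumptions, and rsnapshot.conf fields must be separated by TABs, so start from the example config the package ships with):

Code:
# /etc/rsnapshot.conf (sketch)
snapshot_root   /backup/snapshots/
interval        weekly  2
backup          /usr/local/apache/      localhost/

# then run it once a week from cron:
rsnapshot weekly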

yes i want to ftp this backup file to another computer on the network once a week (after the latest file is created)

 

let's say that:-

 

the ftp address is 100.0.0.2

the port is 21

the user is ftpuser

the password is ftppassword

 

how can i get the backup script to automagically ftp the 'latest' or just created backup file to the ftp server given the info above ?

 

i look forward to the answer

 

cheers

anyweb
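
one way to script it is the classic non-interactive ftp here-document (an untested sketch; note the password ends up in the script in plain text):

Code:
#!/bin/sh
# upload the newest backup to the ftp server (sketch; plain-text password!)
FILE=`ls -1t /backup | head -1`
ftp -n 100.0.0.2 <<EOF
user ftpuser ftppassword
binary
lcd /backup
put $FILE
quit
EOF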

On the LAN I would create a passwordless SSH login (key secured) and "upload" it via SCP.
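
a sketch of that approach (the remote user and destination directory here are assumptions):

Code:
# one-time setup: generate a key (press enter for an empty passphrase)
# and copy the public key to the remote machine
ssh-keygen -t rsa
ssh-copy-id ftpuser@100.0.0.2

# the backup script can then push the newest file without a password
scp /backup/`ls -1t /backup | head -1` ftpuser@100.0.0.2:/backup/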