anyweb

simple automated backup of apache (your websites)


This will create a dated, compressed tar backup of everything in /usr/local/apache once a week, placing it in the /backup/ directory.

 

This may certainly be useful considering the number of worms floating around at the moment (like Santy, which overwrites all php and html files with rubbish).

 

** this assumes you have your websites in /usr/local/apache as per the apache/mysql/php compilation howto in this forum; obviously, if they are not in that path, edit the path to point to your websites/code **

 

(thanks Kobras for advising me here)

 

Simply do as follows:

 

 

vi /etc/cron.weekly/apache.cron

 

ok, now paste the following into this blank file:

 

#!/bin/sh
/bin/tar -jcf /backup/apache_backup_`date | awk '{print $2}'`_`date | awk '{print $3}'`_`date | awk '{print $6}'`.tar.bz2 /usr/local/apache/ >/dev/null 2>&1

 

save the file and now make it executable

 

 

chmod +x /etc/cron.weekly/apache.cron

 

ok, create the /backup directory

 

mkdir /backup

 

and test the script (to see that it works)

 

/etc/cron.weekly/apache.cron

 

if all goes well you should see a file like this in your /backup directory

 

 

 

[root@www root]# ls -alsrxh /backup

total 140M

140M apache_backup_Dec_29_2004.tar.bz2  4.0K ..  4.0K .

 

ok, that's it! all done,

 

now you may want to write another script to 'auto delete' any file older than two weeks; otherwise your hard disc may soon fill up with backups.
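for example, a minimal sketch of such a cleanup (my own suggestion, assuming the /backup path and the .tar.bz2 naming used above; -mtime +14 matches files last modified more than 14 days ago):

find /backup -name 'apache_backup_*.tar.bz2' -mtime +14 -exec rm -f {} \;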

 

cheers

 

anyweb

Edited by anyweb

nice anyweb :)

i'm going to write some "auto delete" tool, but i need your help.

how should this tool delete the backups?

once every two weeks?

or?


the tool should check /backup, and if three or more files exist, delete the oldest ones so that only two remain

 

for example

 

let's imagine the /backup dir has the following files

 

[root@www root]# ls -alsrxh /backup

total 144M

144M apache_backup_Dec_29_2004.tar.bz2

144M apache_backup_Dec_22_2004.tar.bz2

144M apache_backup_Dec_15_2004.tar.bz2

  4.0K ..  4.0K .

 

then it should 'see' that there are three files and that apache_backup_Dec_15_2004.tar.bz2 is the oldest; it should then automatically delete apache_backup_Dec_15_2004.tar.bz2

 

 

This means that you should always have at least TWO weeks' worth of backups for your websites.

 

ok ?

 

cheers

 

anyweb

Edited by anyweb


Try this:

 

#!/bin/sh

# How many files would you like to keep?
KEEP=2
# Where are the files stored?
BACKUPDIR=./backup


# DO NOT CHANGE ANYTHING BELOW
if [ `ls -1 $BACKUPDIR|wc -l` -gt $KEEP ]; then
       i=1
       for each in `ls -1t $BACKUPDIR`; do
               if [ $i -gt $KEEP ]; then
                       rm -f $BACKUPDIR/$each
               fi
               i=`expr $i + 1`
       done
fi

 

Note:

This keeps the latest $KEEP files by modification time (which is what ls -t sorts by), not by filename! That should suit your needs anyway.
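An equivalent one-liner, if you prefer (a sketch, untested; with KEEP=2, tail -n +3 prints from the third-newest file onwards, i.e. everything beyond the $KEEP newest):

ls -1t $BACKUPDIR | tail -n +`expr $KEEP + 1` | while read f; do rm -f -- "$BACKUPDIR/$f"; done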


thanks z0ny

 

my script now looks like this

 

#!/bin/sh
/bin/tar -jcf /backup/apache_backup_`date | awk '{print $2}'`_`date | awk '{print $3}'`_`date | awk '{print $6}'`.tar.bz2 /usr/local/apache/ >/dev/null 2>&1

# How many files would you like to keep?
KEEP=2
# Where are the files stored?
BACKUPDIR=./backup


# DO NOT CHANGE ANYTHING BELOW
if [ `ls -1 $BACKUPDIR|wc -l` -gt $KEEP ]; then
      i=1
      for each in `ls -1t $BACKUPDIR`; do
              if [ $i -gt $KEEP ]; then
                      rm -f $BACKUPDIR/$each
              fi
              i=`expr $i + 1`
      done
fi

 

cheers

 

anyweb


the mailing script :)

 

#!/bin/sh

#setting paths and files

 

#setting log path

tmp=/tmp

 

file=/backup/apache_backup_`date | awk '{print $2}'`_`date | awk '{print $3}'`_`date | awk '{print $6}'`.tar.bz2

space=`du -sh $file | awk '{print $1}'`

date=`date`

 

echo -n "" >$tmp/apachebackup.txt

echo "Hi anyweb, i am the backup tool from linux-noob.com, i got a new backup for you" >>$tmp/apachebackup.txt

 

echo "the new file is $file and has $space" >>$tmp/apachebackup.txt

echo "the date is $date" >>$tmp/apachebackup.txt

 

cat $tmp/apachebackup.txt | mail -s 'new backup' anyweb@linux-noob.com

 

rm -rf $tmp/apachebackup.txt

 

echo -e "Enjoy"

Edited by KobrAs


yes i want to ftp this backup file to another computer on the network once a week (after the latest file is created)

 

let's say that:-

 

the ftp address is 100.0.0.2

the port is 21

the user is ftpuser

the password is ftppassword

 

how can i get the backup script to automagically ftp the 'latest' or just created backup file to the ftp server given the info above ?

 

i look forward to the answer

 

cheers

anyweb


the lan ftp server is a windows box; it's only a local ftp, with no access from the outside (internet)

 

so i was hoping to modify the above script to do the following:

 

* after the backup file has been created, login to the ftp server, and check if there are any files present

 

* if the number of files present is 2 or less then start uploading the current backup

 

* if the number of files present is 3 or more, delete the oldest file, and start uploading the current backup

 

that's all i want it to do

 

help !

 

cheers

 

anyweb


Hi,

 

I ain't that good with shell scripting, but here's a solution in Perl :)

you can run it from the command prompt:

 

perl -e 'use Net::FTP; $Host = "10.0.0.14"; $username = "test"; $password = "test123"; $ftp = Net::FTP->new($Host, Debug => 0) || die "Could not connect to ftp $@"; $ftp->login($username, $password) || die "Could not login" . $ftp->message; $ftp->put("Test.log", "Test.log") || die "failed to upload" . $ftp->message; $ftp->quit;'

well, that's all that's needed ;)
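If you'd rather stay in shell, the stock ftp client can do the same job with a here-document. This is only a sketch, untested, using the placeholder address and credentials from the question above:

#!/bin/sh
# pick the newest backup; -n stops ftp's auto-login so we can send
# the username and password ourselves
LATEST=`ls -1t /backup/apache_backup_*.tar.bz2 | head -1`
NAME=`basename $LATEST`
ftp -n 100.0.0.2 << EOF
user ftpuser ftppassword
binary
put $LATEST $NAME
bye
EOF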


As an alternative you can always set up an ssh connection between the two computers using ssh-keygen and then use rsync -ave ssh!
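Roughly like this (a sketch; it assumes the other machine runs sshd, which the windows ftp box above won't without extra software, and backup@otherbox is a made-up account and hostname):

# one-time setup: create a key pair and append the public key to the
# remote account's authorized_keys, so no password is needed later
ssh-keygen -t rsa
cat ~/.ssh/id_rsa.pub | ssh backup@otherbox 'cat >> ~/.ssh/authorized_keys'

# then, from cron or the backup script, mirror /backup across
rsync -ave ssh /backup/ backup@otherbox:/backup/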


ok, i'm still playing with this, and now i want to back up the mysql database weekly too, in exactly the same way as i am doing with apache.

 

the command to back up the database is executed from

 

/usr/local/mysql/bin

 

and it's

 

./mysqldump -u USER -p DATA >> forums_export.sql

 

i then have to input the root password

 

so, how can i automate this ? any ideas :)

 

cheers

 

anyweb


you almost figured it out yourself :)

after the -p, enter the password (with no space between -p and the password)

./mysqldump -u USER -pPASSWORD DATA >> forums_export.sql
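alternatively, to keep the password out of the ps output, the credentials can live in root's ~/.my.cnf, which the mysql client programs read automatically (a sketch; USER and PASSWORD are the same placeholders as above):

# /root/.my.cnf  (chmod 600 it!)
[client]
user=USER
password=PASSWORD

after that, a plain ./mysqldump DATA >> forums_export.sql works without prompting.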


ok, i've tried, but it's not working totally right

 

have a look at my script now

 

#!/bin/sh
/bin/tar -jcf /backup/apache_backup_`date | awk '{print $2}'`_`date | awk '{print $3}'`_`date | awk '{print $6}'`.tar.bz2 /usr/local/apache/ >/dev/null 2>&1

# How many files would you like to keep?
KEEP=2
# Where are the files stored?
BACKUPDIR=./backup


# DO NOT CHANGE ANYTHING BELOW
if [ `ls -1 $BACKUPDIR|wc -l` -gt $KEEP ]; then
      i=1
      for each in `ls -1t $BACKUPDIR`; do
              if [ $i -gt $KEEP ]; then
                      rm -f $BACKUPDIR/$each
              fi
              i=`expr $i + 1`
      done
fi

#setting paths and files

#setting log path
tmp=/tmp

file=/backup/apache_backup_`date | awk '{print $2}'`_`date | awk '{print $3}'`_`date | awk '{print $6}'`.tar.bz2
space=`du -sh $file | awk '{print $1}'`
date=`date`

echo -n "" >$tmp/apachebackup.txt
echo "Hi anyweb, i am the backup tool from linux-noob.com, i got a new backup for you" >>$tmp/apachebackup.txt

echo "the new file is $file and has $space" >>$tmp/apachebackup.txt
echo "the date is $date" >>$tmp/apachebackup.txt

cat $tmp/apachebackup.txt | mail -s 'new backup' anyweb@linux-noob.com

rm -rf $tmp/apachebackup.txt

echo -e "Enjoy"

#backup mysql

./mysqldump -u USER -pPASSWORD DATABASENAME >> /backup/forums_export_`date | awk '{print $2}'`_`date | awk '{print $3}'`_`date | awk '{print $6}'`.sql

 

checking /backup i now have as follows:

 

ls -al /backup

total 5433644

drwxr-xr-x 2 root root    4096 Feb  2 18:13 .

drwxr-xr-x  20 root root    4096 Feb  2 18:13 ..

-rw-r--r-- 1 root root 1929071275 Feb  2 18:13 apache_backup_Feb_2_2005.tar.bz2

-rw-r--r-- 1 root root 1814359745 Jan 23 05:33 apache_backup_Jan_23_2005.tar.bz2

-rw-r--r-- 1 root root 1815163094 Jan 30 05:32 apache_backup_Jan_30_2005.tar.bz2

-rw-r--r-- 1 root root          0 Feb  2 18:13 forums_export_Feb_2_2005.sql

there should only be TWO apache backups, and one forums backup. That part looks ok, but the script is erroring out on me...

the errors i get after the script is done are

 

[root@www cron.weekly]# ./apache.cron

ls: ./backup: No such file or directory

Enjoy

./apache.cron: line 44: ./mysqldump: No such file or directory

 

i've fixed the mysqldump error with

 

cd /usr/local/mysql/bin && ./mysqldump......

 

but i don't get the other one ?


Hey, there's a stray . in BACKUPDIR; it should be BACKUPDIR=/backup <--

 

Slight cleaning of the script (which I like btw :))

 

#!/bin/sh

# from and to directories, and tmp directory
BUPDIR='/backup'
SRCDIR='/usr/local/apache'
TMP='/tmp'

# keep how many?
KEEP=2

# DO NOT CHANGE ANYTHING BELOW
##############################

PATH=/bin:/usr/bin:/sbin:/usr/sbin:/usr/local/mysql/bin

# current date
DATE=`date | awk '{ print $3"_"$6 }'`

# filename
FILE="${BUPDIR}/apache_backup_${DATE}.tar.bz2"

# do the backup
tar -jcf $FILE $SRCDIR >/dev/null 2>&1

if [ `ls -1 $BUPDIR | wc -l` -gt $KEEP ]; then
   i=1
   for each in `ls -1t $BUPDIR`; do
       if [ $i -gt $KEEP ]; then
           rm -f -- ${BUPDIR}/${each}
       fi
       let "i = i + 1"
   done
fi

echo "Apache backup completed. Enjoy"

## STAGE 2
# Backup mysql

mysqldump -u USER -pPASSWORD DATABASENAME >> ${BUPDIR}/forums_export.sql

echo "Forums (MySQL) backup completed. Enjoy"

## STAGE 3
# Mail the admin

cat > ${TMP}/backup.txt << EOF
Hi anyweb,

I'm the backup tool from linux-noob.com and I've got a new backup for you.

The date of the backup is ${DATE}

Apache:
The new file is ${FILE} and has `du -sh $FILE | awk '{print $1}'`

Forums:
The forums backup is now: `du -sh ${BUPDIR}/forums_export.sql | awk '{print $1}'`

Have a nice day,
:)

EOF

cat ${TMP}/backup.txt | mail -s 'New Backup' anyweb@linux-noob.com

rm -f -- ${TMP}/backup.txt


thanks znx !

 

now onto my next issue: i've installed Gallery, and it's working nicely; however, having copied all of the 2005 photos over to my personal website, the backup file is going to be close to 7GB in size, so what i want to do is exclude the following path from the backup

 

/usr/local/apache/websites/.../family/gallery2/g2data/albums/2005

 

how do i EXCLUDE the above path and all files in it from the backup script i'm currently using ?

 

thanks in advance !

 

the ... is simply representing some other folders removed for clarity


thanks dude but something is seriously wrong here,

 

i've changed the script and run it to test it; before i ran it there was approx 15GB free on the hdd (if not more)

 

now it's at

[root@www /]# df -h
Filesystem			Size  Used Avail Use% Mounted on
/dev/mapper/VolGroup01-LogVol00
				   36G   27G  6.7G  81% /
/dev/hda1			  99M   23M   72M  24% /boot
/dev/shm			  125M	 0  125M   0% /dev/shm

 

and the script isn't complete yet (it started five or six hours ago....)

 

here's a copy of the script, please tell me what is causing it to nosedive my server

 

#!/bin/sh

# from and to directories, and tmp directory
BUPDIR='/backup'
SRCDIR='/usr/local/apache'
TMP='/tmp'

# keep how many?
KEEP=2

# DO NOT CHANGE ANYTHING BELOW
##############################

PATH=/bin:/usr/bin:/sbin:/usr/sbin:/usr/local/mysql/bin

# current date
DATE=`date | awk '{ print $3"_"$6 }'`

# filename
FILE="${BUPDIR}/apache_backup_${DATE}.tar.bz2"

# do the backup
tar -jcf --exclude /usr/local/apache/websites/kicks-ass/personal/family/gallery2/g2data/albums/2005 $FILE $SRCDIR >/dev/null 2>&1

if [ `ls -1 $BUPDIR | wc -l` -gt $KEEP ]; then
  i=1
  for each in `ls -1t $BUPDIR`; do
   if [ $i -gt $KEEP ]; then
	   rm -f -- ${BUPDIR}/${each}
   fi
   let "i = i + 1"
  done
fi

echo "Apache backup completed. Enjoy"

## STAGE 2
# Backup mysql

mysqldump -u ***** -p***** ***** >> ${BUPDIR}/forums_export_`date | awk '{print $2}'`_`date | awk '{print $3}'`_`date | awk '{print $6}'`.sql

echo "Forums (MySQL) backup completed. Enjoy"

## STAGE 3
# Mail the admin

cat > ${TMP}/backup.txt << EOF
Hi anyweb,

I'm the backup tool from linux-noob.com and I've got a new backup for you.

The date of the backup is ${DATE}

Apache:
The new file is ${FILE} and has `du -sh $FILE | awk '{print $1}'`

Forums:
The forums backup is now: `du -sh ${BUPDIR}/forums_export.sql | awk '{print $1}'`

Have a nice day,
:)

EOF

cat ${TMP}/backup.txt | mail -s 'New Backup' anyweb@linux-noob.com

rm -f -- ${TMP}/backup.txt

 

help !

 

cheers

anyweb


#!/bin/sh

# from and to directories, and tmp directory
BUPDIR='/backup'
SRCDIR='/usr/local/apache'
TMP='/tmp'

# keep how many?
KEEP=2

# DO NOT CHANGE ANYTHING BELOW  <-- this is here for a reason
################################################

PATH=/bin:/usr/bin:/sbin:/usr/sbin:/usr/local/mysql/bin

# current date
DATE=`date | awk '{ print $3"_"$6 }'`

# filename
FILE="${BUPDIR}/apache_backup_${DATE}.tar.bz2"

# do the backup
tar -jcf $FILE --exclude='*/personal/family/gallery2/g2data/albums/2005/*' $SRCDIR >/dev/null 2>&1

if [ `ls -1 $BUPDIR | wc -l` -gt $KEEP ]; then
  i=1
  for each in `ls -1t $BUPDIR`; do
   if [ $i -gt $KEEP ]; then
	   rm -f -- ${BUPDIR}/${each}
   fi
   let "i = i + 1"
  done
fi

echo "Apache backup completed. Enjoy"

## STAGE 2
# Backup mysql

mysqldump -u ***** -p***** ***** >> ${BUPDIR}/forums_export_`date | awk '{print $2}'`_`date | awk '{print $3}'`_`date | awk '{print $6}'`.sql

echo "Forums (MySQL) backup completed. Enjoy"

## STAGE 3
# Mail the admin

cat > ${TMP}/backup.txt << EOF
Hi anyweb,

I'm the backup tool from linux-noob.com and I've got a new backup for you.

The date of the backup is ${DATE}

Apache:
The new file is ${FILE} and has `du -sh $FILE | awk '{print $1}'`

Forums:
The forums backup is now: `du -sh ${BUPDIR}/forums_export.sql | awk '{print $1}'`

Have a nice day,
:)

EOF

cat ${TMP}/backup.txt | mail -s 'New Backup' anyweb@linux-noob.com

rm -f -- ${TMP}/backup.txt

 

should be ok..

 

notice the tar command was mashed together in the wrong order...

 

1. the command is used in this fashion:

 

tar -jcf ARCHIVE --exclude='stuff' DIRS

 

2. it makes things a little easier to read by using * (notice that ' and not " is used!!!!) so the shell leaves the pattern alone and tar expands it itself ..

 

ok.. so i hope that's all good ^_^

 

--exclude dirName

 

add that to the tar command

 

 

/me slaps hijinks around with a rather large trout

 

hehe ;)
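by the way, to check the exclude actually worked, you can list the archive after a run (a quick sanity check; $FILE is the variable from the script above):

tar -jtf $FILE | grep 'albums/2005'

if that prints nothing, the 2005 albums stayed out of the backup.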


thanks znx,

 

now the script is working again; however, there are still some outstanding issues,

 

firstly, it's not completing correctly; it reports an error (when run directly from /etc/cron.weekly/apache.cron)

 

[root@www cron.weekly]# sh apache.cron

Apache backup completed. Enjoy

Forums (MySQL) backup completed. Enjoy

du: cannot access `/backup/forums_export.sql': No such file or directory

 

secondly, the date format is not the way it used to be; i want month/day/year or something along those lines, but instead it now appears as this

 

[root@www backup]# ls -al

total 2739120

drwxr-xr-x 2 root root 4096 Jan 4 10:00 .

drwxr-xr-x 24 root root 4096 Jan 3 16:09 ..

-rw-r--r-- 1 root root 2756244356 Jan 4 10:00 apache_backup_4_2006.tar.bz2

-rw-r--r-- 1 root root 22908879 Jan 3 16:08 forums_export_Jan_3_2006.sql

-rw-r--r-- 1 root root 22932818 Jan 4 10:00 forums_export_.sql

 

so you can see the apache_backup does not list the month, just 4 for the day (the 4th of jan); in addition, the mysql backup does not contain any date whatsoever

 

can you help me resolve these final issues please ?

 

cheers

 

anyweb

Edited by anyweb


i've now altered the script to include the date in the mysql backup filename also,

 

however i still want month/day/year as the format and not the current format

 

see below

 

[root@www backup]# ls -al

total 2739388

drwxr-xr-x 2 root root 4096 Jan 4 12:06 .

drwxr-xr-x 24 root root 4096 Jan 3 16:09 ..

-rw-r--r-- 1 root root 2756485546 Jan 4 12:06 apache_backup_4_2006.tar.bz2

-rw-r--r-- 1 root root 22938116 Jan 4 12:06 forums_export_4_2006.sql

-rw-r--r-- 1 root root 22932818 Jan 4 10:00 forums_export_.sql

 

also, when the job is complete it generates an error

 

[root@www cron.weekly]# sh apache.cron

Apache backup completed. Enjoy

Forums (MySQL) backup completed. Enjoy

du: cannot access `/backup/forums_export.sql': No such file or directory

 

in addition, i don't think it's sending the root email

 

cheers

anyweb


ok..

 

replace a few lines......

 

....
# current date
DATE=`date +"%h_%d_%Y"`
....
mysqldump -u ***** -p***** ***** >> ${BUPDIR}/forums_export_${DATE}.sql
....
The forums backup is now: `du -sh ${BUPDIR}/forums_export_${DATE}.sql | awk '{print $1}'`
....

 

done.. ^_^


thanks again dude

 

here's the complete script for those who would like to use a similar one in the future

 

#!/bin/sh

# from and to directories, and tmp directory
BUPDIR='/backup'
SRCDIR='/usr/local/apache'
TMP='/tmp'

# keep how many?
KEEP=2

# DO NOT CHANGE ANYTHING BELOW
##############################

PATH=/bin:/usr/bin:/sbin:/usr/sbin:/usr/local/mysql/bin

# current date
DATE=`date +"%h_%d_%Y"`

# filename
FILE="${BUPDIR}/apache_backup_${DATE}.tar.bz2"

# do the backup

tar -jcf $FILE --exclude='*/personal/family/gallery2/g2data/albums/2005/*' $SRCDIR >/dev/null 2>&1


if [ `ls -1 $BUPDIR | wc -l` -gt $KEEP ]; then
  i=1
  for each in `ls -1t $BUPDIR`; do
   if [ $i -gt $KEEP ]; then
	   rm -f -- ${BUPDIR}/${each}
   fi
   let "i = i + 1"
  done
fi

echo "Apache backup completed. Enjoy"

## STAGE 2
# Backup mysql

mysqldump -u **** -p***** DATABASENAME >> ${BUPDIR}/forums_export_${DATE}.sql

echo "Forums (MySQL) backup completed. Enjoy"

## STAGE 3
# Mail the admin

cat > ${TMP}/backup.txt << EOF
Hi anyweb,

I'm the backup tool from linux-noob.com and I've got a new backup for you.

The date of the backup is ${DATE}

Apache:
The new file is ${FILE} and has `du -sh $FILE | awk '{print $1}'`

Forums:
The forums backup is now: `du -sh ${BUPDIR}/forums_export_${DATE}.sql | awk '{print $1}'`

Have a nice day,
:)

EOF

cat ${TMP}/backup.txt | mail -s 'New Backup' anyweb@linux-noob.com

rm -f -- ${TMP}/backup.txt

 

works fine now, as you can see below

 

[root@www cron.weekly]# sh apache.cron

Apache backup completed. Enjoy

Forums (MySQL) backup completed. Enjoy

You have mail in /var/spool/mail/root

 

and

 

-rw-r--r--   1 root root 2756189055 Jan  4 15:38 apache_backup_Jan_04_2006.tar.bz2
-rw-r--r--   1 root root   22946133 Jan  4 15:38 forums_export_Jan_04_2006.sql
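one last hint for anyone restoring from these backups: GNU tar strips the leading / when it creates the archive, so extract from / to put everything back where it came from, e.g.

cd / && tar -jxf /backup/apache_backup_Jan_04_2006.tar.bz2

which recreates usr/local/apache (and everything under it) in place.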
