backup/restore remotely



#1 feedmebits

    Linux-Noob Senior Member

  • Moderator
  • 673 posts

Posted 13 October 2011 - 12:54 PM

I'm looking for some different ways to back up and restore files from a remote server, so that in case I mess something up really badly, I'll be able to restore instead of having to do a full reinstall. I'll check Google later.

#2 hybrid

    Linux-Noob Frequent Member

  • Admin
  • 1,009 posts

Posted 13 October 2011 - 04:14 PM

I use a few tools for server backup and restore:
  • rsync
  • tar
  • partimage
rsync

rsync is a tool which synchronises files and folders from one place to another. You can use it locally, or across remote machines. It is particularly good because it only transfers the files that have changed -- and even then only the changed portions of those files -- so it saves a lot of unnecessary transfer when used remotely, or on a slow USB drive.

I use rsync to keep backups of whole trees of folders and files, for example, /var/www to /backup/var/www. If I have a problem and want to revert the whole folder back to the backup, I can simply reverse the direction in which rsync runs, and it synchronises the backup copy to the live copy, 'undo'-ing all of the changes on the live copy.

We use rsync for this site, keeping seven copies of the web server data folder, one for each day of the week. rsync runs each day, and synchronises just the changes that have happened in the past week to the backup folder for that day of the week. This way, we have seven full backups of the files and can restore from any of them, but the CPU and disk I/O impact of the backup is pretty low, since we're only backing up relatively small changes each night!
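
For illustration, a minimal sketch of that kind of rsync run (the paths and the day-of-week naming are just examples; note that --delete removes files from the destination that no longer exist in the source, so double-check the direction before running it):

# rsync -a --delete /var/www/ /backup/www-$(date +%A)/

and to restore, simply reverse the direction:

# rsync -a --delete /backup/www-Monday/ /var/www/

The trailing slashes matter: they tell rsync to synchronise the contents of the directories rather than the directories themselves.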

tar

tar is great for taking a snapshot of some files and folders and storing them in one file. It's easy to keep track of the files, it's simple and ubiquitous, and you can easily apply compression to the archived files to reduce disk space requirements. It's also more difficult to accidentally instruct it to do something destructive when creating a backup (such as synchronising in the wrong direction with rsync!). The primary disadvantage is that you create a brand new tar archive each time you want to do a backup, which uses more transfer and CPU time.

I use tar backups as well, so that I have different types of backups from which I can restore, in case any one of them goes wrong!

partimage

This probably won't apply in your case, since you seem to be interested in backing up files/folders, but for completeness, I'll mention it. partimage dumps an entire partition on disk into a file, optionally with compression. This means that the entire operating system can be backed up and restored exactly as it is. I use this from time to time so that I can restore the whole OS very rapidly and not do a reinstall (after doing that, I might then update the web server documents with the most recent tar backup, for example, so that it is then up-to-date).

The main disadvantage is that you have to backup the whole partition each time, and you can't run partimage to backup or restore a mounted partition -- so I must reboot from a live CD, for example, to create or restore the backup.
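
For reference, a sketch of how that looks (the device and filenames are examples; partimage may split the image into volumes with a numeric suffix such as .000):

# partimage save /dev/sda1 /backup/sda1.partimg.gz

and to restore:

# partimage restore /dev/sda1 /backup/sda1.partimg.gz.000

Both commands must be run with the partition unmounted -- hence the live CD.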

--

I'm sure there are also many other great tools that I haven't thought of (or innovative ways to use the ones I have mentioned). Exactly which tool you will pick depends on what your requirements and priorities for this particular backup scenario are. :)

#3 Dungeon-Dave

    Linux-Noob Frequent Member

  • Moderator
  • 972 posts

Posted 13 October 2011 - 05:50 PM

Are you looking to remotely fire off a backup (e.g. webmin), or looking to back up from one server onto another?

To add to hybrid's list: there are also sbackup, rbackup, simple-backup, etc. -- there are many tools that allow a backup over sftp/rsync.

Do you know how much data you seek to back up?

#4 feedmebits

    Linux-Noob Senior Member

  • Moderator
  • 673 posts

Posted 13 October 2011 - 05:54 PM

Are you looking to remotely fire off a backup (e.g. webmin), or looking to back up from one server onto another?

To add to hybrid's list: there are also sbackup, rbackup, simple-backup, etc. -- there are many tools that allow a backup over sftp/rsync.

Do you know how much data you seek to back up?


It's more for my web server project: if I screw something up, I can restore it. I was thinking mainly of my /etc files and my web data, which is in /home/user/html/website/htdocs. Right now it's not much. It's more about knowing how to do it, so that when I make a mistake I don't have to reconfigure everything, and so that if I make a mistake on my website that isn't easily undoable, I can restore it. And of course also making backups of (and restoring) my MySQL DBs.

#5 hybrid

    Linux-Noob Frequent Member

  • Admin
  • 1,009 posts

Posted 13 October 2011 - 06:14 PM

At a very simple level, you can begin by just doing manual tar backups of those directories:

# cd /
# tar --same-owner -czpvf /backup/etc.tar.gz etc

I personally use the --same-owner and -p switches so that my tar archives preserve the exact permissions. The other switches:

-c -- create a new archive
-z -- use gzip compression
-p -- preserve permissions, as I mentioned
-v -- verbose progress, so I can see all the files as they are added to the archive
-f -- the archive created should have the file path and file name that immediately follows the -f

Then we have the target filename for the new backup file, followed by the source directory to backup.

(We change directory to one level above the source directory to make the internal structure of the archive a little cleaner.)

It is very similar for other directories:

# cd /home/user/html/website
# tar --same-owner -czpvf /backup/htdocs.tar.gz htdocs

MySQL backups can be done with the excellent mysqldump. A basic example:

# mysqldump -u root -p -A > /backup/mysql_dbs.sql

Obviously, all these examples will need to have filenames altered each time you run them, to avoid overwriting your old backups, and they have to be run manually! If you're doing this a lot, it's useful to put this sort of thing together into a script that runs automatically, so you don't even have to think about it!
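
As a sketch of what such a script might look like (the paths are placeholders, and it assumes the MySQL credentials are stored in /root/.my.cnf so that mysqldump doesn't prompt for a password):

#!/bin/sh
# Nightly backup sketch -- the date stamp in the filenames avoids
# overwriting older backups. (-v dropped, since nobody watches cron output.)
DATE=$(date +%Y-%m-%d)
cd / && tar --same-owner -czpf /backup/etc-$DATE.tar.gz etc
cd /home/user/html/website && tar --same-owner -czpf /backup/htdocs-$DATE.tar.gz htdocs
mysqldump -A > /backup/mysql_dbs-$DATE.sql

Drop something like that into /etc/cron.daily (or a crontab entry) and it runs without you having to think about it.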

If you wanted to restore one of the tar backups, you would do the following. Note that we'll restore into somewhere temporary -- you can then check everything looks good after extracting the files before moving them over to the 'live' location:

# cd /tmp
# mkdir restore
# cd restore
# tar --same-owner -xzpvf /backup/htdocs.tar.gz

In this example, the archive will be restored to /tmp/restore/htdocs. Note that the only real difference is we don't provide the destination directory (tar assumes we want to extract to the current working directory), and we have the -x switch for extract.
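
If everything in /tmp/restore/htdocs checks out, you can then move it over the live copy, for example with rsync (the paths are the ones from earlier in this thread):

# rsync -a /tmp/restore/htdocs/ /home/user/html/website/htdocs/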

#6 feedmebits

    Linux-Noob Senior Member

  • Moderator
  • 673 posts

Posted 13 October 2011 - 06:20 PM

I'll have a look at that. And how do I, for example, back up from my server to my own PC?

#7 hybrid

    Linux-Noob Frequent Member

  • Admin
  • 1,009 posts

Posted 13 October 2011 - 06:24 PM

If you're running an SSH daemon on your server, you could use scp to 'pull' the files down from the server and onto your local system.

$ scp -P your_ssh_port your_username@your_server:/backup/htdocs.tar.gz htdocs.tar.gz

That would bring down the specified file on the remote machine and store it in the current directory on your local system.

#8 Dungeon-Dave

    Linux-Noob Frequent Member

  • Moderator
  • 972 posts

Posted 13 October 2011 - 06:53 PM

Are you looking to remotely fire off a backup (e.g. webmin), or looking to back up from one server onto another? To add to hybrid's list: there are also sbackup, rbackup, simple-backup, etc. -- there are many tools that allow a backup over sftp/rsync. Do you know how much data you seek to back up?

It's more for my web server project: if I screw something up, I can restore it. I was thinking mainly of my /etc files and my web data, which is in /home/user/html/website/htdocs. Right now it's not much. It's more about knowing how to do it, so that when I make a mistake I don't have to reconfigure everything, and so that if I make a mistake on my website that isn't easily undoable, I can restore it. And of course also making backups of (and restoring) my MySQL DBs.


If you've documented what you've done to build your server/site up until now, you've got a "recovery plan" - so you know how to bring a base install back up to your working webserver.

Basically, it's a case of including those differences (config files in /etc, content in htdocs, as you said) into a backup plan and having some plan or script to do the rebuild. I've done this a few times on some boxes to migrate servers and rebuild a new platform after an OS upgrade.
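
As a sketch of what that rebuild script might look like (the package manager, package names, paths and archive names are all assumptions, based on the examples earlier in this thread):

#!/bin/sh
# Rebuild sketch: from a fresh base install back to a working web server.
yum -y install httpd mysql-server php                    # the packages you documented
tar --same-owner -xzpf /backup/etc.tar.gz -C /           # restore config files into /etc
tar --same-owner -xzpf /backup/htdocs.tar.gz -C /home/user/html/website
mysql -u root -p < /backup/mysql_dbs.sql                 # replay the mysqldump

One caution: blindly extracting old config files over a newer OS can bite you, so review what you restore into /etc.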

#9 feedmebits

    Linux-Noob Senior Member

  • Moderator
  • 673 posts

Posted 13 October 2011 - 07:03 PM

Sorry Dave -- more like backing up from one server to another, as in from my remote server to my PC (another server). Yeah, that's true, lol: as long as I have it documented, I have a basic backup plan. I'll have a look at the ones you both mentioned and let you know once I've tried some of them out :)

#10 Dungeon-Dave

    Linux-Noob Frequent Member

  • Moderator
  • 972 posts

Posted 13 October 2011 - 07:05 PM

As well as using scp, there's also lftp or wget to pull files over.

If you have SSH key authentication set up, scp and sftp-based tools will log in seamlessly -- no password required. This is how I pull backups across servers.
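
Setting up key authentication takes a minute (a sketch; the hostname and port are placeholders):

$ ssh-keygen -t rsa
$ cat ~/.ssh/id_rsa.pub | ssh -p your_ssh_port your_username@your_server 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'

After that, scp/sftp/rsync against that server won't prompt for a password (unless you set a passphrase on the key).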

#11 feedmebits

    Linux-Noob Senior Member

  • Moderator
  • 673 posts

Posted 04 February 2012 - 08:14 PM

Finally wrote a to-do list, and learning about backups and restores is first on my list, so this will be a useful post.

#12 feedmebits

    Linux-Noob Senior Member

  • Moderator
  • 673 posts

Posted 04 February 2012 - 10:11 PM

Works great. I made a backup of my web data and dumps of my MySQL DBs. Now I can FTP them over to my own PC and keep backups there, although I need to change the file permissions before I do.

#13 feedmebits

    Linux-Noob Senior Member

  • Moderator
  • 673 posts

Posted 05 February 2012 - 04:17 PM

What's the advantage of backing up a MySQL DB with a dump rather than making a tar archive? You can restore from both, I would think.

#14 Dungeon-Dave

    Linux-Noob Frequent Member

  • Moderator
  • 972 posts

Posted 05 February 2012 - 06:37 PM

  • "tar" takes a series of files and creates one file from them.
  • "mysqldump" dumps the contents of several tables/databases and creates one file from it.

A restore from a tar archive involves extracting the archive into a directory (and optionally moving the contents back to their original location, once verified).

A restore from a mysqldump file involves connecting to the database then "executing" (sourcing) the dump file - the file itself is essentially a load of SQL statements that recreates tables and constraints, restoring the data via a pile of insert statements.
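
For example (the database name and credentials are placeholders):

$ mysql -u your_username -p your_database < backup.sql

or, from inside the mysql client:

mysql> source backup.sql;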

They do the same kind of thing, just in a different context - one is for content held in directories, the other for content held in databases.

#15 feedmebits

    Linux-Noob Senior Member

  • Moderator
  • 673 posts

Posted 05 February 2012 - 06:53 PM

Ah, I see. Which is more practical for backing up and restoring?

#16 Dungeon-Dave

    Linux-Noob Frequent Member

  • Moderator
  • 972 posts

Posted 05 February 2012 - 08:42 PM

For databases... mysqldump.

For files... tar. (or rsync)

Both files can be compressed (gzip, bzip2) to reduce their size.
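
For example:

$ gzip /backup/mysql_dbs.sql      # produces mysql_dbs.sql.gz
$ gunzip /backup/mysql_dbs.sql.gz # and back again

(The tar examples earlier in the thread already compress on the fly with -z.)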

#17 hybrid

    Linux-Noob Frequent Member

  • Admin
  • 1,009 posts

Posted 06 February 2012 - 10:21 PM

A mysqldump as a .sql file is a much better prospect for restoring than tarring up the /var/lib/mysql/data (or wherever) directory.

If you run the tar of the MySQL data while the MySQL daemon is running, the data directory will be in an unknown state -- perhaps there are locks on tables, or tar runs at just the moment a new row is inserted, and your database backup is inconsistent and possibly useless. Even if the MySQL daemon isn't running, it's not good practice.

Running a proper MySQL dump avoids all of these issues -- dumping the data exactly as it is, in a format where MySQL can just execute the SQL statements, one by one, to recreate the databases at the exact time of the snapshot.
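
As an aside, if your tables are InnoDB, mysqldump can take a consistent snapshot without locking them while the daemon runs (same credentials assumption as the earlier example):

# mysqldump -u root -p --single-transaction -A > /backup/mysql_dbs.sql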

#18 feedmebits

    Linux-Noob Senior Member

  • Moderator
  • 673 posts

Posted 12 February 2012 - 04:14 PM

I succeeded in doing my backup and restore!

Backing up MySQL with mysqldump:

mysqldump -u test_dba -pPassword1234 feedtest_db > testbackup.sql

Backing up htdocs (following hybrid's tip of changing into the directory above first; note that the archive filename has to come straight after -f, before the source directory):

cd /home/user/website
tar -czpvf /home/other-user/testbackup-htdocs.tar.gz htdocs

Then I copy these files over to my VirtualBox test machine, where I create the database and database user. Then, after that is done:

Restore db:
mysql -u testuser -ppassword feedtest_db < testbackup.sql

Then restore your web data:
tar -zxpvf testbackup-htdocs.tar.gz -C /home/user/website

Then you can check whether you can get to your website. At first you may see the following error:

Database connection error (2): Could not connect to MySQL

The problem, as I figured out, is that if you set up the DB and DB user on the test machine with different credentials, they won't match the ones in the restored site files. All you have to do is edit Joomla's configuration.php (in this case) and replace the user/password with the correct credentials, and your site restore will have worked.
"Your heart is free, have the courage to follow it"

#19 hybrid

    Linux-Noob Frequent Member

  • Admin
  • 1,009 posts

Posted 13 February 2012 - 09:40 PM

Nice to see you've got it working!

