Does anybody know a good script to back up files and a database and upload them to a remote FTP server?

Status: Not open for further replies.

Zeokat
I'm looking for a script that backs up a MySQL database and the files in a folder, then uploads them to my remote server over FTP. With this solution I'd have a daily backup on another server, which I could restore very quickly if my main VPS goes down.

At the moment i found this: http://www.cyberciti.biz/tips/how-t...rver-files-to-a-ftp-server-automatically.html

I still have to test it; if you know any other good alternative, please reply with it.

Thanks in advance.
 
Thanks masterbator, something simple is twice as good ;)

It could be improved to back up files too.


Well... I modified the code provided by masterbator to fit my needs and fixed some issues on my CentOS 5 x64 machine. Fixes:

- Tar shows a warning when given absolute paths (the files are compressed fine anyway, but I don't like seeing warning messages, so I decided to fix it).

- Changed the ".tgz" extension of the compressed files to ".tar.gz". Both are equivalent, correct, working extensions, but I have always used ".tar.gz" and it causes no problems unless you go back to a very, very old OS that doesn't allow more than one dot or more than three letters in an extension (MS-DOS, for example).

- Changed "-" to "_", since "-" characters in filenames can cause issues.

- The ftp client on my CentOS 5 x64 (using the same code as masterbator's) uploaded corrupted files to my remote server, so I switched from ftp to ncftp (installed by default on my CentOS 5 x64). On the remote server I'm running vsftpd as the FTP server/daemon.

- I use the /tmp folder as the temporary location to create and delete backups; it seems a safer place to play with the "rm -rf" command.

- The line "rm -rf /your/path/*" was a little hardcore for me, so I used the filename prefix, the "*" wildcard, and the file extension instead. Less hardcore.

- Added support for backing up the files in my public_html folder.

- Added support for uploading both the compressed and the uncompressed database backup (I'm paranoid, but this way I have a safe backup in my hands ;)).
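A quick illustration of the tar warning fix mentioned above: the warning comes from leading "/" in the archived paths, and changing to the parent directory first (with `cd`, as the script below does, or with tar's `-C` option) makes tar store relative paths instead. A minimal round-trip sketch — all the /tmp/demo paths here are hypothetical, just for the demonstration:

```shell
#!/bin/sh
set -e

# Create a throwaway tree to back up (hypothetical paths)
mkdir -p /tmp/demo/public_html
echo "hello" > /tmp/demo/public_html/index.html

# -C makes tar archive "public_html/..." instead of "/tmp/demo/public_html/...",
# so it prints no "Removing leading `/'" warning
tar -zcf /tmp/demo_backup.tar.gz -C /tmp/demo public_html

# Restore into a fresh directory and verify the file survived the round trip
mkdir -p /tmp/demo_restore
tar -zxf /tmp/demo_backup.tar.gz -C /tmp/demo_restore
cat /tmp/demo_restore/public_html/index.html   # prints "hello"

# Clean up
rm -rf /tmp/demo /tmp/demo_restore /tmp/demo_backup.tar.gz
```

Restoring on the other server is the same `tar -zxf ... -C <target>` step, so this also checks the archive is actually usable.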

The code (remember to edit the script with your own paths, database credentials, and FTP user/pass — the placeholder values):

Code:
#!/bin/bash
### Dump the MySQL database and compress it ###
echo "Starting database backup"
mysqldump -uYour_db_User -pYour_db_password -hlocalhost Your_db > /tmp/sql_backup_`date +%d%m%y`.sql
cd /tmp
tar -zcf sql_zipped_backup_`date +%d%m%y`.tar.gz sql_backup_*.sql
echo "Database backup completed"

### Compress the files ###
echo "Starting files backup"
cd /
tar -zcf /tmp/filesbackup_`date +%d%m%y`.tar.gz home/www/web.com/public_html
echo "Files backup completed"

### Upload to the remote FTP server ###
cd /tmp
HOST='100.200.200.105'
USER='ftp_user'
PASSWD='ftp_pass'
ncftp -u "$USER" -p "$PASSWD" "$HOST" <<EOF
put sql_zipped_backup_*.tar.gz
put filesbackup_*.tar.gz
put sql_backup_*.sql
quit
EOF

### Delete the local backup files ###
echo "Start cleaning backup files"
cd /tmp
rm -f sql_zipped_backup_*.tar.gz
rm -f sql_backup_*.sql
rm -f filesbackup_*.tar.gz
echo "Cleaning backup files completed"

exit 0
Enjoy.
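To get the daily backup the OP asked for, the script can be run from cron. A sketch, assuming the script above is saved as /root/backup.sh (a hypothetical path) and made executable with `chmod +x /root/backup.sh`:

```shell
# Run the backup every night at 03:00 and log its output
# (add this line with `crontab -e` as root)
0 3 * * * /root/backup.sh >> /var/log/backup_script.log 2>&1
```

Pick a low-traffic hour; note that the date-based filenames mean each night produces a new set of archives on the FTP server, so old ones will accumulate there unless you rotate them.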
 
I'm planning on using this for a backup system; it supports almost everything.

https://github.com/meskyanichi/backup

Even Amazon.

Install Ruby

$ sudo yum groupinstall 'Development Tools'
$ sudo yum install readline-devel
$ cd /usr/local/src
$ wget ftp://ftp.ruby-lang.org/pub/ruby/1.9/ruby-1.9.1-p376.tar.gz
$ tar xzvf ruby-1.9.1-p376.tar.gz
$ cd ruby-1.9.1-p376
$ ./configure && make
$ sudo make install

Source: http://mjijackson.com/2010/02/ruby-1.9-centos-5

Install Backup

gem install backup

Source: https://rubygems.org/gems/backup

or use a Bash script like the one supplied above.

Regards,
Cory
 