How to backup the Synology Server to Amazon S3


This is an instruction guide on how to modify the Synology server so it backs up files to Amazon S3, an online storage service offered by Amazon.com.

To back up (synchronize) your data from your Synology box to Amazon S3, you'll need to install some additional software on the Synology, such as OpenSSL, wget-SSL, Python or Ruby, and s3cmd or s3sync. The installation is detailed below.

The how-to is very simple, although it may look daunting. Just try it. It is rather detailed (and thus quite long), too. I hope it gets even a rather inexperienced user going. That said, if you don't know what telnet is, how to unpack a compressed file, how to move around in the shell, or what Amazon S3 or the "public" directory are, don't try this.

Yes, you guessed it: English is not my mother tongue. Bear with me. Let's go:

Activate Telnet

- Synology instructions (new, good)
- Extensive Instructions
- Concise Instructions


Sources:
- Get the official patch here
- Get the unofficial Patch here
Remark: The unofficial patch does not require a restart; it quits with the error message "42" (get the joke?), but that's about it.
Choose your flavor.
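
Once telnet is active, it is worth a quick test before moving on. A minimal check (the IP address is a placeholder; on these firmware versions you normally log in as root, using the same password as the admin account):

> telnet <ip_of_your_Synology>
> uname -m

The second command prints the CPU architecture, which also helps you pick the right ipkg bootstrap in the next step.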

Install Package Management ipkg (including OpenSSL and wgetSSL)

Download the ipkg package for the correct architecture of your Synology box (the following instructions show the PowerPC version, for the x06 series) and put it in the "public" directory on your Synology.

- Log in via telnet, change to the "public" directory and call the installation routine of ipkg:

> telnet <ip_of_your_Synology>
> cd /volume1/public
> sh ds101-bootstrap_1.0-4_powerpc.xsh
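
If the shell complains afterwards that ipkg cannot be found, the /opt directories are most likely not in your PATH yet; the bootstrap normally takes care of this via /etc/profile, so either log out and back in, or extend the PATH for the current session by hand:

> export PATH=/opt/bin:/opt/sbin:$PATH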

- Update the list of available packages:

> ipkg update

- Update the (just installed) packages if new versions exist:

> ipkg upgrade
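
The introduction mentioned OpenSSL and wget-SSL. Depending on the bootstrap they may already have been pulled in; you can check with ipkg and install them explicitly if they are missing (package names as found in the usual Optware feeds):

> ipkg list_installed | grep -e openssl -e wget
> ipkg install openssl wget-ssl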


Sources:
- Homepage of ipkg-Packages
- Binary of ipkg Version 1.0-4 for PowerPC
- Binary of ipkg Version 1.0-4 for ARM (syno-x07)



Use s3cmd (Python)

Install Python

Log in via telnet (if not still logged on) and install Python:

> ipkg install python
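
A quick check that the freshly installed interpreter is the one from ipkg (Optware installs under /opt, so the binary ends up in /opt/bin):

> /opt/bin/python -V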


Install s3cmd and configure it

Download the s3cmd archive and unpack it where you want your binaries to go. For now, let's assume that is your "public" directory.

Go to http://s3tools.org/download and download the latest version. The files are hosted on SourceForge, so the easiest way is to download the archive on your computer first and then copy it to the Synology. You can use NFS or File Station to upload it to the box.

> cd /volume1/public
> cp SHARE_LOCATION/s3cmd-0.9.9.tar.gz s3cmd-0.9.9.tar.gz
> tar -xzf s3cmd-0.9.9.tar.gz
> rm s3cmd-0.9.9.tar.gz
> mv s3cmd-0.9.9 s3cmd
> cd s3cmd
> ./s3cmd --configure
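
s3cmd --configure asks interactively for your Access Key ID and Secret Access Key (plus a few optional settings such as encryption and HTTPS) and writes them to a configuration file in the home directory of the user you are logged in as (~/.s3cfg). Since the cron job later runs as root, run the configuration as root, too. A short sanity check afterwards; the bucket name is just a placeholder:

> ./s3cmd ls
> ./s3cmd mb s3://<your_bucket>
> ./s3cmd put /etc/hosts s3://<your_bucket>/hosts-test
> ./s3cmd ls s3://<your_bucket>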


You can use s3cmd from the command line or from scripts.
To execute a backup/synchronization every so often, I've written a small script and added it to the /etc/crontab of the Synology.


Example entry in /etc/crontab:

#  Execute every second day at 01:30 (am) as root
30    1    *    *    0,2,4,6    root    /volume1/public/s3cmd/sync_some_directories.sh

Note: the Synology crond is picky about whitespace; the fields in /etc/crontab are usually separated by tabs, so if the job never fires, check that you did not paste spaces instead.


To restart the cron daemon, issue the following commands:
/usr/syno/etc.defaults/rc.d/S04crond.sh stop
/usr/syno/etc.defaults/rc.d/S04crond.sh start
(kudos to ian for the tip; see here)


There are different ways to store your AWS credentials (access key and secret key):
- You could set them in your /etc/profile if shell access is restricted to you or to people you trust with that information. I would not.
- You can use a configuration file and secure it with Unix permissions; this is what s3cmd --configure does (it writes ~/.s3cfg for the user running it). Might be your way.
- For s3sync (described further down), I keep the variables in my scripts and export them. See the example in that section.

#!/bin/sh

V_DATA=/volume1
V_LOGS="/volume1/tmp/s3cmd/${0##*/}.log"

SCRIPT_DIR=`dirname $0`
SCRIPT_NAME=`basename $0`
#  to start scripts from crontab, extend the path to include the binaries of all the ipkg-packages (python, OpenSSL, ...)
export PATH=/opt/bin:/opt/sbin:$SCRIPT_DIR:$PATH

#  Check if an s3cmd sync is running already; if so, quit
if ps | grep "s3cmd" | grep -qv "grep"; then
    exit 0
fi

echo "Starting sync..."
#  Sync the following directory
s3cmd sync --delete-removed --exclude '@eaDir*' --exclude 'Thumbs.db' $V_DATA/<dir_to_backup>/ s3://<bucket>/<dir_to_backup>/ >> $V_LOGS


#  Archive log, if not empty, and add date and time to file name
[ -s $V_LOGS ]  &&  mv $V_LOGS  "${V_LOGS%/*}/${0##*/}`date +_%d.%m-%H`.log"  ||  rm $V_LOGS


Adjust the script to your liking.
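
Before leaving the job to cron, give the script one manual test run (the paths below are just the ones assumed in this example; note that the log directory /volume1/tmp/s3cmd has to exist, otherwise the redirection into $V_LOGS fails):

> mkdir -p /volume1/tmp/s3cmd
> chmod +x /volume1/public/s3cmd/sync_some_directories.sh
> /volume1/public/s3cmd/sync_some_directories.sh
> ls /volume1/tmp/s3cmd/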

Once you have a working script that's periodically called from crontab, you have an automatic backup of your Synology box to Amazon's S3 service.




Use s3sync (ruby)

Note: this does not work with Ruby 1.9!

Install ruby

Log in via telnet (if not still logged on) and install ruby:

> ipkg install ruby 
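
Since s3sync does not work with Ruby 1.9 (see the note above), it is worth checking which version ipkg installed before continuing:

> /opt/bin/ruby -v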


Install s3sync and certificate

Download the s3sync archive and unpack it where you want your binaries to go. For now, let's assume that is your "public" directory.

> cd /volume1/public
> wget http://s3.amazonaws.com/ServEdge_pub/s3sync/s3sync.tar.gz
> tar -xzf s3sync.tar.gz
> rm s3sync.tar.gz

Download the certificate file. You need this certificate to use SSL-secured transfers (the option "--ssl" or "-s"). Of course you can use your own certificate instead.

> cd /volume1/public/s3sync/
> wget http://s3.amazonaws.com/ServEdge_pub/s3sync/certs/ca-certificates.crt
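
Before scripting anything, you can check that your credentials and the certificate actually work. s3sync ships a small command line tool, s3cmd.rb (not to be confused with the Python s3cmd from the previous section); a quick test, using the same environment variables that are described further down:

> cd /volume1/public/s3sync
> export AWS_ACCESS_KEY_ID=<enter_your_access_key_here>
> export AWS_SECRET_ACCESS_KEY=<enter_secret_key_here>
> export SSL_CERT_FILE=/volume1/public/s3sync/ca-certificates.crt
> /opt/bin/ruby s3cmd.rb -s listbuckets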


Sources:
- s3sync Binaries
- Certificate


Automating the Backup


You can use the two s3sync-ruby-scripts s3sync.rb and s3cmd.rb from the command line or from scripts.
To execute a backup/synchronization every so often, I've written some scripts and added them to the /etc/crontab of the Synology.


Example entry in /etc/crontab:

#  Execute every second day on 01:30 (am) as root
30    1    *    *    0,2,4,6    root    /volume1/public/s3sync/sync_some_directories.sh  &>/dev/null


To restart the cron daemon, issue the following commands:
/usr/syno/etc.defaults/rc.d/S04crond.sh stop
/usr/syno/etc.defaults/rc.d/S04crond.sh start
(kudos to ian for the tip; see here)



The ruby script expects some environment variables to work. Those are:
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
- SSL_CERT_FILE
- S3SYNC_NATIVE_CHARSET


There are different ways to set those:
- You could set them in your /etc/profile if shell access is restricted to you or to people you trust with that information. I would not.
- You can use a configuration file and secure it with Unix permissions. Might be your way.
- I'm keeping the variables in my s3sync scripts and exporting them. See the example below.

#!/bin/sh

V_DATA=/volume1/public
V_LOGS="/volume1/public/s3sync/${0##*/}.log"
export AWS_ACCESS_KEY_ID=<enter_your_access_key_here>
export AWS_SECRET_ACCESS_KEY=<enter_secret_key_here>
export SSL_CERT_FILE=/volume1/public/s3sync/ca-certificates.crt
export S3SYNC_NATIVE_CHARSET=UTF-8
#  to start scripts from crontab, extend the path to include the binaries of all the ipkg-packages (ruby, OpenSSL, ...)
export PATH=/opt/bin:/opt/sbin:$PATH
#  change to the s3sync directory so ./s3sync.rb is found when the script is started from crontab
cd `dirname $0`


#  Check if a ruby-script is running already; if so, quit
ps  |  grep "ruby"  |  grep -qv "grep"  &&  exit 0


#  Sync the following directory
./s3sync.rb $1 -v -r -s --delete --exclude="@eaDir"  $V_DATA/<dir_to_backup>/  <target_backup>:   >> $V_LOGS
<add_any_number_of_s3sync-lines_here_to_sync_any_number_of_directories>


#  Archive log, if not empty, and add date and time to file name
[ -s $V_LOGS ]  &&  mv $V_LOGS  "${V_LOGS%/*}/${0##*/}`date +_%d.%m-%H`.log"  ||  rm $V_LOGS


Adjust the script to your liking.
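
As with the s3cmd variant, give the script one manual test run before relying on cron (the path is the one used in the crontab example above):

> chmod +x /volume1/public/s3sync/sync_some_directories.sh
> /volume1/public/s3sync/sync_some_directories.sh
> ls /volume1/public/s3sync/*.log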

Once you have a working script that's periodically called from crontab, you have an automatic backup of your Synology box to Amazon's S3 service.



DONE!! HAVE FUN!!
END OF CONFIGURATION INSTRUCTIONS.


External Links

Ask your questions here, in the Original Forum thread
