Synology usage series 28 – Mirroring wordpress from webhost to synology box – the automatic way

Part 3 – Synchronization – database

So far we have a plugin that generates a database backup daily, and a script that downloads the backup to our NAS every day.

What we need now is another script to extract the database dump and import it into our local MySQL database.

Create the script

# vi /opt/usr/local/bin/

# Version 0.12
#       2011-05-18      Fix $ez_prefix bug
# Version 0.11
#       2011-05-18      Add $ez_prefix
# Version 0.1
#       2011-05-18

use File::Copy;

# Edit the variables below -- fill in the values for your own setup
$db_dump_path = "";     # folder where the daily backup is downloaded
$tmp_dir      = "";     # temporary working area, must be writable by root
$ez_dir       = "";     # directory inside the zip holding the sql file (plugin default)
$ez_sql       = "";     # name of the sql file generated by the plugin
$ez_prefix    = "";     # prefix of the backup filename generated by the plugin
$mysql_bin    = "";     # path to the mysql client
$sql_user     = "";     # local database user
$sql_pwd      = "";     # password of the database user
$sql_db       = "";     # local database name
$patch_sql    = "";     # data patching script, described below

# variables checking routine

print "Checking variables correctness... ";
$ver_result = checkEnv($tmp_dir,$db_dump_path,$mysql_bin,$patch_sql);
if( $ver_result eq false){
        print "\nPlease double verify values of all variables.\n";
        print "Terminated.\n";
        exit;
}
print "Passed.\n";

$timestamp = generateTimestampString();
$db_dump = generateDBDumpFilename();

#temp hardcode for dev
#print "$timestamp\n";
#print "$db_dump\n";

print "Looking for db dump $db_dump... ";
if( -f "$db_dump_path/$db_dump"){
        print "Found.\n";
        $dump_found = true;
}else{
        print "Not Found.\n";
        $dump_found = false;
}

if($dump_found eq true){
        #create temporary directory
        print "Creating temp directory holding working file... ";
        mkdir "$tmp_dir/$timestamp", 0700 unless -d "$tmp_dir/$timestamp";
        print "Done.\n";

        print "Copying db dump to temp directory... ";
        copy("$db_dump_path/$db_dump","$tmp_dir/$timestamp/$db_dump");
        print "Done.\n";

        print "Unzipping db dump file...\n";
        system("unzip -d $tmp_dir/$timestamp $tmp_dir/$timestamp/$db_dump");

        #verify the unzipped sql script exists and is not empty
        if(-f "$tmp_dir/$timestamp/$ez_dir/$ez_sql"){
                $db_script_size=-s "$tmp_dir/$timestamp/$ez_dir/$ez_sql";
                if($db_script_size > 0){
                        $zip_flag = true;
                        print "Done.\n";
                }else{
                        $zip_flag = false;
                        print "Failed.\n";
                }
        }else{
                $zip_flag = false;
                print "Failed.\n";
        }

        ## ready to import sql script
        if($zip_flag eq true){
                print "Importing data to $sql_db...\n";
                system("$mysql_bin --host=localhost --user=$sql_user --password=$sql_pwd $sql_db < $tmp_dir/$timestamp/$ez_dir/$ez_sql");

                print "Patching data... \n";
                system("$mysql_bin --host=localhost --user=$sql_user --password=$sql_pwd $sql_db < $patch_sql");

                print "Database imported and patched, ready for human verification.\n";
        }

        print "Cleaning up temporary directories... ";
        cleanup("$tmp_dir/$timestamp");
        if(-d "$tmp_dir/$timestamp"){
                print "Failed.\n";
        }else{
                print "Done.\n";
        }
}

print "Terminated.\n";

sub cleanup{
        my ($tmpdir) = @_;
        system("rm -r -f $tmpdir");
}

sub checkEnv{
        my($tmpdir,$dbdir,$mysqlbin,$patch) = @_;
        my $result=true;
        if(! -d $tmpdir){
                print "\nTemporary directory does not exist.";
                $result = false;
        }
        if(! -d $dbdir){
                print "\nDB Dump folder does not exist.";
                $result = false;
        }
        if(! -f $mysqlbin){
                print "\nMySQL executable not found.";
                $result = false;
        }
        if(! -f $patch){
                print "\nData patch sql script not found.";
                $result = false;
        }
        return $result;
}

sub generateDBDumpFilename{
        my ($sec,$min,$hr,$day,$mon,$year) = localtime(time);
        $year += 1900;
        $mon = appendLeadZero($mon + 1);
        $day = appendLeadZero($day);
        # .zip extension assumed, since the dump is unzipped above
        return "$ez_prefix$year-$mon-$day.zip";
}

sub generateTimestampString{
        my ($sec,$min,$hr,$day,$mon,$year) = localtime(time);
        $year += 1900;
        $mon = appendLeadZero($mon + 1);
        $day = appendLeadZero($day);
        $hr = appendLeadZero($hr);
        $min = appendLeadZero($min);
        $sec = appendLeadZero($sec);
        return "$year$mon$day$hr$min$sec";
}

sub appendLeadZero{
        my ($param) = @_;
        if($param < 10){
                $param = '0'.$param;
        }
        return $param;
}

$db_dump_path defines where the script looks for the daily backup file

$tmp_dir defines a temporary directory; it must be writable by root, and a temporary working directory will be created inside it

$ez_dir defines the directory holding the backup file after unzipping; leave it at the default value unless you changed the plugin's setting

$ez_sql defines the actual database backup SQL file generated by the plugin; leave it at the default value unless you changed the plugin's setting

$ez_prefix defines the prefix of the backup filename generated by the plugin

$mysql_bin defines the mysql command

$sql_user defines the local database user

$sql_pwd defines the password of the database user

$sql_db defines the local database

$patch_sql defines the data patching script described below
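Before wiring the script into cron, you can sanity-check the paths from the command line, much like the script's own checkEnv routine does. The paths below are placeholders standing in for your variable values:

```shell
# Check that the paths the Perl script expects actually exist (placeholder paths).
check() {
        if [ -e "$1" ]; then echo "OK: $1"; else echo "MISSING: $1"; fi
}
check "/tmp"                # stands in for $tmp_dir
check "/volume1/backup"     # stands in for $db_dump_path
```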

What does the script do?

- Look up the backup file
- Create a temporary directory
- Copy the backup file to the temporary directory
- Unzip the backup file if it exists
- Import the database script into the local database
- Patch the data
- Remove the temporary directory
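The dump filename the script looks up is derived from the current date. The same name can be reproduced in shell, assuming a `<prefix>YYYY-MM-DD.zip` pattern; the `wp-backup-` prefix here is hypothetical, use your plugin's actual value:

```shell
# Rebuild today's dump filename (assumed pattern: <prefix>YYYY-MM-DD.zip)
ez_prefix="wp-backup-"                      # hypothetical prefix -- use your plugin's value
db_dump="${ez_prefix}$(date +%Y-%m-%d).zip"
echo "$db_dump"
```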

Setup cron job

Again, set up another cron job to execute the database import script. This script must run AFTER the rsync script; otherwise it will never find the backup file, because the backup file has not been downloaded to the NAS yet!

# vi /etc/crontab

The example below executes the script every day at 1pm.

0 13 * * * root /usr/bin/perl /opt/usr/local/bin/ | /opt/bin/nail -s "Sync database done."
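To make the ordering concrete: if the rsync download job from the previous part runs at noon, the two crontab entries together look like the fragment below. The script filenames here are placeholders for illustration, not the actual names from the earlier parts:

```
# /etc/crontab -- download first, import an hour later (placeholder script names)
30 12 * * * root /opt/usr/local/bin/rsync_backup.sh
0 13 * * * root /usr/bin/perl /opt/usr/local/bin/sync_db.pl | /opt/bin/nail -s "Sync database done."
```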

Data Patching

The imported data contains settings for the webhost account. We need to patch the local database because the domain name differs between the webhost and the NAS.

The script above executes a data patching script after importing the database. The patching script is defined by the $patch_sql variable.

Create the data patching script

# vi /opt/usr/local/bin/sync_patch.sql

Define data patching sql

You can put any patching SQL in this file; below is my setup, which patches the wp_options and wp_posts tables.

# replace <webhost-url> and <nas-url> below with your actual URLs
lock tables wp_options write;
update wp_options set option_value = replace(option_value, '<webhost-url>','<nas-url>') where option_value like '%<webhost-url>%' and option_name!='custom_login_settings' and option_name not like '%wp_table_reloaded_data%';
update wp_options set option_value = replace(option_value, '/home/id/public_html','/volume1/web/public_html') where option_value like '%/home/id/public_html%';
unlock tables;

#fixing wp_posts
lock tables wp_posts write;
update wp_posts set post_content = replace(post_content, '<webhost-url>','<nas-url>') where post_content like '%<webhost-url>%';
unlock tables;

My script above patches all WordPress options except those of the 'Custom Login' and 'WP Table Reloaded' plugins.

Feel free to modify it for your setup.
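Before touching the database, you can preview what the SQL replace() calls will do by running the same substitution over a sample value in shell; sed here is only a stand-in for MySQL's replace(), and the sample string is made up:

```shell
# Preview the substitution the patch applies, using sed as a stand-in for replace().
old="/home/id/public_html"
new="/volume1/web/public_html"
sample="upload_path=/home/id/public_html/wp-content/uploads"
echo "$sample" | sed "s|$old|$new|g"
# prints: upload_path=/volume1/web/public_html/wp-content/uploads
```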

Now my webhost WordPress site is mirrored to my NAS and updates itself every day 🙂
