Automatically Backing Up cPanel Data to AWS

All of SHU88's Magento hosting accounts are automatically backed up to AWS cloud storage.

First, make sure you have an AWS S3 account and have created a bucket. This example uses s3cmd to back up the data on a cPanel server.

1. Install s3cmd

cd /etc/yum.repos.d
# Download the yum repo definition for CentOS 5
wget http://s3tools.org/repo/RHEL_5/s3tools.repo
# or, for CentOS 6
wget http://s3tools.org/repo/RHEL_6/s3tools.repo
yum install s3cmd

After installation, configure s3cmd:

s3cmd --configure

then enter your Access Key and Secret Key when prompted:

$ s3cmd --configure
Enter new values or accept defaults in brackets with Enter.
Refer to user manual for detailed description of all options.

Access key and Secret key are your identifiers for Amazon S3
Access Key: XXXXXXXXXXXXXX
Secret Key: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

Encryption password is used to protect your files from reading
by unauthorized persons while in transfer to S3
Encryption password: XXXXX
Path to GPG program [/usr/bin/gpg]: 

When using secure HTTPS protocol all communication with Amazon S3
servers is protected from 3rd party eavesdropping. This method is
slower than plain HTTP and can't be used if you're behind a proxy
Use HTTPS protocol [No]: yes

New settings:
  Access Key: XXXXXXXXXXXXXX
  Secret Key: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
  Encryption password: XXXXX
  Path to GPG program: /usr/bin/gpg
  Use HTTPS protocol: True
  HTTP Proxy server name:
  HTTP Proxy server port: 0

Test access with supplied credentials? [Y/n]
Please wait...
Success. Your access key and secret key worked fine :-)

Now verifying that encryption works...
Success. Encryption and decryption worked fine :-)

Save settings? [y/N] y
Configuration saved to '/home/saltycrane/.s3cfg'
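The answers are written to `.s3cfg` in the user's home directory (the transcript above shows where it was saved). A minimal sketch of the relevant keys in that file, with placeholder values:

```ini
[default]
access_key = XXXXXXXXXXXXXX
secret_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
gpg_passphrase = XXXXX
gpg_command = /usr/bin/gpg
use_https = True
```

If you ever rotate your AWS keys, you can edit this file directly instead of re-running `s3cmd --configure`.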

2. Configure cPanel's built-in backup tool so that the data lives under /backup/cpbackup/, with daily backups in daily/

See the official documentation: http://docs.cpanel.net/twiki/bin/view/AllDocumentation/WHMDocs/ConfigBackup

Then create a directory for the logs:

mkdir /var/log/backuplogs

3. Create the backup script, saving it, for example, as /root/dailybackup.sh

#!/bin/bash

##Notification email address
_EMAIL=your_email@domain.com

ERRORLOG=/var/log/backuplogs/backup.err`date +%F`
ACTIVITYLOG=/var/log/backuplogs/activity.log`date +%F`

## Directory to back up
SOURCE=/backup/cpbackup/daily
## S3 bucket name
BUCKETNAME=Backup_daily
## Folder (prefix) inside the bucket, named after today's date
DESTINATION=`date +%F`

## Retention: files older than DEGREE days are automatically deleted from S3
DEGREE=3

# Clear the logs if the script is executed more than once on the same day
:> ${ERRORLOG}
:> ${ACTIVITYLOG}

## Upload the daily backup to Amazon S3
/usr/bin/s3cmd -r put ${SOURCE} s3://${BUCKETNAME}/${DESTINATION}/ 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
ret2=$?

## Send an email alert
msg="BACKUP NOTIFICATION ALERT FROM `hostname`"

if [ $ret2 -eq 0 ]; then
  msg1="Amazon s3 Backup Uploaded Successfully"
else
  msg1="Amazon s3 Backup Failed!!\n Check ${ERRORLOG} for more details"
fi
echo -e "$msg1"|mail -s "$msg" ${_EMAIL}

#######################
## Delete backups older than DEGREE days from Amazon S3
#######################
DELETENAME=$(date  --date="${DEGREE} days ago" +%F)

/usr/bin/s3cmd -r --force del s3://${BUCKETNAME}/${DELETENAME} 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
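The deletion step only works because the upload prefix (`date +%F`) and the purge target (`date --date="N days ago" +%F`) use the same date format. A quick sanity check of that retention math, assuming GNU date:

```shell
# Verify the retention math used by the script above (GNU date assumed).
DEGREE=3
TODAY=$(date +%F)                                  # today's upload prefix
DELETENAME=$(date --date="${DEGREE} days ago" +%F) # prefix purged today
echo "upload prefix: ${TODAY}"
echo "purge  prefix: ${DELETENAME}"
# Round trip: the purged prefix plus DEGREE days lands back on today.
[ "$(date --date="${DELETENAME} +${DEGREE} days" +%F)" = "${TODAY}" ] && echo "retention math OK"
```

In other words, the prefix uploaded three days ago is exactly the one deleted today, so S3 always holds the most recent DEGREE days of backups.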

4. Make the script executable

chmod u+x /root/dailybackup.sh

5. Run it automatically with crontab

crontab -e

# Add the following line; it runs the script every two days at midnight, server time
0 0 */2 * * /bin/bash /root/dailybackup.sh

Once done, you can run sh /root/dailybackup.sh to check that everything works.

6. Retrieving the data

# Replace Yourbuckets with the name of your bucket; the path after it is the prefix you want, so adjust it to your needs
# yourestore is the directory on the server where the retrieved data will be stored
s3cmd -r get s3://Yourbuckets/2012-08-20  yourestore
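If you want to pull one day's backup set down and see which accounts it contains, a small helper can wrap the `s3cmd get` call. This is a sketch, not part of the original script: `restore_day`, the bucket name, and the paths are all placeholders, and it assumes each cPanel account is stored as its own `<user>.tar.gz` under daily/:

```shell
#!/bin/bash
# Hypothetical helper: fetch one day's backups from S3, then list the
# per-account tarballs that were retrieved.
# Set DRYRUN=1 to skip the s3cmd download (useful for a local dry run).
restore_day() {
  local bucket="$1" day="$2" dest="$3"
  mkdir -p "${dest}"
  if [ "${DRYRUN:-0}" != "1" ]; then
    s3cmd -r get "s3://${bucket}/${day}" "${dest}"
  fi
  # Each cPanel account backup arrives as its own <user>.tar.gz
  find "${dest}" -name '*.tar.gz'
}

# Example (placeholders): restore_day Yourbuckets 2012-08-20 /root/yourestore
```

From there a single account can be unpacked with `tar -xzf <user>.tar.gz` into a directory of your choice.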
