Backups are one of the most basic day-to-day tasks for us ops engineers, and doing them well is an important part of keeping operations stable. Below are two simple backup scripts I have used:
1) Website data backup
Back up the website data under /var/www/vhosts/www.kevin.com and /var/www/vhosts/www.grace.com to /Data/code-backup/www.kevin.com and /Data/code-backup/www.grace.com respectively.
[root@huanqiu_web5 code-backup]# cat web_code_backup.sh
#!/bin/bash
# Back up the website data
/bin/tar -zvcf /Data/code-backup/www.kevin.com/www.kevin.com_`date +%Y%m%d_%H%M%S`.tar.gz /var/www/vhosts/www.kevin.com
/bin/tar -zvcf /Data/code-backup/www.grace.com/www.grace.com_`date +%Y%m%d_%H%M%S`.tar.gz /var/www/vhosts/www.grace.com
# Delete backup files older than one week
find /Data/code-backup/www.kevin.com -type f -mtime +7 -exec rm -f {} \;
find /Data/code-backup/www.grace.com -type f -mtime +7 -exec rm -f {} \;

[root@huanqiu_web5 ~]# crontab -l
# Back up the website data at 05:00 every day
0 5 * * * /bin/bash -x /Data/code-backup/web_code_backup.sh > /dev/null 2>&1

The backups look like this afterwards:

[root@huanqiu_web5 ~]# ls /Data/code-backup/www.kevin.com/
www.kevin.com_20170322_174328.tar.gz
[root@xqsj_web5 ~]# ls /Data/code-backup/www.grace.com/
www.grace.com_20170322_174409.tar.gz
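If a site ever needs to be rolled back from one of these archives, a minimal restore sketch looks like the following (the archive name is taken from the listing above; tar strips the leading / when creating the archive, so extracting from / puts the tree back in place):

# Run on the web server; restores var/www/vhosts/www.kevin.com/... relative to /
cd /
/bin/tar -zxvf /Data/code-backup/www.kevin.com/www.kevin.com_20170322_174328.tar.gz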
2) Database backup (automatically deleting backup files older than 10 days)
The database service is Alibaba Cloud MySQL; scheduled full backups are taken remotely and stored locally, just in case. Data pulled from a remote MySQL backup is best packed and compressed.
[root@huanqiuPC crontab]# pwd
/Data/Mysql_Bakup/crontab

[root@huanqiuPC crontab]# cat backup_db_wangshibo.sh
#!/bin/bash
MYSQL="/usr/bin/mysql"
MYSQLDUMP="/usr/bin/mysqldump"
BACKUP_DIR="/Data/Mysql_Bakup"
#DB_SOCKET="/var/lib/mysql/mysql.sock"
DB_hostname="110.120.11.9"
DBNAME="wangshibo"
DB_USER="db_wangshibo"
DB_PASS="mhxzk3rfzh"
TIME=`date +%Y%m%d%H%M%S`
LOCK_FILE="${BACKUP_DIR}/lock_file.tmp"
BKUP_LOG="/Data/Mysql_Backup/${TIME}_bkup.log"
DEL_BAK=`date -d '10 days ago' '+%Y%m%d'`

## Check the lock file; exit if a previous run is still in progress
if [[ -f $LOCK_FILE ]];then
    exit 255
else
    echo $$ > $LOCK_FILE
fi

## Dump the database ##
echo ${TIME} >> ${BKUP_LOG}
echo "=======Start Bakup============" >> ${BKUP_LOG}
#${MYSQLDUMP} -h ${DB_hostname} -u${DB_USER} -p${DB_PASS} --databases ${DBNAME} | gzip -9 > ${BACKUP_DIR}/${TIME}.${DBNAME}.gz
${MYSQLDUMP} -h ${DB_hostname} -u${DB_USER} -p${DB_PASS} --databases ${DBNAME} | gzip -9 > ${BACKUP_DIR}/${TIME}.${DBNAME}.gz
echo "=======Finished Bakup============" >> ${BKUP_LOG}
/bin/rm -f ${LOCK_FILE}

## Delete backups made more than 10 days ago ##
/bin/rm -f ${BACKUP_DIR}/${DEL_BAK}*.gz
Schedule the database backup with cron:
[root@huanqiuPC Mysql_Bakup]# crontab -l
10 0,6,12,18 * * * /bin/bash /Data/Mysql_Bakup/crontab/backup_db_wangshibo.sh >/dev/null 2>&1
After the script runs, the backup looks like this:
[root@huanqiuPC crontab]# cd /Data/Mysql_Bakup
[root@huanqiuPC Mysql_Bakup]# ls
20161202061001.wangshibo.gz
Syncing the production database to the beta-environment database (overwriting the beta database):
Copy the scheduled backup package above to the beta machine, unpack it, log in to MySQL, and overwrite the beta data manually with the source command.
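A minimal sketch of that manual restore on the beta machine (the login host/user and the file path are illustrative; the dump was made with --databases, so it already carries the CREATE DATABASE/USE statements):

# Copy the gzipped dump to the beta machine first, then:
gunzip 20161202061001.wangshibo.gz          # produces the plain SQL file 20161202061001.wangshibo
mysql -h127.0.0.1 -udb_wangshibo -p         # log in to the beta MySQL instance
mysql> source /path/to/20161202061001.wangshibo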
Here is another example:
[root@backup online_bak]# cat rsync.sh      (the rsync transfers in this script are rate-limited to 3 MB/s, and roughly the last month of backups is kept)
#!/bin/bash

# ehr data backup----------------------------------------------------------
cd /data/bak/online_bak/192.168.34.27/tomcat_data/
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.27:/data/tomcat7/webapps /data/bak/online_bak/192.168.34.27/tomcat_data/`date +%Y%m%d`
/bin/tar -zvcf `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`
cd /data/bak/online_bak/192.168.34.27/tomcat_data/
NUM1=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I1=$( /usr/bin/expr $NUM1 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I1 p"|xargs rm -rf

# zp data backup----------------------------------------------------------
cd /data/bak/online_bak/192.168.34.33/tomcat_data/
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.33:/data/tomcat8/webapps /data/bak/online_bak/192.168.34.33/tomcat_data/`date +%Y%m%d`
/bin/tar -zvcf `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`
cd /data/bak/online_bak/192.168.34.33/tomcat_data/
NUM2=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I2=$( /usr/bin/expr $NUM2 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I2 p"|xargs rm -rf

cd /data/bak/online_bak/192.168.34.33/upload
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.33:/home/zrx_hr/upload /data/bak/online_bak/192.168.34.33/upload/`date +%Y%m%d`
/bin/tar -zvcf `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`
cd /data/bak/online_bak/192.168.34.33/upload
NUM3=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I3=$( /usr/bin/expr $NUM3 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I3 p"|xargs rm -rf

# zabbix mysql backup----------------------------------------------------------
/bin/mkdir /data/bak/online_bak/192.168.16.21/mysql_data/`date +%Y%m%d`
/data/mysql/bin/mysqldump -hlocalhost -uroot -pBKJK-@@@-12345 --databases zabbix > /data/bak/online_bak/192.168.16.21/mysql_data/`date +%Y%m%d`/zabbix.sql
cd /data/bak/online_bak/192.168.16.21/mysql_data/
/bin/tar -zvcf `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`
cd /data/bak/online_bak/192.168.16.21/mysql_data/
NUM4=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I4=$( /usr/bin/expr $NUM4 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I4 p"|xargs rm -rf

[root@backup online_bak]# pwd
/data/bak/online_bak
[root@backup online_bak]# ls
192.168.16.21  192.168.34.27  192.168.34.33  rsync.sh
[root@backup online_bak]# ll
total 10K
drwxr-xr-x 3 root root   23 Aug 19 17:47 192.168.16.21
drwxr-xr-x 4 root root   41 Aug 19 18:30 192.168.34.27
drwxr-xr-x 4 root root   37 Aug 19 18:17 192.168.34.33
-rwxr-xr-x 1 root root 6.3K Aug 19 19:20 rsync.sh
[root@backup online_bak]# ll 192.168.16.21/
total 4.0K
drwxr-xr-x 2 root root 28 Aug 19 19:43 mysql_data
[root@backup online_bak]# ll 192.168.16.21/mysql_data/
total 1.5G
-rw-r--r-- 1 root root 1.5G Aug 19 19:43 20170819.tar.gz
[root@backup online_bak]# ll 192.168.34.27
total 4.0K
drwxr-xr-x 2 root root 4.0K Aug 19 19:26 tomcat_data
[root@backup online_bak]# ll 192.168.34.27/tomcat_data/
total 3.9G
......
-rw-r--r-- 1 root root 140M Aug 19 11:06 20170818.tar.gz
-rw-r--r-- 1 root root 140M Aug 19 19:26 20170819.tar.gz
[root@backup online_bak]# ll 192.168.34.33
total 8.0K
drwxr-xr-x 2 root root 4.0K Aug 19 19:26 tomcat_data
drwxr-xr-x 2 root root   28 Aug 19 19:30 upload
[root@backup online_bak]# crontab -l
# online backup
0 2 * * * /bin/bash -x /data/bak/online_bak/rsync.sh > /dev/null 2>&1
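All four sections of rsync.sh repeat the same pattern: rsync a remote directory into a dated folder, tar it up, delete the folder, then prune down to the newest 30 archives. Below is a sketch of one way to factor that pattern into a reusable function; it is only an illustration, and the function name and the newest-first pruning are my own choices, not part of the original script:

#!/bin/bash
# backup_dir <remote_src> <local_base>: pull a remote directory, archive it by date,
# and keep only the newest 30 dated tarballs under <local_base>.
backup_dir() {
    local remote_src="$1" local_base="$2" today=`date +%Y%m%d`
    cd "$local_base" || return 1
    /usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 "$remote_src" "$local_base/$today"
    tar -zcf "$today.tar.gz" "$today" && rm -rf "$today"
    # list archives newest first and delete everything after the 30th
    ls -t ./*.tar.gz 2>/dev/null | sed -n '31,$p' | xargs -r rm -f
}

backup_dir 192.168.34.27:/data/tomcat7/webapps /data/bak/online_bak/192.168.34.27/tomcat_data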
Within a directory, sort files/directories by modification time and pick out the most recently modified one:

[work@qd-op-comm01 xcspam]$ ls
bin                     xcspam-20170802145542  xcspam-20170807204545  xcspam-20170814115753  xcspam-20170818115806  xcspam-20170824162641  xcspam-20170831173616
xcspam                  xcspam-20170802194447  xcspam-20170808163425  xcspam-20170815191150  xcspam-20170821122949  xcspam-20170824165020  xcspam-20170831191347
xcspam-20170731154018   xcspam-20170803113809  xcspam-20170808195340  xcspam-20170815210032  xcspam-20170821153300  xcspam-20170829100941  xcspam-20170904105109
xcspam-20170801190647   xcspam-20170807150022  xcspam-20170809103648  xcspam-20170816141022  xcspam-20170822173600  xcspam-20170831135623  xcspam-20170911120519
xcspam-20170802142921   xcspam-20170807164137  xcspam-20170809111246  xcspam-20170816190704  xcspam-20170823101913  xcspam-20170831160115  xcspam-20170911195802

[work@qd-op-comm01 xcspam]$ ls -rtd xcspam* |tail -1
xcspam-20170911195802
[work@qd-op-comm01 xcspam]$ ls -rtd xcspam* |tail -2|head -1     // the second most recently modified entry
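This is handy when a script needs to pick up the newest release or backup by itself; a small sketch (the variable name is only for illustration):

# Capture the most recently modified xcspam entry in a variable
LATEST=`ls -td xcspam* | head -1`     # equivalent to: ls -rtd xcspam* | tail -1
echo "latest: $LATEST"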
Automatically delete backups from more than 30 days ago, i.e. keep only the most recent 30 backups. The script below can be used as a general-purpose cleanup script:
[root@qw-backup01 caiwu]# cat delete_30days_before.sh
#!/bin/bash
cd `pwd`
NUM=`ls -l|awk '{print $9}'|wc -l`
I=$( /usr/bin/expr $NUM - 31 )
ls -l|awk '{print $9}'|sed -n "1,$I p"|xargs rm -rf

[root@qw-backup01 caiwu]# ls
201901100.des3  20190141.des3  20190150.des3  20190159.des3  20190168.des3  20190177.des3  20190186.des3  20190195.des3
20190133.des3   20190142.des3  20190151.des3  20190160.des3  20190169.des3  20190178.des3  20190187.des3  20190196.des3
20190134.des3   20190143.des3  20190152.des3  20190161.des3  20190170.des3  20190179.des3  20190188.des3  20190197.des3
20190135.des3   20190144.des3  20190153.des3  20190162.des3  20190171.des3  20190180.des3  20190189.des3  20190198.des3
20190136.des3   20190145.des3  20190154.des3  20190163.des3  20190172.des3  20190181.des3  20190190.des3  20190199.des3
20190137.des3   20190146.des3  20190155.des3  20190164.des3  20190173.des3  20190182.des3  20190191.des3  delete_30days_before.sh
20190138.des3   20190147.des3  20190156.des3  20190165.des3  20190174.des3  20190183.des3  20190192.des3
20190139.des3   20190148.des3  20190157.des3  20190166.des3  20190175.des3  20190184.des3  20190193.des3
20190140.des3   20190149.des3  20190158.des3  20190167.des3  20190176.des3  20190185.des3  20190194.des3

Run the script:

[root@qw-backup01 caiwu]# sh -x delete_30days_before.sh
+ cd /data/backup/caiwu
++ ls -l
++ awk '{print $9}'
++ wc -l
+ NUM=70
++ /usr/bin/expr 70 - 31
+ I=39
+ ls -l
+ awk '{print $9}'
+ sed -n '1,39 p'
+ xargs rm -rf

Check again: only the backups from the last 30 days are kept.

[root@qw-backup01 caiwu]# ls
20190170.des3  20190174.des3  20190178.des3  20190182.des3  20190186.des3  20190190.des3  20190194.des3  20190198.des3
20190171.des3  20190175.des3  20190179.des3  20190183.des3  20190187.des3  20190191.des3  20190195.des3  20190199.des3
20190172.des3  20190176.des3  20190180.des3  20190184.des3  20190188.des3  20190192.des3  20190196.des3  delete_30days_before.sh
20190173.des3  20190177.des3  20190181.des3  20190185.des3  20190189.des3  20190193.des3  20190197.des3
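Note that this script deletes by name order (`ls -l | awk '{print $9}'` lists entries alphabetically), which only matches file age when the backups carry date-stamped names like the ones above. An alternative sketch that keys on the actual modification time instead (the directory and the .des3 suffix are taken from the example above):

# Keep only the 30 most recently modified .des3 backups, whatever they are named
cd /data/backup/caiwu || exit 1
ls -t *.des3 2>/dev/null | sed -n '31,$p' | xargs -r rm -f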