1. Run http://XXX.XXX.XXX.XXX:9200/_flush in a browser to make sure the index data has been flushed to disk (a curl equivalent is sketched after this list).
2. Back up the original data, mainly the nodes directory under the elasticsearch data directory; nodes is the index data directory.
3. Copy the data directory of every elasticsearch node in the original cluster into the data directory of the new elasticsearch installation.
4. Use snapshots to back up and restore.
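For step 1, the same flush can also be triggered from the command line instead of a browser. A minimal sketch, assuming the cluster is reachable on port 9200 (replace XXX.XXX.XXX.XXX with a real node address):

--flush all indices before copying data files--
curl -XPOST "http://XXX.XXX.XXX.XXX:9200/_flush?pretty"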
Below are the backup and restore scripts. Save them as esback.sh and esrestore.sh respectively, and grant execute permission with chmod 777 esback.sh (do the same for esrestore.sh).
The scripts are as follows:
-----automatically back up elasticsearch data and compress it---
#!/bin/bash
# Snapshot name: current date and hour, e.g. 2016010112 for 2016-01-01 12:00
filename=`date +%Y%m%d%H`
backesFile=es$filename.tar.gz
cd /home/elasticsearch/back
mkdir -p es_dump
cd es_dump
# Remove any previous snapshot with the same name (a 404 here is harmless)
curl -XDELETE "192.168.1.7:9200/_snapshot/backup/$filename?pretty"
echo 'sleep 30'
sleep 30
# Take the snapshot; the URL is quoted so the & is not treated as a shell operator
curl -XPUT "192.168.1.7:9200/_snapshot/backup/$filename?wait_for_completion=true&pretty"
echo 'sleep 30'
sleep 30
# Copy the snapshot repository contents and pack them into a dated archive
cp -rf /home/elasticsearch/snapshot/* /home/elasticsearch/back/es_dump
cd ..
tar czf $backesFile es_dump/
rm -rf es_dump
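To take the backup automatically, the script can be scheduled with cron. A minimal crontab sketch, assuming esback.sh was saved under /home/elasticsearch/back (the path is an assumption; point it at wherever the script actually lives):

--example crontab entry: back up every day at 02:00--
0 2 * * * /home/elasticsearch/back/esback.sh >> /home/elasticsearch/back/esback.log 2>&1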
-----automatically extract and restore elasticsearch data---
#!/bin/bash
# Name of the snapshot to restore; replace the placeholder with the
# date-hour string that the backup script used as $filename
filename='XXXXXXX'
backesFile=es$filename.tar.gz
cd /home/elasticsearch/back
tar zxvf $backesFile
# Replace the snapshot repository contents with the unpacked backup
rm -rf /home/elasticsearch/snapshot/*
cp -rf /home/elasticsearch/back/es_dump/* /home/elasticsearch/snapshot
# Close the indices before restoring into them
curl -XPOST 192.168.1.7:9200/users/_close
curl -XPOST 192.168.1.7:9200/products/_close
echo 'sleep 5'
sleep 5
# Restore each index from the snapshot
curl -XPOST "192.168.1.7:9200/_snapshot/backup/$filename/_restore?pretty" -d '{
"indices":"users"
}'
echo 'sleep 5'
sleep 5
curl -XPOST "192.168.1.7:9200/_snapshot/backup/$filename/_restore?pretty" -d '{
"indices":"products"
}'
echo 'sleep 5'
sleep 5
# Reopen the indices after the restore calls
curl -XPOST 192.168.1.7:9200/users/_open
curl -XPOST 192.168.1.7:9200/products/_open
rm -rf es_dump
---end---
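After the restore script finishes, it is worth confirming that the indices are back and open. A quick check with the cat APIs (same host as in the scripts above):

--verify the restored indices--
curl "192.168.1.7:9200/_cat/indices/users,products?v"
curl "192.168.1.7:9200/_cat/recovery/users?v"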
The backup script has a few prerequisites:
1. Create the snapshot repository first.
--create the snapshot repository backup--
curl -XPUT 192.168.1.7:9200/_snapshot/backup -d '
{
"type":"fs",
"settings":{"location":"/home/elasticsearch/snapshot"}
}'
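Note that on more recent Elasticsearch versions the filesystem location also has to be whitelisted in elasticsearch.yml before the repository can be registered, and the node must be restarted after the change; otherwise the PUT above is rejected. A minimal sketch (the exact version threshold depends on your cluster):

--elasticsearch.yml--
path.repo: ["/home/elasticsearch/snapshot"]

--verify that the repository was registered--
curl "192.168.1.7:9200/_snapshot/backup?pretty"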
2. The directory /home/elasticsearch/snapshot must exist and the elasticsearch process must have permission to write to it (a preparation sketch for both directories follows this list).
3. The backup directory /home/elasticsearch/back must also be created in advance.
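A short preparation sketch for these two directories, assuming Elasticsearch runs as a user named elasticsearch (adjust the owner to the account your cluster actually uses):

--prepare the snapshot and backup directories--
mkdir -p /home/elasticsearch/snapshot /home/elasticsearch/back
chown -R elasticsearch:elasticsearch /home/elasticsearch/snapshot /home/elasticsearch/back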
The restore script restores the indices one at a time; this can be changed to whatever approach you need (see the sketch below for alternatives).
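For example, the two per-index _restore calls could be collapsed into one request, or an index could be restored under a new name so the original is not overwritten. A hedged sketch using the standard restore body options (indices, rename_pattern, rename_replacement), reusing the $filename snapshot name from the restore script; on Elasticsearch 6 and later these -d requests also need -H 'Content-Type: application/json':

--restore both indices in a single request--
curl -XPOST "192.168.1.7:9200/_snapshot/backup/$filename/_restore?pretty" -d '{
"indices":"users,products"
}'

--restore users under a new name, e.g. restored_users--
curl -XPOST "192.168.1.7:9200/_snapshot/backup/$filename/_restore?pretty" -d '{
"indices":"users",
"rename_pattern":"(.+)",
"rename_replacement":"restored_$1"
}'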