Backing Up Elasticsearch Data to HDFS

1. Have HDFS ready (here I am testing on my local machine)

2. Install the repository-hdfs plugin on Elasticsearch

(If ES runs as a multi-node cluster, the plugin must be installed on every node.)

elasticsearch-plugin install repository-hdfs

3. Restart ES

4. Create the snapshot repository

PUT /_snapshot/backup_hdfs

{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://localhost:8020/",
    "path": "elasticsearch/repositories/my_hdfs_repository",
    "conf.dfs.client.read.shortcircuit": "true"
  }
}
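For scripted setups, the same registration call can be issued from Python using only the standard library. This is a minimal sketch: the endpoint localhost:9200 is an assumption (adjust for your cluster), and the request is only built here, not sent, since sending requires a running cluster.

```python
import json
import urllib.request

ES = "http://localhost:9200"  # assumed ES HTTP endpoint

# Repository settings, mirroring the console request above.
repo_body = {
    "type": "hdfs",
    "settings": {
        "uri": "hdfs://localhost:8020/",
        "path": "elasticsearch/repositories/my_hdfs_repository",
        "conf.dfs.client.read.shortcircuit": "true",
    },
}

# Build the PUT /_snapshot/backup_hdfs request (not sent here).
req = urllib.request.Request(
    ES + "/_snapshot/backup_hdfs",
    data=json.dumps(repo_body).encode("utf-8"),
    method="PUT",
    headers={"Content-Type": "application/json"},
)
# With a live cluster you would run: urllib.request.urlopen(req)
print(req.get_method(), req.full_url)
```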

5. List the snapshot repositories

GET /_snapshot/_all

6. Create a snapshot

PUT /_snapshot/backup_hdfs/snapshot_1

{
  "indices": "logstash-2018-08-08,index_2",
  "ignore_unavailable": true,
  "include_global_state": false
}

Note: if "indices" is not set, all indices are backed up by default. (JSON does not allow inline comments, so the note is kept outside the request body.)
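By default the create-snapshot call returns immediately and the snapshot runs in the background; adding the query parameter wait_for_completion=true makes the call block until the snapshot finishes, which is handy in cron jobs. A sketch building that request (endpoint is an assumption; the request is built but not sent):

```python
import json
import urllib.request

ES = "http://localhost:9200"  # assumed ES HTTP endpoint

snapshot_body = {
    "indices": "logstash-2018-08-08,index_2",
    "ignore_unavailable": True,
    "include_global_state": False,
}

# wait_for_completion=true blocks until the snapshot is done.
req = urllib.request.Request(
    ES + "/_snapshot/backup_hdfs/snapshot_1?wait_for_completion=true",
    data=json.dumps(snapshot_body).encode("utf-8"),
    method="PUT",
    headers={"Content-Type": "application/json"},
)
print(req.get_method(), req.full_url)
```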

7. Restore a snapshot

POST /_snapshot/backup_hdfs/snapshot_1/_restore

{
  "indices": "zhangmingli",
  "ignore_unavailable": true,
  "include_global_state": false
}

Notes on the parameters:
- "indices": restore only the named indices; if omitted, all indices in the snapshot are restored.
- "ignore_unavailable": ignore indices that are missing from the snapshot instead of failing the restore.
- "include_global_state": whether to also restore cluster-wide state (index templates, persistent settings); false means it is not restored.
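One practical caveat: Elasticsearch will not restore over an index that is currently open, so an existing index with the same name must be closed (or deleted) before the restore. A sketch of the close-then-restore pair, again only building the requests (localhost:9200 is an assumed endpoint):

```python
import json
import urllib.request

ES = "http://localhost:9200"  # assumed ES HTTP endpoint

# Close the existing index so the restore can overwrite it.
close_req = urllib.request.Request(
    ES + "/zhangmingli/_close", data=b"", method="POST"
)

restore_body = {
    "indices": "zhangmingli",
    "ignore_unavailable": True,
    "include_global_state": False,
}
restore_req = urllib.request.Request(
    ES + "/_snapshot/backup_hdfs/snapshot_1/_restore",
    data=json.dumps(restore_body).encode("utf-8"),
    method="POST",
    headers={"Content-Type": "application/json"},
)
print(close_req.full_url)
print(restore_req.full_url)
```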

8. Delete a snapshot

DELETE /_snapshot/backup_hdfs/snapshot_1

Corrections and discussion are welcome; let's improve together! If this helped you, please click recommend~~
Original post: https://www.cnblogs.com/gaoyawei/p/9437482.html