
MongoDB Backup and Restore in Detail


 

Quick overview

MongoDB export and import

1. Import/export can target either the local mongodb server or a remote one,
so both tools share the following general options (see the example after this list):
-h host        host
--port port    port
-u username    username
-p passwd      password
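For example, a rough sketch of exporting from a remote server (the host, port, and credentials below are made-up placeholders, not from the original notes; depending on how the user was created, --authenticationDatabase may also be needed):

mongoexport -h 192.168.1.100 --port 27017 -u backupuser -p mypassword -d test -c stu -o ./stu.json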


2. mongoexport exports data as a JSON file.
Question: which database, which collection, which fields, which rows?

-d    database name
-c    collection (table) name
-f field1,field2,...    field names
-q    query condition
-o    output file name
--csv    export in CSV format (convenient for exchanging data with traditional relational databases)

Example:
[root@localhost mongodb]# ./bin/mongoexport -d test -c news -o test.json
connected to: 127.0.0.1
exported 3 records
[root@localhost mongodb]# ls
bin dump GNU-AGPL-3.0 README test.json THIRD-PARTY-NOTICES
[root@localhost mongodb]# more test.json
{ "_id" : { "$oid" : "51fc59c9fecc28d8316cfc03" }, "title" : "aaaa" }
{ "_id" : { "$oid" : "51fcaa3c5eed52c903a91837" }, "title" : "today is sataday" }
{ "_id" : { "$oid" : "51fcaa445eed52c903a91838" }, "title" : "ok now" }


Example 2: export only the goods_id and goods_name fields
./bin/mongoexport -d test -c goods -f goods_id,goods_name -o goods.json

Example 3: export only the rows whose shop_price is below 200
./bin/mongoexport -d test -c goods -f goods_id,goods_name,shop_price -q '{shop_price:{$lt:200}}' -o goods.json

Note: the _id field is always exported.

mongoimport: importing data

-d    target database
-c    target collection (created automatically if it does not exist)
--type    csv/json (json is the default)
--file    path to the backup file

Example 1: import JSON
./bin/mongoimport -d test -c goods --file ./goodsall.json

Example 2: import CSV
./bin/mongoimport -d test -c goods --type csv -f goods_id,goods_name --file ./goodsall.csv

./bin/mongoimport -d test -c goods --type csv --headerline --file ./goodsall.csv


mongodump exports the data and its index information in binary BSON format
-d    database name
-c    collection name
-o    output directory (defaults to ./dump)

Example:
mongodump -d test [-c collection]    # by default the output is written to a dump directory under the current working directory

Patterns:
1. The exported files are placed in a directory named after the database.
2. Each collection produces two files: a BSON data file and a JSON file with the index information.
3. If no collection name is given, all collections are exported.


mongorestore imports the binary files
Example:
./bin/mongorestore -d test --directoryperdb dump/test/   (the backup directory produced by mongodump; 3.0+ tools reject this flag, so use --dir instead, as shown in the detailed walkthrough below)

A binary backup captures not only the data but also the indexes,
and the backup files are relatively small.
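
A quick sketch, not from the original notes: a single collection can usually be restored from its .bson file by naming the target database and collection explicitly, e.g.

./bin/mongorestore -d test -c tea dump/test/tea.bson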

Detailed walkthrough

Checking the environment

[mongod@mcw01 ~]$ ps -ef|grep mongo
root      16595  16566  0 10:57 pts/0    00:00:00 su - mongod
mongod    16596  16595  0 10:57 pts/0    00:00:00 -bash
mongod    16683      1  1 12:06 ?        00:01:20 mongod --dbpath=/mongodb/data --logpath=/mongodb/log/mongodb.log --port=27017 --logappend --fork --auth
mongod    16758  16596  0 13:27 pts/0    00:00:00 ps -ef
mongod    16759  16596  0 13:27 pts/0    00:00:00 grep --color=auto mongo
[mongod@mcw01 ~]$ kill -2 16683
[mongod@mcw01 ~]$ ps -ef|grep mongo
root      16595  16566  0 10:57 pts/0    00:00:00 su - mongod
mongod    16596  16595  0 10:57 pts/0    00:00:00 -bash
mongod    16761  16596  0 13:27 pts/0    00:00:00 ps -ef
mongod    16762  16596  0 13:27 pts/0    00:00:00 grep --color=auto mongo
[mongod@mcw01 ~]$  mongod --dbpath=/mongodb/data --logpath=/mongodb/log/mongodb.log --port=27017 --logappend --fork
about to fork child process, waiting until server is ready for connections.
forked process: 16765
child process started successfully, parent exiting
[mongod@mcw01 ~]$ ps -ef|grep mongo
root      16595  16566  0 10:57 pts/0    00:00:00 su - mongod
mongod    16596  16595  0 10:57 pts/0    00:00:00 -bash
mongod    16765      1 11 13:28 ?        00:00:01 mongod --dbpath=/mongodb/data --logpath=/mongodb/log/mongodb.log --port=27017 --logappend --fork
mongod    16782  16596  0 13:28 pts/0    00:00:00 ps -ef
mongod    16783  16596  0 13:28 pts/0    00:00:00 grep --color=auto mongo
[mongod@mcw01 ~]$ mongo
MongoDB shell version: 3.2.8
connecting to: test
Server has startup warnings:
2022-03-04T13:28:03.832+0800 I CONTROL  [initandlisten]
2022-03-04T13:28:03.832+0800 I CONTROL  [initandlisten] ** WARNING: /sys/kernel/mm/transparent_hugepage/enabled is 'always'.
2022-03-04T13:28:03.832+0800 I CONTROL  [initandlisten] **        We suggest setting it to 'never'
2022-03-04T13:28:03.832+0800 I CONTROL  [initandlisten]
2022-03-04T13:28:03.832+0800 I CONTROL  [initandlisten] ** WARNING: /sys/kernel/mm/transparent_hugepage/defrag is 'always'.
2022-03-04T13:28:03.832+0800 I CONTROL  [initandlisten] **        We suggest setting it to 'never'
2022-03-04T13:28:03.832+0800 I CONTROL  [initandlisten]
2022-03-04T13:28:03.832+0800 I CONTROL  [initandlisten] ** WARNING: soft rlimits too low. rlimits set to 4096 processes, 65535 files. Number of processes should be at least 32767.5 : 0.5 times number of files.
2022-03-04T13:28:03.832+0800 I CONTROL  [initandlisten]
> use test;
switched to db test
> show tables;
bar
foo
shop
stu
tea
> db.stu.find();
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc079"), "sn" : 1, "name" : "student1" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07a"), "sn" : 2, "name" : "student2" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07b"), "sn" : 3, "name" : "student3" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07c"), "sn" : 4, "name" : "student4" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07d"), "sn" : 5, "name" : "student5" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07e"), "sn" : 6, "name" : "student6" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07f"), "sn" : 7, "name" : "student7" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc080"), "sn" : 8, "name" : "student8" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc081"), "sn" : 9, "name" : "student9" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc082"), "sn" : 10, "name" : "student10" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc083"), "sn" : 11, "name" : "student11" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc084"), "sn" : 12, "name" : "student12" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc085"), "sn" : 13, "name" : "student13" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc086"), "sn" : 14, "name" : "student14" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc087"), "sn" : 15, "name" : "student15" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc088"), "sn" : 16, "name" : "student16" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc089"), "sn" : 17, "name" : "student17" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc08a"), "sn" : 18, "name" : "student18" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc08b"), "sn" : 19, "name" : "student19" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc08c"), "sn" : 20, "name" : "student20" }
Type "it" for more
> db.stu.find().count();
1000
>

Exporting to JSON

[mongod@mcw01 ~]$ ls /mongodb/bin/
bsondump  mongod     mongoexport  mongoimport  mongoperf     mongos     mongotop
mongo     mongodump  mongofiles   mongooplog   mongorestore  mongostat
[mongod@mcw01 ~]$ #Export command: database test, collection stu, fields sn and name, a query condition (same syntax as db.stu.find(<query>)), output written to the file test.stu.json
[mongod@mcw01 ~]$ #Whatever the query matches is exactly what gets exported
[mongod@mcw01 ~]$ mongoexport  -d test -c stu -f sn,name -q '{sn:{$lte:1000}}' -o ./test.stu.json
2022-03-04T13:41:01.667+0800    connected to: localhost
2022-03-04T13:41:01.943+0800    exported 1000 records  #1000 records were exported
[mongod@mcw01 ~]$ ls
test.stu.json
[mongod@mcw01 ~]$ tail test.stu.json  #The output is JSON: each document is one JSON object on its own line
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc457"},"sn":991.0,"name":"student991"}
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc458"},"sn":992.0,"name":"student992"}
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc459"},"sn":993.0,"name":"student993"}
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc45a"},"sn":994.0,"name":"student994"}
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc45b"},"sn":995.0,"name":"student995"}
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc45c"},"sn":996.0,"name":"student996"}
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc45d"},"sn":997.0,"name":"student997"}
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc45e"},"sn":998.0,"name":"student998"}
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc45f"},"sn":999.0,"name":"student999"}
{"_id":{"$oid":"6220ff64ae6f9ca7b52cc460"},"sn":1000.0,"name":"student1000"}
[mongod@mcw01 ~]$

If the data needs to be loaded into MySQL, export it as CSV; in this format _id is not exported by default.

[mongod@mcw01 ~]$ mongoexport  -d test -c stu -f sn,name -q '{sn:{$lte:1000}}' --csv -o ./test.stu.csv
2022-03-04T13:59:47.086+0800    csv flag is deprecated; please use --type=csv instead
2022-03-04T13:59:47.088+0800    connected to: localhost
2022-03-04T13:59:47.252+0800    exported 1000 records
[mongod@mcw01 ~]$ ls
test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ tail test.stu.csv
991,student991
992,student992
993,student993
994,student994
995,student995
996,student996
997,student997
998,student998
999,student999
1000,student1000
[mongod@mcw01 ~]$
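
On the MySQL side this CSV could then be loaded roughly as follows. This is only a hedged sketch: it assumes a test database with a matching stu(sn, name) table already exists and that local_infile is enabled; none of this comes from the original session. The IGNORE 1 LINES clause skips the sn,name header line that mongoexport writes.

mysql -u root -p test -e "LOAD DATA LOCAL INFILE './test.stu.csv' INTO TABLE stu FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES (sn, name);"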

Importing JSON data

mongoimport -d test -c animal --type json --file ./test.stu.json

[mongod@mcw01 ~]$ ls
test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ mongo
MongoDB shell version: 3.2.8
connecting to: test
.......
> use test;
switched to db test
> show tables;  #note there is no animal collection yet
bar
foo
shop
stu
tea
>
bye
[mongod@mcw01 ~]$ ls
test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ #Import into database test, into the collection animal (which does not exist yet); the input type is json and --file points at the JSON file to import
[mongod@mcw01 ~]$
[mongod@mcw01 ~]$ mongoimport -d test -c animal --type json --file ./test.stu.json
2022-03-04T14:09:02.530+0800    connected to: localhost
2022-03-04T14:09:02.914+0800    imported 1000 documents
[mongod@mcw01 ~]$ mongo
MongoDB shell version: 3.2.8
connecting to: test
......
> use test;
switched to db test
> show tables;
animal
bar
foo
shop
stu
tea
> db.animal.find();  #the import succeeded
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc079"), "sn" : 1, "name" : "student1" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07a"), "sn" : 2, "name" : "student2" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07b"), "sn" : 3, "name" : "student3" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07c"), "sn" : 4, "name" : "student4" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07d"), "sn" : 5, "name" : "student5" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07e"), "sn" : 6, "name" : "student6" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc07f"), "sn" : 7, "name" : "student7" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc080"), "sn" : 8, "name" : "student8" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc081"), "sn" : 9, "name" : "student9" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc082"), "sn" : 10, "name" : "student10" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc083"), "sn" : 11, "name" : "student11" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc084"), "sn" : 12, "name" : "student12" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc085"), "sn" : 13, "name" : "student13" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc086"), "sn" : 14, "name" : "student14" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc087"), "sn" : 15, "name" : "student15" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc088"), "sn" : 16, "name" : "student16" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc089"), "sn" : 17, "name" : "student17" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc08a"), "sn" : 18, "name" : "student18" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc08b"), "sn" : 19, "name" : "student19" }
{ "_id" : ObjectId("6220ff62ae6f9ca7b52cc08c"), "sn" : 20, "name" : "student20" }
Type "it" for more
> db.animal.find().count();  #the imported count is correct
1000
>

Importing CSV data

A CSV export is row/column oriented, and the first line contains the field names.
[mongod@mcw01 ~]$ mongoexport  -d test -c stu -f sn,name -q '{sn:{$lte:10}}' --csv -o ./test.stu.csv  #export just ten rows
2022-03-04T14:17:42.807+0800    csv flag is deprecated; please use --type=csv instead
2022-03-04T14:17:42.814+0800    connected to: localhost
2022-03-04T14:17:42.818+0800    exported 10 records
[mongod@mcw01 ~]$ ls
test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ more test.stu.csv  #inspect the ten rows
sn,name
1,student1
2,student2
3,student3
4,student4
5,student5
6,student6
7,student7
8,student8
9,student9
10,student10
[mongod@mcw01 ~]$
[mongod@mcw01 ~]$ mongoimport -d test -c bird --type csv --file ./test.stu.csv  #type csv, but the import fails: fields or a header line must be specified
2022-03-04T14:23:28.638+0800    error validating settings: must specify --fields, --fieldFile or --headerline to import this file type
2022-03-04T14:23:28.638+0800    try 'mongoimport --help' for more information
[mongod@mcw01 ~]$
[mongod@mcw01 ~]$ mongoimport -d test -c bird --type csv -f sn,name --file ./test.stu.csv  #-f names the fields, but the imported count is wrong: 11 documents
2022-03-04T14:24:56.680+0800    connected to: localhost
2022-03-04T14:24:56.738+0800    imported 11 documents
[mongod@mcw01 ~]$
[mongod@mcw01 ~]$ mongo #Checking in the shell: the first line of the CSV was imported as data too, which we do not actually want.
.......
> use test
switched to db test
> db.bird.find();
{ "_id" : ObjectId("6221b0b8310ae65086fee5e7"), "sn" : "sn", "name" : "name" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5e8"), "sn" : 1, "name" : "student1" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5e9"), "sn" : 2, "name" : "student2" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5ea"), "sn" : 3, "name" : "student3" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5eb"), "sn" : 4, "name" : "student4" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5ec"), "sn" : 5, "name" : "student5" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5ed"), "sn" : 6, "name" : "student6" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5ee"), "sn" : 7, "name" : "student7" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5ef"), "sn" : 8, "name" : "student8" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5f0"), "sn" : 9, "name" : "student9" }
{ "_id" : ObjectId("6221b0b8310ae65086fee5f1"), "sn" : 10, "name" : "student10" }
>
> db.bird.drop() #drop the collection and re-import (see the note after this walkthrough)
true
> db.bird.find();
>
[mongod@mcw01 ~]$ mongoimport -d test -c bird --type csv -f sn,name --headerline --file ./test.stu.csv
2022-03-04T14:30:52.953+0800    error validating settings: incompatible options: --fields and --headerline
2022-03-04T14:30:52.953+0800    try 'mongoimport --help' for more information
[mongod@mcw01 ~]$ mongoimport --help  #probably a version difference: with --headerline you must not pass -f as well; the first line is used as the field list directly
--headerline                                use first line in input source as the field list (CSV
and TSV only)
[mongod@mcw01 ~]$ mongoimport -d test -c bird --type csv  --headerline --file ./test.stu.csv  #add the --headerline flag
2022-03-04T14:31:36.099+0800    connected to: localhost
2022-03-04T14:31:36.130+0800    imported 10 documents
[mongod@mcw01 ~]$ mongo
.........
> use test
switched to db test
> db.bird.find()
{ "_id" : ObjectId("6221b248310ae65086fee5f2"), "sn" : 1, "name" : "student1" }
{ "_id" : ObjectId("6221b248310ae65086fee5f3"), "sn" : 2, "name" : "student2" }
{ "_id" : ObjectId("6221b248310ae65086fee5f4"), "sn" : 3, "name" : "student3" }
{ "_id" : ObjectId("6221b248310ae65086fee5f5"), "sn" : 4, "name" : "student4" }
{ "_id" : ObjectId("6221b248310ae65086fee5f6"), "sn" : 5, "name" : "student5" }
{ "_id" : ObjectId("6221b248310ae65086fee5f7"), "sn" : 6, "name" : "student6" }
{ "_id" : ObjectId("6221b248310ae65086fee5f8"), "sn" : 7, "name" : "student7" }
{ "_id" : ObjectId("6221b248310ae65086fee5f9"), "sn" : 8, "name" : "student8" }
{ "_id" : ObjectId("6221b248310ae65086fee5fa"), "sn" : 9, "name" : "student9" }
{ "_id" : ObjectId("6221b248310ae65086fee5fb"), "sn" : 10, "name" : "student10" }
>
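
Rather than dropping the collection by hand before re-importing, mongoimport also has a --drop option that drops the target collection first. A brief sketch of the same re-import using it:

mongoimport -d test -c bird --type csv --headerline --drop --file ./test.stu.csv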

Binary export (mongodump)

mongodump exports BSON data. It is faster and well suited to backing up and restoring MongoDB itself, but the output is not human-readable and is not meant for loading into other databases. It also exports the indexes.

[mongod@mcw01 ~]$ ls
test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ mongodump -d test -c tea
2022-03-04T15:07:49.018+0800    writing test.tea to
2022-03-04T15:07:49.023+0800    done dumping test.tea (4 documents)
[mongod@mcw01 ~]$ ls
dump  test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ ls dump/  #with no path given, a dump directory is created in the current directory and the backup files are placed under it
test
[mongod@mcw01 ~]$ ls dump/test/
tea.bson  tea.metadata.json
[mongod@mcw01 ~]$
[mongod@mcw01 ~]$ tail dump/test/tea.metadata.json  #the index information was exported as well
{"options":{},"indexes":[{"v":1,"key":{"_id":1},"name":"_id_","ns":"test.tea"},{"v":1,"key":{"email":"hashed"},"name":"email_hashed","ns":"test.tea"}]}[mongod@mcw01 ~]$
[mongod@mcw01 ~]$ tail -2 dump/test/tea.bson #binary data, not human-readable
b@163.com+_idb!¢˲K«½©Ȃemail
c@163.com_idb!¦˲K«½©[mongod@mcw01 ~]$
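
If the BSON file ever needs to be read by eye, the bsondump tool (it appears in the bin/ listing earlier) converts a .bson file to JSON on standard output. A minimal sketch:

bsondump dump/test/tea.bson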

When no collection is specified, every collection in the current database is exported:
[mongod@mcw01 ~]$ ls
dump  test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ rm -rf dump/
[mongod@mcw01 ~]$ mongodump -d test
2022-03-04T15:14:37.145+0800    writing test.bar to
2022-03-04T15:14:37.146+0800    writing test.stu to
2022-03-04T15:14:37.166+0800    writing test.animal to
2022-03-04T15:14:37.178+0800    writing test.bird to
2022-03-04T15:14:37.561+0800    done dumping test.bird (10 documents)
2022-03-04T15:14:37.562+0800    writing test.tea to
2022-03-04T15:14:37.621+0800    done dumping test.tea (4 documents)
2022-03-04T15:14:37.621+0800    writing test.foo to
2022-03-04T15:14:37.666+0800    done dumping test.foo (2 documents)
2022-03-04T15:14:37.667+0800    writing test.shop to
2022-03-04T15:14:37.674+0800    done dumping test.stu (1000 documents)
2022-03-04T15:14:37.682+0800    done dumping test.animal (1000 documents)
2022-03-04T15:14:37.701+0800    done dumping test.shop (2 documents)
2022-03-04T15:14:37.781+0800    done dumping test.bar (10000 documents)
[mongod@mcw01 ~]$ ls
dump  test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ ls dump/
test
[mongod@mcw01 ~]$ ls dump/test/  #each collection gets two files
animal.bson           bar.metadata.json   foo.bson           shop.metadata.json  tea.bson
animal.metadata.json  bird.bson           foo.metadata.json  stu.bson            tea.metadata.json
bar.bson              bird.metadata.json  shop.bson          stu.metadata.json
[mongod@mcw01 ~]$
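
To keep backups out of the working directory, mongodump's -o/--out option sets the output directory. A hedged sketch of a dated full-database backup (the /backup path is just an example):

mongodump -d test -o /backup/mongodb-$(date +%F)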

Restoring a whole database (binary restore with mongorestore)

> use test
switched to db test
> db.dropDatabase()  #drop the test database
{ "dropped" : "test", "ok" : 1 }
> show dbs;
admin  0.000GB
local  0.000GB
shop   0.000GB
>
bye
[mongod@mcw01 ~]$

The following approach did not work for me:
[mongod@mcw01 ~]$ ls
dump  test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ ls dump/
test
[mongod@mcw01 ~]$ ls dump/test/
animal.bson           bar.metadata.json   foo.bson           shop.metadata.json  tea.bson
animal.metadata.json  bird.bson           foo.metadata.json  stu.bson            tea.metadata.json
bar.bson              bird.metadata.json  shop.bson          stu.metadata.json
[mongod@mcw01 ~]$ mongorestore -d test --directoryperdb dump/test
2022-03-04T15:19:45.238+0800    error parsing command line options: --dbpath and related flags are not supported in 3.0 tools.
See http://dochub.mongodb.org/core/tools-dbpath-deprecated for more information
2022-03-04T15:19:45.239+0800    try 'mongorestore --help' for more information
[mongod@mcw01 ~]$ #check what is going on
[mongod@mcw01 ~]$ mongorestore --help #the option has indeed changed
--dir=<directory-name>                      input directory, use '-' for stdin

The correct way to restore:
[mongod@mcw01 ~]$ ls
dump  test.stu.csv  test.stu.json
[mongod@mcw01 ~]$ ls dump/
test
[mongod@mcw01 ~]$ ls dump/test/  #all the data was backed up here
animal.bson           bar.metadata.json   foo.bson           shop.metadata.json  tea.bson
animal.metadata.json  bird.bson           foo.metadata.json  stu.bson            tea.metadata.json
bar.bson              bird.metadata.json  shop.bson          stu.metadata.json
[mongod@mcw01 ~]$ mongorestore -d test --dir dump/test  #point at the backup directory and restore the data
2022-03-04T15:22:10.488+0800    building a list of collections to restore from dump/test dir
2022-03-04T15:22:10.530+0800    reading metadata for test.bar from dump/test/bar.metadata.json
2022-03-04T15:22:10.582+0800    restoring test.bar from dump/test/bar.bson
2022-03-04T15:22:10.663+0800    reading metadata for test.animal from dump/test/animal.metadata.json
2022-03-04T15:22:10.663+0800    reading metadata for test.stu from dump/test/stu.metadata.json
2022-03-04T15:22:10.664+0800    reading metadata for test.bird from dump/test/bird.metadata.json
2022-03-04T15:22:10.774+0800    restoring test.animal from dump/test/animal.bson
2022-03-04T15:22:10.828+0800    restoring test.stu from dump/test/stu.bson
2022-03-04T15:22:10.949+0800    restoring test.bird from dump/test/bird.bson
2022-03-04T15:22:10.958+0800    restoring indexes for collection test.bird from metadata
2022-03-04T15:22:10.964+0800    finished restoring test.bird (10 documents)
2022-03-04T15:22:10.964+0800    reading metadata for test.shop from dump/test/shop.metadata.json
2022-03-04T15:22:11.044+0800    restoring indexes for collection test.stu from metadata
2022-03-04T15:22:11.045+0800    restoring test.shop from dump/test/shop.bson
2022-03-04T15:22:11.048+0800    restoring indexes for collection test.shop from metadata
2022-03-04T15:22:11.076+0800    finished restoring test.shop (2 documents)
2022-03-04T15:22:11.076+0800    reading metadata for test.tea from dump/test/tea.metadata.json
2022-03-04T15:22:11.114+0800    restoring indexes for collection test.animal from metadata
2022-03-04T15:22:11.123+0800    finished restoring test.stu (1000 documents)
2022-03-04T15:22:11.123+0800    reading metadata for test.foo from dump/test/foo.metadata.json
2022-03-04T15:22:11.165+0800    finished restoring test.animal (1000 documents)
2022-03-04T15:22:11.165+0800    restoring test.tea from dump/test/tea.bson
2022-03-04T15:22:11.196+0800    restoring indexes for collection test.tea from metadata
2022-03-04T15:22:11.196+0800    restoring test.foo from dump/test/foo.bson
2022-03-04T15:22:11.248+0800    restoring indexes for collection test.foo from metadata
2022-03-04T15:22:11.249+0800    finished restoring test.tea (4 documents)
2022-03-04T15:22:11.253+0800    finished restoring test.foo (2 documents)
2022-03-04T15:22:11.805+0800    restoring indexes for collection test.bar from metadata
2022-03-04T15:22:11.807+0800    finished restoring test.bar (10000 documents)
2022-03-04T15:22:11.807+0800    done
[mongod@mcw01 ~]$
[mongod@mcw01 ~]$ mongo  #check that the data was restored correctly
......
> show dbs;
admin  0.000GB
local  0.000GB
shop   0.000GB
test   0.001GB
> use test;
switched to db test
> show tables;
animal
bar
bird
foo
shop
stu
tea
> db.tea.find();
{ "_id" : ObjectId("622180a312caf24babbda9c8"), "email" : "a@163.com" }
{ "_id" : ObjectId("622180a912caf24babbda9c9"), "email" : "b@163.com" }
{ "_id" : ObjectId("622184a212caf24babbda9ca"), "email" : "c@163.com" }
{ "_id" : ObjectId("622185a612caf24babbda9cc") }
>
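
As one last hedged check (not part of the original session), the restored data can also be verified non-interactively with mongo --eval:

mongo test --eval 'printjson(db.stu.count()); printjson(db.tea.getIndexes())'

With the data above this should report 1000 documents in stu and list the email_hashed index that was captured in tea.metadata.json.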

 
