MongoDB Export and Import

1. MongoDB data export (mongoexport):

connection options:
  /h, /host:<hostname>                            mongodb host to connect to (setname/host1,host2 for replica sets)
      /port:<port>                                server port (can also use --host hostname:port)

authentication options:
  /u, /username:<username>                        username for authentication
  /p, /password:<password>                        password for authentication
      /authenticationDatabase:<database-name>     database that holds the user's credentials
      /authenticationMechanism:<mechanism>        authentication mechanism to use

namespace options:
  /d, /db:<database-name>                         database to use
  /c, /collection:<collection-name>               collection to use

uri options:
      /uri:mongodb-uri                            mongodb uri connection string

output options:
  /f, /fields:<field>[,<field>]*                  comma separated list of field names (required for exporting CSV) e.g. -f "name,age"
      /fieldFile:<filename>                       file with field names - 1 per line
      /type:<type>                                the output format, either json or csv (defaults to 'json') (default: json)
  /o, /out:<filename>                             output file; if not specified, stdout is used
      /jsonArray                                  output to a JSON array rather than one object per line
      /pretty                                     output JSON formatted to be human-readable
      /noHeaderLine                               export CSV data without a list of field names at the first line

querying options:
  /q, /query:<json>                               query filter, as a JSON string, e.g., '{x:{$gt:1}}'
      /queryFile:<filename>                       path to a file containing a query filter (JSON)
  /k, /slaveOk                                    allow secondary reads if available (default true) (default: false)
      /readPreference:<string>|<json>             specify either a preference name or a preference json object
      /forceTableScan                             force a table scan (do not use $snapshot)
      /skip:<count>                               number of documents to skip
      /limit:<count>                              limit the number of documents to export
      /sort:<json>                                sort order, as a JSON string, e.g. '{x:1}'
      /assertExists                               if specified, export fails if the collection does not exist (default: false)
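
For example, the querying options above can be combined to pull out only a filtered, sorted slice of a collection. A minimal sketch, assuming test.user has a numeric age field (the filter, sort and limit values are only illustrative):

mongoexport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user -q "{age:{$gt:18}}" --sort "{_id:1}" --limit 1000 -o user_part.json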

2. MongoDB data import (mongoimport):

connection options:
  /h, /host:<hostname>                            mongodb host to connect to
                                                  (setname/host1,host2 for
                                                  replica sets)
      /port:<port>                                server port (can also use
                                                  --host hostname:port)

authentication options:
  /u, /username:<username>                        username for authentication
  /p, /password:<password>                        password for authentication
      /authenticationDatabase:<database-name>     database that holds the
                                                  user's credentials
      /authenticationMechanism:<mechanism>        authentication mechanism to
                                                  use

namespace options:
  /d, /db:<database-name>                         database to use
  /c, /collection:<collection-name>               collection to use

uri options:
      /uri:mongodb-uri                            mongodb uri connection string

input options:
  /f, /fields:<field>[,<field>]*                  comma separated list of
                                                  fields, e.g. -f name,age
      /fieldFile:<filename>                       file with field names - 1 per
                                                  line
      /file:<filename>                            file to import from; if not
                                                  specified, stdin is used
      /headerline                                 use first line in input
                                                  source as the field list (CSV
                                                  and TSV only)
      /jsonArray                                  treat input source as a JSON
                                                  array
      /parseGrace:<grace>                         controls behavior when type
                                                  coercion fails - one of:
                                                  autoCast, skipField, skipRow,
                                                  stop (defaults to 'stop')
                                                  (default: stop)
      /type:<type>                                input format to import: json,
                                                  csv, or tsv (defaults to
                                                  'json') (default: json)
      /columnsHaveTypes                           indicates that the field list
                                                  (from --fields, --fieldFile,
                                                  or --headerline) specifies
                                                  types; they must be in the
                                                  form of
                                                  '<colName>.<type>(<arg>)'.
                                                  The type can be one of: auto,
                                                  binary, bool, date, date_go,
                                                  date_ms, date_oracle, double,
                                                  int32, int64, string. For
                                                  each of the date types, the
                                                  argument is a datetime layout
                                                  string. For the binary type,
                                                  the argument can be one of:
                                                  base32, base64, hex. All
                                                  other types take an empty
                                                  argument. Only valid for CSV
                                                  and TSV imports. e.g.
                                                  zipcode.string(),
                                                  thumbnail.binary(base64)

ingest options:
      /drop                                       drop collection before
                                                  inserting documents
      /ignoreBlanks                               ignore fields with empty
                                                  values in CSV and TSV
      /maintainInsertionOrder                     insert documents in the order
                                                  of their appearance in the
                                                  input source
  /j, /numInsertionWorkers:<number>               number of insert operations
                                                  to run concurrently (defaults
                                                  to 1) (default: 1)
      /stopOnError                                stop importing at first
                                                  insert/upsert error
      /mode:[insert|upsert|merge]                 insert: insert only. upsert:
                                                  insert or replace existing
                                                  documents. merge: insert or
                                                  modify existing documents.
                                                  defaults to insert
      /upsertFields:<field>[,<field>]*            comma-separated fields for
                                                  the query part when --mode is
                                                  set to upsert or merge
      /writeConcern:<write-concern-specifier>     write concern options e.g.
                                                  --writeConcern majority,
                                                  --writeConcern '{w: 3,
                                                  wtimeout: 500, fsync: true,
                                                  j: true}'
      /bypassDocumentValidation                   bypass document validation
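
As a sketch of the ingest options, an import can update existing documents instead of blindly inserting by switching --mode; the email field used as the match key below is only an assumed unique field, not something from the original examples:

mongoimport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user --file=user.json --mode=upsert --upsertFields=email --stopOnError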

Example:

mongoexport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user -o user.json --type=json

mongoimport.exe -h 127.0.0.1 -u admin -p xxx -d test_bak -c user --file=user.json --type=json
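
If the admin user is defined in the admin database (a common setup, but an assumption here rather than something stated in the original), both commands may also need --authenticationDatabase, e.g.:

mongoexport.exe -h 127.0.0.1 -u admin -p xxx --authenticationDatabase admin -d test -c user -o user.json --type=json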

Example:

mongoexport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user -o user.csv --type=csv --fields="_id,username,birthday,gender,email,phone"
mongoimport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user_bak --file=user.csv --type=csv --headerline
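
If you do not want the field names written into the CSV file, a variant (sketched here with the field list trimmed for brevity) is to export with --noHeaderLine and repeat the same field list with --fields on import instead of --headerline:

mongoexport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user -o user.csv --type=csv --noHeaderLine --fields="_id,username,email"
mongoimport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user_bak --file=user.csv --type=csv --fields="_id,username,email"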

When no output file is specified, mongoexport writes to stdout, and when no input file is specified, mongoimport reads from stdin, so the two can be piped together without dumping an intermediate file to local disk.

Example:

mongoexport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user | mongoimport.exe -h 127.0.0.1 -u admin -p xxx -d test_bak -c user
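
If the copy may be re-run from scratch, adding --drop on the import side (an optional tweak, not part of the original example) empties test_bak.user first so documents are not duplicated:

mongoexport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user | mongoimport.exe -h 127.0.0.1 -u admin -p xxx -d test_bak -c user --drop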

Note: while the data is being transferred, the connection may time out or be refused if the data volume is too large. Record how far the transfer got, then use --skip on the next run to skip the documents that have already been copied.

Example:

mongoexport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user --skip=234567 | mongoimport.exe -h 127.0.0.1 -u admin -p xxx -d test_bak -c user
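
One caveat worth adding: without an explicit sort, the export order is not guaranteed to be identical between runs, so resuming with --skip is only reliable when both runs use the same --sort, for example on _id (this sort clause is an addition, not part of the original command):

mongoexport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user --sort="{_id:1}" --skip=234567 | mongoimport.exe -h 127.0.0.1 -u admin -p xxx -d test_bak -c user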

TIP: if the local machine has decent performance and is relatively idle, you can run several insertion workers in parallel with the -j parameter.

Example:

mongoexport.exe -h 127.0.0.1 -u admin -p xxx -d test -c user --skip=234567 | mongoimport.exe -h 127.0.0.1 -u admin -p xxx -d test_bak -c user -j 4

Original article: https://www.cnblogs.com/huaizhi/p/12016424.html