Deprecation Warnings in MongoDB's Node.js API

This post is 4 years old. (Or older!) Code samples may not work, screenshots may be missing and links could be broken. Although some of the content may still be relevant, please take it with a pinch of salt.

If we install the latest version of the MongoDB Node.js API (v3.x.x and above), we will likely encounter some deprecation warning messages. These could relate to how we connect to the database or even how we attempt to execute queries.

Such warnings are a natural artefact of the API being updated and changed by MongoDB's engineers, and luckily for us, they are easy to fix.

Note that most of these deprecation warnings seem to appear in version 3.x.x and above of the Node.js API in conjunction with MongoDB version 4.x and above.

db.collection is not a function

This is one of the most common errors we can encounter when updating to the latest version of MongoDB after having skipped updates for a long period. In older versions of the API, the connect() method returned a database object directly; in the latest version, it resolves with a client object, from which we can select a database. It is the database object that gives us access to collections, amongst other things.

The correct implementation should look like this:

const MongoClient = require('mongodb').MongoClient;
const url = 'mongodb://localhost:27017';

MongoClient.connect(url)
  .then((client) => {
    const db = client.db('my-db'); // select database
    const collection = db.collection('my-collection'); // select collection
    return collection.find({}).limit(1).toArray(); // then query
  })
  .then((response) => console.log(response))
  .catch((error) => console.error(error));

DeprecationWarning: current URL string parser is deprecated

It seems that MongoDB will now enforce the port number to be part of the connection string. This means that the old URL parser is no longer applicable and will be slowly retired. To opt into the new URL parser, we can pass an option to the connect() method:

const MongoClient = require('mongodb').MongoClient;
const url = 'mongodb://localhost:27017';

// added { useNewUrlParser: true }
MongoClient.connect(url, { useNewUrlParser: true })
  .then((client) => {
    const db = client.db('my-db');
    const collection = db.collection('my-collection');
    return collection.find({}).limit(1).toArray();
  })
  .then((response) => console.log(response))
  .catch((error) => console.error(error));

DeprecationWarning: collection.[method] is deprecated

There are a bunch of methods that fall into this category:

  • collection.insert - use insertOne, insertMany or bulkWrite instead
  • collection.save - use insertOne, insertMany, updateOne, or updateMany instead
  • collection.update - use updateOne, updateMany, or bulkWrite instead
  • collection.remove - use deleteOne, deleteMany, or bulkWrite instead

For all these deprecation warnings we also get multiple suggestions that essentially advise us to pick the right method for the job - this makes a lot of sense and makes the code somewhat more readable.
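As a rule of thumb, the legacy options a call used to take tell us which replacement to reach for. The helper below is purely illustrative (it is not part of the driver); it maps the old update/remove options to the names of the new methods:

```javascript
// Hypothetical helper (not part of the MongoDB driver): given the options
// object of a legacy collection.update() call, pick the replacement method.
function modernUpdateMethod(options = {}) {
  // update(..., { multi: true }) used to touch every matching document
  if (options.multi) return 'updateMany';
  // by default, update() only touched the first match
  return 'updateOne';
}

// Same idea for the legacy collection.remove() call.
function modernRemoveMethod(options = {}) {
  // remove(..., { justOne: true }) deleted a single document;
  // without it, remove() deleted every matching document
  return options.justOne ? 'deleteOne' : 'deleteMany';
}

console.log(modernUpdateMethod({ multi: true })); // updateMany
console.log(modernRemoveMethod()); // deleteMany
```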

Some of these new methods are very straightforward - for example, if we wish to insert one single record, we should use insertOne. But some are not quite as straightforward - for example, what is the difference between insertMany and bulkWrite when it comes to inserting data?

Let's go ahead and review the differences between them.

insertMany / updateMany / deleteMany vs bulkWrite

Interestingly, the warnings that we have seen in the previous section have all suggested that we use bulkWrite - now that's curious. How is bulkWrite different from insertMany or updateMany?

The difference lies in the function signatures. insertMany takes an array of documents to insert, and updateMany and deleteMany apply a single operation to every matching document, while bulkWrite takes an array of operations.

The accepted operations are the following: insertOne, updateOne, updateMany, deleteOne, deleteMany and replaceOne. So bulkWrite can really perform all of these operations, and multiple of them in a single call.
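Because each bulkWrite operation is a plain object keyed by the operation name, the whole batch can be assembled as ordinary data before we ever touch the database. A minimal sketch (the document contents here are made up):

```javascript
// Build a mixed batch of bulkWrite operations as plain data - each entry
// is an object whose single key names one of the accepted operations.
const operations = [
  { insertOne: { document: { name: 'Jack', age: 23 } } },
  { updateOne: { filter: { name: 'Kate' }, update: { $set: { age: 25 } } } },
  { deleteMany: { filter: { age: { $lt: 18 } } } },
];

// Extract the operation names to see what the batch will do.
const names = operations.map((op) => Object.keys(op)[0]);
console.log(names); // [ 'insertOne', 'updateOne', 'deleteMany' ]
```

This array is exactly what would be passed to collection.bulkWrite() once connected.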

Let's take a look at an example of this:

// insertMany
MongoClient.connect(url, { useNewUrlParser: true })
  .then((client) => {
    const db = client.db('star-wars');
    const collection = db.collection('characters');
    const documents = [
      {
        name: 'Jack',
        age: 23,
      },
      {
        name: 'Kate',
        age: 29,
      },
    ];
    return collection.insertMany(documents);
  })
  .then((response) => console.log(response))
  .catch((error) => console.error(error));

The above code snippet returns the details of the operation as well as the IDs of the documents inserted:

{ result: { ok: 1, n: 2 },
  ops:
    [ { name: 'Jack', age: 23, _id: 5b9fcf4d2f38660e22a01b54 },
      { name: 'Kate', age: 29, _id: 5b9fcf4d2f38660e22a01b55 } ],
  insertedCount: 2,
  insertedIds:
    { '0': 5b9fcf4d2f38660e22a01b54, '1': 5b9fcf4d2f38660e22a01b55 } }

// bulkWrite with insertOne
MongoClient.connect(url, { useNewUrlParser: true })
  .then((client) => {
    const db = client.db('star-wars');
    const collection = db.collection('characters');
    const documents = [
      {
        name: 'Jack',
        age: 23,
      },
      {
        name: 'Kate',
        age: 29,
      },
    ];
    return collection.bulkWrite([
      {
        insertOne: {
          document: documents[0],
        },
      },
      {
        insertOne: {
          document: documents[1],
        },
      },
    ]);
  })
  .then((response) => console.log(response))
  .catch((error) => console.error(error));

Note that in a real application it'd make more sense to keep the operation type alongside the data to be inserted, but in this article we are sticking to the same example throughout and accessing the right document data via array indexes.

The above code returns a BulkWriteResult object detailing the operations (in our case, two inserts):

BulkWriteResult {
  result:
    { ok: 1,
      writeErrors: [],
      writeConcernErrors: [],
      insertedIds: [ [Object], [Object] ],
      nInserted: 2,
      nUpserted: 0,
      nMatched: 0,
      nModified: 0,
      nRemoved: 0,
      upserted: [] },
  insertedCount: 2,
  matchedCount: 0,
  modifiedCount: 0,
  deletedCount: 0,
  upsertedCount: 0,
  upsertedIds: {},
  insertedIds:
    { '0': 5b9fd080f935210e3c59a35f, '1': 5b9fd080f935210e3c59a360 },
  n: 2 }

From the above, it's clear that bulkWrite can chain multiple operations easily with little effort:

MongoClient.connect(url, { useNewUrlParser: true })
  .then((client) => {
    const db = client.db('star-wars');
    const collection = db.collection('characters');
    const documents = [
      {
        name: 'Jack',
        age: 23,
      },
      {
        name: 'Kate',
        age: 29,
      },
    ];
    return collection.bulkWrite([
      {
        insertOne: {
          document: documents[0],
        },
      },
      {
        insertOne: {
          document: documents[1],
        },
      },
      {
        updateOne: {
          filter: { name: 'Kate' },
          update: { $set: { age: 25 } },
        },
      },
    ]);
  })
  .then((response) => console.log(response))
  .catch((error) => console.error(error));

The above operation performs the inserts and updates a single document - in fact, it updates one of the documents inserted earlier - and returns the following data structure:

// trimmed
BulkWriteResult {
  result:
    { ok: 1,
      insertedIds: [ [Object], [Object] ],
      nInserted: 2,
      nMatched: 1,
      nModified: 1,
      upserted: [] },
  insertedCount: 2,
  matchedCount: 1,
  modifiedCount: 1,
// ...
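Since the counts on the returned object mirror the operations we queued, we can sanity-check a batch against what we expected it to do. The result object below is a mock with the same shape and values as the trimmed output above (not a live driver response):

```javascript
// Mock of the BulkWriteResult counts shown above.
const result = {
  insertedCount: 2,
  matchedCount: 1,
  modifiedCount: 1,
  deletedCount: 0,
  upsertedCount: 0,
};

// Compare the counts against the operations we queued:
// two inserts, one update, no deletes.
function batchMatchesExpectation(res, expected) {
  return (
    res.insertedCount === expected.inserts &&
    res.modifiedCount === expected.updates &&
    res.deletedCount === expected.deletes
  );
}

console.log(batchMatchesExpectation(result, { inserts: 2, updates: 1, deletes: 0 })); // true
```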

Conclusion

In this article, we have seen how to resolve some of the errors and warnings that we may receive when updating to the latest version of MongoDB and using the newest version of its Node.js API.