
Solving The Trigger Challenge


In week three of Eliot’s MongoDB World Weekly Challenge, you were tasked with using Triggers to set up MongoDB so that actions were executed as data was inserted into the database. Triggers are great for reacting to changes in the data, so let's look at how we can put them to work, solve that problem, and learn more about this great Stitch feature.

We will assume you've already created your database cluster. If not, create a MongoDB Atlas M0 tier free trial cluster to work through this solution.

Create the Stitch Application

Before we start working with our data, we need to create a Stitch application which will host our Database Trigger.

Create Stitch App

The application should be linked to your cluster, and the deployment model can be local or global. Once the application is created, it is assigned a unique App ID. This App ID is the answer to question one. See? It is that simple.

Getting the App ID

Now that we have our Stitch application linked to our cluster, we can create the Stitch Trigger.

Create the Stitch Trigger

A Stitch Trigger uses MongoDB change streams to react to events in our system and push these notifications to downstream systems. In our use case, we are capturing real-time inserts and counting sales that meet our given criteria.

Select Triggers from the left-hand menu; we will use the Database Trigger for this use case. This is just one of three types of triggers that Stitch offers. The Database Trigger generates events based on database operations such as inserts and updates, so you can act on database changes. The Authentication Trigger executes based on Stitch user events such as create and login, so you can act on user activity. Finally, a Scheduled Trigger will execute Stitch Functions at a user-defined time interval. These are useful for database maintenance tasks such as cleaning up old data or creating reports at set times. The Stitch Functions are executed in the application's context, so they also have access to all the third-party services offered by Stitch.

Knowing the three types of triggers offered by Stitch and how to use them not only allows us to be more productive developers but is also the key to answering the last question.
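To make the scheduled variety a little more concrete, here is a minimal sketch of what a cleanup function run on a schedule might look like. It uses the same context.services pattern we will use later in this post; the 90-day retention window and the use of the saleDate field are purely illustrative assumptions, not part of the challenge.

exports = function() {
  // Get a handle to the sales collection through the default linked cluster.
  let collection = context.services
     .get('mongodb-atlas')
     .db('sample_supplies')
     .collection('sales');

  // Illustrative retention policy: delete sales older than 90 days.
  let cutoff = new Date(Date.now() - 90 * 24 * 60 * 60 * 1000);
  return collection.deleteMany({saleDate: {$lt: cutoff}});
};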

Adding a Trigger

Let’s give our Database Trigger a name; like any good variable name, the more descriptive the better. For our challenge, we want to count all documents created in the sales collection of the sample_supplies database that originated from the Denver or San Diego store locations, were purchased in store or on the phone, and used a coupon.

For this trigger we will only look at inserts against the sales collection in the sample_supplies database. The great thing about this is that the database does not even have to exist yet for the trigger to start. With this configuration, the trigger will process each sale in order as it comes in, but we want to find specific data, so before we create our new function, let's scroll down to the advanced features and update our match expression:

Match Expression in Trigger

Now the trigger will only fire when a new sales document is created which matches the search criteria:

{
  "fullDocument.storeLocation": {
    "$in":["Denver","San Diego"]
  },
  "fullDocument.purchaseMethod":{
    "$in":["In store","Phone"]
  },
  "fullDocument.couponUsed":true
}

fullDocument is a field in the change event which contains the newly created document. We can match on specific fields of this full document, and only those documents that match will be sent to the function (which we will create next) for processing. Something worth noting is that we are getting the full document even though we did not turn on full document processing. This is because an insert creates a change event that already carries the full document. Compare this with an update, which will only return the fields that have changed unless we explicitly ask for the full document; when we do that, it forces the trigger to do a lookup on the original document.
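For reference, the insert change event delivered to the trigger has roughly the following shape (trimmed, with illustrative values; it follows the standard MongoDB change stream format):

{
  "operationType": "insert",
  "ns": { "db": "sample_supplies", "coll": "sales" },
  "documentKey": { "_id": ObjectId("...") },
  "fullDocument": {
    "_id": ObjectId("..."),
    "storeLocation": "Denver",
    "purchaseMethod": "In store",
    "couponUsed": true,
    ...
  }
}

This is why every path in the match expression starts with fullDocument.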

  • A common mistake we saw was omitting the fullDocument prefix in the match expression. The fullDocument prefix is required because the match is run not on the document that fired the trigger but rather on the change event document that the trigger captured.

  • One common match expression we saw used the $and operator explicitly. A match in MongoDB uses a logical AND by default when looking at multiple fields, but calling it out can make things more readable, so this is more of a code styling preference and no points were removed if a different style was used.

Now, of course, you could perform a match like this in your function, but it would make the function needlessly complex. More importantly, every new sale document would call the function, meaning it would do unnecessary extra processing. A sketch of that approach is shown below, purely to illustrate what we are avoiding.
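This is a rough sketch (not a recommendation) of what in-function filtering might look like; with the match expression in place, none of it is necessary:

exports = function(changeEvent) {
  let doc = changeEvent.fullDocument;

  // Filtering in code: every inserted sale pays for this check, whereas the
  // trigger's match expression filters events before the function ever runs.
  let storeOk = doc.storeLocation === "Denver" || doc.storeLocation === "San Diego";
  let methodOk = doc.purchaseMethod === "In store" || doc.purchaseMethod === "Phone";

  if (!(storeOk && methodOk && doc.couponUsed === true)) {
    return;
  }
  // ... counting logic would go here ...
};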

When the trigger conditions are met, we want this trigger to execute a Stitch Function. Stitch Functions are written in JavaScript and execute in a serverless style on the Stitch platform.

Create the Stitch Function

For this part of the challenge, we simply want a count of the documents that met the criteria. We could do a query after all the sales have been loaded, but using a trigger with a function gives us a “real-time” or “at the moment” view of the data. We could expand on this example to create real-time dashboards. We could even use Charts as we saw in the previous challenge.
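For comparison, the after-the-fact approach would be a single count query once all the documents are in place, something like this from the mongo shell (assuming a shell version that supports countDocuments):

use sample_supplies
db.sales.countDocuments({
  storeLocation: {$in: ["Denver", "San Diego"]},
  purchaseMethod: {$in: ["In store", "Phone"]},
  couponUsed: true
})

The trigger approach keeps that total up to date continuously instead of computing it on demand.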

From the Stitch Trigger page, let's link a function. We can link an existing function or create a new function. For this challenge, we will create a new function:

Creating a Linked Function

This will expand an editor window with the prepopulated code for a trigger:

Initial code template for a Function

The code template here shows some useful examples of how to work with the changeEvent that is sent to the function. There's some great information here, but for this challenge we will not be using any of it. We simply need a total count of the number of times a document was created with the criteria described previously. We can't, though, just add one to a variable: variables created in the function's scope cease to exist once the function has finished running. That means we need to store the running total in the database. Using the global context associated with our Stitch application, we can get a handle to a collection where we will write that running total:

exports = function(changeEvent) {
  /* 1 */
  let collection = context.services
     .get('mongodb-atlas')
     .db('sample_supplies')
     .collection('sales_output');
  /* 2 */
  return collection.updateOne(
    {_id: 1},
    {$inc: {total: 1}},
    {upsert: true}
  ).then(result => {
      return result;
  }).catch(err => {
      return err;
  });
};

The statement marked /* 1 */ gets the MongoDB Stitch service using the mongodb-atlas name. This name is created by default. We then get the sample_supplies database from that service. Finally, we get a handle to a collection called sales_output, which we store in the variable collection.

In the statement marked /* 2 */, we use the updateOne method of the collection object to increment the value of the total field by 1 on the single document matching the query filter {_id: 1}. If this document does not exist, it will be created because of the {upsert: true} option.
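To illustrate the mechanics (the values here are only an example), the first matching sale upserts the counter document and each subsequent match increments it:

// After the first matching sale (the upsert creates the document):
{ "_id": 1, "total": 1 }

// After the second matching sale ($inc bumps the counter):
{ "_id": 1, "total": 2 }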

The updateOne method returns a promise. We chain the .then() method to simply return the results to the caller. We also chain the .catch() method, which catches errors, to return the error to the caller. Note that the caller, in this case, is the trigger we created.

This function is another of the answers we were looking for, though we were not looking for the exact lines above, just something similar.

A couple of common mistakes (not necessarily wrong, just extra code) we saw were:

  • There was an extra check on the changeEvent object for an “insert” event. This check is not needed if we configure our trigger to only notify on insert events.
  • There were validations on the changeEvent.fullDocument field to ensure the document met the query criteria. This is also not needed if we set up our match criteria correctly.
  • There were a couple of entries where the updateOne method calls used an open query filter of {}. Although this does not cause an issue with this challenge, as we are only storing one document in the collection, it could introduce a bug when multiple writers are executing against the collection. We did not deduct any points for this difference.

Though these mistakes may be considered cosmetic, it is a best practice to include only meaningful instructions in cloud-based serverless environments like Stitch Functions. In the cloud computing world we are charged for the compute time of each function, so reducing that compute time not only improves performance but can also save money (albeit minimal per call) over time.

There's one last step in creating the function. Once we save it, let's go and edit the Function's settings. On the left-hand menu click Functions, select our newly created function, and go to its Settings page.

Function Settings

Ensure the function is marked as private so that only internal services can execute it. Then enable Run As System, which allows this function to bypass any service rules that might be configured. We will learn more about service rules in an upcoming challenge.

Once finished and the changes are saved, we are now ready to load our sample data. Go back to the Atlas application page and load the sample data for the cluster.

Loading Sample Data

As the data loads, it will create various sample databases and collections. One of those is the sample_supplies database and its sales collection; as new documents are inserted there, they fire our trigger, which in turn calls our function. Once the data is completely loaded, we can go to the Collections tab to view our data. Go into the sample_supplies database and look at the sales_output collection. You should see something like this:

Results in the sales_output collection

As easy as 1, 2, 3. The final answer we are looking for is 123, as there were 123 events that matched the criteria we specified.
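If you would rather verify from the mongo shell than from the Collections UI, a quick read of the counter document shows the same total (the _id is 1 because that is what our function upserts):

use sample_supplies
db.sales_output.findOne({_id: 1})
// => { "_id" : 1, "total" : 123 }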

Wrapping up

That’s it for the solution to the Triggers Challenge. I hope we were able to show how exciting and easy Triggers are to use. The next challenge in Eliot’s Weekly MongoDB World Challenge is coming on Wednesday, May 22nd, and every Wednesday up to MongoDB World. Join in, up your MongoDB skills, and you could win a prize.

