In my role as a Solutions Architect for MongoDB I meet a lot of customers and prospects. Recently I was speaking to an existing MongoDB Atlas customer in the UK who was very happy with the service but wanted to gain more insight into their monthly spend. They pointed out that while Atlas provides a REST API to access invoice data, it was not possible to automatically feed the details into MongoDB Charts to visualise them. That got me thinking...
Creating a Billing Dashboard
This two-part article will demonstrate how you can build a Billing Dashboard leveraging MongoDB Charts to visualise the spend within a MongoDB Atlas Organization. This tool will utilise MongoDB Stitch to invoke the MongoDB Atlas API endpoint to retrieve invoice data (see the API documentation for 'invoices'), inserting the data it receives into a MongoDB Atlas cluster which will be used as the source for the MongoDB Charts Dashboard. Scheduled Stitch Triggers will also be used to ensure the invoice data is retrieved automatically each day in order to keep the dashboard up to date.
For reference, MongoDB Atlas is MongoDB's global and fully automated Database as a Service (DBaaS). It allows you to create and manage the lifecycle of secure, resilient MongoDB clusters using an intuitive UI and/or a declarative RESTful API. These clusters run MongoDB (the database) - the same database you would deploy yourself, only managed by MongoDB Inc (the company). MongoDB Atlas is offered as a pay-as-you-go service or via prepaid annual credits, with various support options available if required.
Tightly integrated with MongoDB Atlas are two other services - MongoDB Stitch and MongoDB Charts.
MongoDB Stitch offers various higher-level functions on top of the database, including Stitch Functions and Stitch Triggers. We will use these functions and triggers as the glue to hold our end-to-end solution together.
MongoDB Charts on the other hand allows you to build charts and dashboards, leveraging the data inside your MongoDB Atlas clusters. MongoDB Charts differs from other charting and BI tools in that it is natively integrated with MongoDB and in particular it understands the MongoDB document model, avoiding the need for you to flatten your data into a tabular model and potentially lose the embedded relationships within that data.
The first part of this article will show how to retrieve invoice data from MongoDB Atlas. The second part will show how to automate this process and actually build a MongoDB Charts dashboard to visualise the data.
Setup
To get started the first thing we will need is a MongoDB Atlas cluster so head to https://cloud.mongodb.com and login to your account or create a new account if required.
Create an Atlas Cluster
Once logged in, navigate to an Atlas Project in your organization (create a new Project if required). It's best to choose a Project where you have the 'Project Owner' role so you can create clusters and modify the Project settings, including the ability to create/modify database users.
Create a new cluster in this Atlas Project in one of the supported regions – a free M0 cluster will do. This will be used to store the invoice data that we will ultimately visualise in MongoDB Charts. If you'd rather reuse an existing cluster that's fine too (just remember you may need to change the 'db' variable below to avoid overwriting existing data).
In part two of this article we'll also need to log in to this cluster via the Mongo Shell, so make sure you create and/or have access to a user who can create a MongoDB View on the billing database.
Create a Programmatic API Key
To access the MongoDB Atlas Invoice API endpoint we will need a suitable API key. We will do so by creating an organization-scoped programmatic API key at the Atlas 'Organization' level, so navigate back up to the 'Organization' level in Atlas and select 'Access' on the left navigation panel before switching to the 'API Keys' tab.
From here create a new key using the green 'Create API Key' button. If you have some keys already created and you need to create a new one, click the green 'Manage' button instead and select 'Create API Key' from there. Provide a 'Description' for this API Key and select the following permissions at a minimum:

- Organization Billing Admin
- Organization Read Only
If you don't see these options you may be trying to create an API Key at the Project level. Navigate back up to the Organization level and select 'Access' from there.
Note: You may not have rights to create organization-scoped programmatic API Keys. If this is the case please ask your Atlas admin to create a suitable key for you.
Remember to make a note of the 'Public Key' and the 'Private Key'. The latter will not be shown again and will be needed to make API calls. As this API Key will only be used from MongoDB Stitch there is no need to add an 'API Whitelist' entry for this key.
At this point we are ready to create our MongoDB Stitch application, so navigate back to the Atlas Project we created earlier and switch to Stitch by clicking the 'Stitch' option on the left navigation bar.
Stitch Setup
Inside the Stitch UI click the 'Create New Application' button to create a new Stitch Application. Fill out the dialog as follows:

- Select a suitable 'Application Name', e.g., 'billing'.
- Make sure your M0 cluster is selected in the 'Link to Cluster' dropdown.
- Leave the 'Stitch Service Name' as is, i.e., 'mongodb-atlas'.
- Optional: Select 'Local' for the 'Deployment Model'.
- Switch to the (closest) region where your cluster is running in the 'Select a Primary Region' dropdown.
Click the green 'Create' button and wait for the Stitch application to initialize.
Stitch Values & Secrets
'Values' allow us to assign particular values to variables used in our functions, while 'Secrets' allow us to protect sensitive information, such as private key values.

To define the required Values and Secrets for our (soon to be defined) Stitch Functions, switch to 'Values & Secrets' using the left navigation menu.
From here switch to the 'Secrets' tab and create a new Secret. Give the Secret a name. We'll use 'privateKeySecret' for the purposes of this article. Set its value to the Private Key value that was created earlier.
Switch back to the 'Values' tab and create two new values. We'll call them 'publicKey' and 'privateKey' respectively. 'publicKey' will be a 'Plain Text' type with the value set to the Public Key from above. 'privateKey' will be a 'Secret' type. Select the Secret we created above in the 'Secret Name' dropdown list.
Note: When defining 'Plain Text' values remember to enclose the Value in quotation marks so you can save the changes.
We will also store the ID of the organization we are going to retrieve the invoice data for in another Value. Name this value 'orgid', set it to a 'Plain Text' type and set its value to the ID of the organization you wish to query the invoice data for.
Note: To retrieve this ID open the Atlas Organization and select 'Settings' from the left-side navigation menu in the main MongoDB Atlas UI. Copy the value of the 'Organization ID' available on the 'General Settings' tab into the 'orgid' value in Stitch.
Lastly we will create one additional 'Value' where we can store the name of the database we will use to store our invoice data:

- Name: 'db', Value Type: 'Plain Text', Value: 'billing'
Stitch HTTP Service
Next we need to create a Stitch 'Service' in order to be able to make HTTP calls to the Atlas API. To do this navigate to 'Services' using the left navigation menu in Stitch. Click the green 'Add a Service' button and create a new 'HTTP' service, named 'http'. No 'Incoming Webhooks' or 'Rules' are required for this service.
Stitch Functions
Now that we have our credentials available in MongoDB Stitch and access to an HTTP service, it's time to create some functions which we can use to access the Atlas API and populate our database. Switch to 'Functions' in the left navigation bar in order to create these functions.

Note: when creating a function we only need to specify its name and body. We may optionally set the function to 'Private' in the function's 'Settings' tab but no other options need to be changed from the defaults.
Function: httpGet()
This will be our base function which we will use to call the Atlas API.
To create this function click the green 'Create New Function' button and name it 'httpGet'. Next we will switch to the 'Function Editor' view and replace the contents with the following code:
// Helper: repeat the GET with an MD5 Digest 'Authorization' header built
// from the challenge in 'authHeader' and the API key Values defined above
httpGetDigest = function(http, url, authHeader) {
  const username = context.values.get("publicKey");
  const apiKey = context.values.get("privateKey");

  // Extract the challenge parameters from the 'Www-Authenticate' header
  const realm = authHeader.match(/realm="(.*?)"/)[1];
  const nonce = authHeader.match(/nonce="(.*?)"/)[1];
  const qop = authHeader.match(/qop="(.*?)"/)[1];

  // Compute the Digest response hash (per RFC 2617)
  const ha1 = utils.crypto.hash('md5', `${username}:${realm}:${apiKey}`).toHex();
  const path = url.match(/:\/\/.*?(\/.*)/)[1];
  const ha2 = utils.crypto.hash('md5', `GET:${path}`).toHex();
  const cnonce = Math.random().toString().substr(2, 14);
  const nc = '00000001';
  const hash = utils.crypto.hash('md5', `${ha1}:${nonce}:${nc}:${cnonce}:${qop}:${ha2}`).toHex();

  const digestHeader = `Digest username="${username}", realm="${realm}", nonce="${nonce}", uri="${path}", qop=${qop}, nc=${nc}, cnonce="${cnonce}", response="${hash}", algorithm=MD5`;
  return http.get({ url: url, headers: { 'Authorization': [ digestHeader ], 'Content-Type': ['application/json'] } });
};

// Main entry point: the first (unauthenticated) GET returns a response whose
// 'Www-Authenticate' header carries the Digest challenge
exports = function(http, url) {
  return http.get({ url: url })
    .then(resp => {
      const authHeader = resp.headers['Www-Authenticate'].toString();
      return httpGetDigest(http, url, authHeader);
    });
};
Click the green 'Save' button to save these functions.
In the code above we define two functions - a main function and a utility function. Looking at the main function first, we perform an HTTP GET against the URL provided. As MongoDB Atlas requires authentication, the response will include authentication details in the header. We extract this data into an 'authHeader' string which we then pass into the second function, 'httpGetDigest'. This function performs an HTTP GET with MD5 DIGEST authentication leveraging the details in the 'authHeader' parameter, along with the API details we stored as Values above. The details of this function are a little beyond the scope of this article – a full understanding is left as an exercise for the reader.
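To make the challenge parsing above a little more concrete, here is a small standalone sketch (plain Node.js, outside Stitch) using the same regular expressions. The 'Www-Authenticate' header value shown is an invented example of the kind of Digest challenge the API returns, not a captured response:

```javascript
// Hypothetical Digest challenge header, for illustration only
const authHeader =
  'Digest realm="MMS Public API", domain="", nonce="abc123", ' +
  'algorithm=MD5, qop="auth", stale=false';

// Same regular expressions as in httpGetDigest() above
const realm = authHeader.match(/realm="(.*?)"/)[1];  // "MMS Public API"
const nonce = authHeader.match(/nonce="(.*?)"/)[1];  // "abc123"
const qop = authHeader.match(/qop="(.*?)"/)[1];      // "auth"

// Same URL-to-path extraction as in httpGetDigest(); the non-greedy .*?
// stops at the first '/' after '://', capturing the request path
const url = 'https://cloud.mongodb.com/api/atlas/v1.0/orgs/ORG-ID/invoices/pending';
const path = url.match(/:\/\/.*?(\/.*)/)[1];
console.log(path); // /api/atlas/v1.0/orgs/ORG-ID/invoices/pending
```

These extracted parameters, together with the public/private key pair, are all the inputs the digest computation needs.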
With these building blocks in place we can now start querying the MongoDB Atlas API.
Function: getBillingData()
This is the main function which will retrieve the invoice data from our Atlas organization. We will retrieve data for the current month (this is stored in the 'pending' endpoint) which we will 'upsert' into our collection.
To do this create a new function called 'getBillingData' and replace the body as follows:
exports = function() {
  const http = context.services.get("http");
  const orgid = context.values.get("orgid");
  const url = `https://cloud.mongodb.com/api/atlas/v1.0/orgs/${orgid}/invoices/pending`;
  const db = context.values.get("db");
  const collection = context.services.get("mongodb-atlas").db(db).collection("billingdata");

  return context.functions.execute("httpGet", http, url)
    .then(({ body }) => {
      const doc = JSON.parse(body.text());
      if (doc.error) {
        console.error(`Error ${doc.error}: '${doc.detail}'`);
        return null;
      } else {
        console.log(`getBillingData: Upserting new doc with '${doc.lineItems.length}' lineItems`);
        // Upsert keyed on the invoice 'id' so re-running updates the same document
        return collection.updateOne({ "id": doc.id }, doc, { "upsert": true });
      }
    })
    .catch(err => console.error(`Failed to insert billing doc: ${err}`));
};
This function retrieves some of the Stitch Values and the 'http' Service we defined earlier, which it then uses to issue an HTTP GET against the Atlas API endpoint for the specified Atlas Organization to retrieve the latest, aka pending, invoice data. The response from this call is then parsed. If an error is detected we report the error string and return (this can happen, for example, if you are using an unauthorised key or you have mistyped the private key password), otherwise the body is used as a JSON document to upsert into a MongoDB Atlas collection. By upserting the data we can repeatedly call the same API endpoint and update the data on a day-to-day basis.
Note: by replacing the API endpoint above with the specific endpoint for a given month it is possible to prepopulate the database with data from previous months (each month has a unique 'id' field so will end up as a separate MongoDB document). Details on how to retrieve the endpoints for this historic data are left as an exercise for the reader.
You can test this function by clicking the 'Run' button at the bottom of the function editor. All going well it should produce output similar to the following in the 'Result' window:
> logs:
getBillingData: Upserting new doc with 597 lineItems
> result:
{
  "matchedCount": {
    "$numberInt": "0"
  },
  "modifiedCount": {
    "$numberInt": "0"
  },
  "upsertedId": {
    "$oid": "5cd0218114297120022f4a05"
  }
}
> result (JavaScript):
EJSON.parse('{"matchedCount":{"$numberInt":"0"},"modifiedCount":{"$numberInt":"0"},"upsertedId":{"$oid":"5cd0218114297120022f4a05"}}')
Running the function again will result in an update rather than an insert, with a response similar to the following:
> logs:
getBillingData: Upserting new doc with 597 lineItems
> result:
{
  "matchedCount": {
    "$numberInt": "1"
  },
  "modifiedCount": {
    "$numberInt": "1"
  }
}
> result (JavaScript):
EJSON.parse('{"matchedCount":{"$numberInt":"1"},"modifiedCount":{"$numberInt":"1"}}')
Function: getProjectData()
Because the invoice endpoints for the Atlas API reference projects using internal IDs rather than human-readable names, we will also retrieve some Project-related data so we can convert these IDs into something more consumable for our Dashboard. To do so we will create a new function called 'getProjectData' with a function body as follows:
// Helper: upsert one document per project, sequentially, using a
// reduce-driven Promise chain
processDoc = function(doc, collection) {
  return doc.results.reduce(function(sequence, r) {
    return sequence.then(function() {
      const id = r.id;
      const name = r.name;
      return collection.updateOne({"id": id}, {"id": id, "name": name}, { "upsert": true })
        .then(result => {
          const { matchedCount, modifiedCount, upsertedId } = result;
          if (upsertedId) {
            console.log(`Document not found. Inserted a new document with _id: '${upsertedId}' and name '${name}'`);
          } else {
            console.log(`Successfully updated document with id: '${id}' to '${name}'`);
          }
        });
    });
  }, Promise.resolve());
};

exports = function() {
  const http = context.services.get("http");
  const orgid = context.values.get("orgid");
  const url = `https://cloud.mongodb.com/api/atlas/v1.0/orgs/${orgid}/groups`;
  const db = context.values.get("db");
  const collection = context.services.get("mongodb-atlas").db(db).collection("projectdata");

  return context.functions.execute("httpGet", http, url)
    .then(({ body }) => {
      const doc = JSON.parse(body.text());
      if (doc.error) {
        console.error(`Error ${doc.error}: '${doc.detail}'`);
        return null;
      } else {
        console.log(`getProjectData: Upserting ${doc.results.length} project documents`);
        return processDoc(doc, collection);
      }
    })
    .catch(err => console.error(`Failed to insert project docs: ${err}`));
};
This is a little more complex than before. Again we're creating two functions - a main function and an auxiliary helper function. The main function follows the same general structure as before but targets a different REST endpoint. Assuming no errors have occurred, the result of this API call is then passed into the helper function, 'processDoc'. This function uses a JavaScript 'reduce' function and 'Promises' to iterate over the array elements in the 'results' field of the input document and upserts a new document into the 'projectdata' collection for each element. [Note: This function is based on the details presented in this article from Google: JavaScript Promises: an Introduction]
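The reduce/Promise pattern itself can be illustrated with a small standalone sketch (plain Node.js, no Stitch context). Each item's work is chained onto the previous item's promise, so the items are processed strictly in order rather than in parallel:

```javascript
const items = ['a', 'b', 'c'];
const order = [];

// Stand-in for the per-project upsert; here it just records the item
function processItem(item) {
  return Promise.resolve().then(() => { order.push(item); });
}

// Fold the items into a single promise chain; the initial value is an
// already-resolved promise so the first item starts immediately
const done = items.reduce(
  (sequence, item) => sequence.then(() => processItem(item)),
  Promise.resolve()
);

done.then(() => console.log(order.join(' -> '))); // a -> b -> c
```

Starting the chain with 'Promise.resolve()' is the key trick: it gives 'reduce' something to call '.then()' on for the very first element.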
Click the 'Run' button at the bottom after saving these functions to test them. All going well the 'Result' window will show something similar to the following:
> logs:
getProjectData: Upserting 2 project documents
Document not found. Inserted a new document with _id: '5cadfde549358e05f57dcb17' and name 'myProject1'
Document not found. Inserted a new document with _id: '5aaf9ecb9701543aa700152c' and name 'myProject2'
Subsequent attempts to run this function will result in slightly different output as data is updated:
> logs:
getProjectData: Upserting 2 project documents
Successfully updated document with id: '5cadfde549358e05f57dcb17' to 'myProject1'
Successfully updated document with id: '5aaf9ecb9701543aa700152c' to 'myProject2'
Function: refresh()
The last function we will create is one which will simply call our two main functions. We can then use this to hook up to our Atlas Trigger allowing the trigger to update all the data in a single call.
To do this we will create a new function called 'refresh' with the following body:
exports = function() {
  context.functions.execute("getBillingData");
  context.functions.execute("getProjectData");
};
Running this function will result in both functions being executed and the data in the database being updated accordingly.
Creating a Scheduled Trigger
In order to have the data refresh on a daily basis (aligned with the refresh rate of invoice data in MongoDB Atlas) we will create a new Stitch Scheduled Trigger. To do so navigate to 'Triggers' on the left navigation pane in your MongoDB Stitch application and click the green 'Add a Trigger' button.
Select 'Scheduled Triggers' for the 'Trigger Type'. Give the new trigger a name, e.g., 'dailyBillingTrigger', and ensure it is 'Enabled'. For 'Schedule Type' change to 'Advanced' and paste the following into the 'CRON schedule' box:
15 6/24 * * *
Lastly, select 'refresh' as the function to invoke in the 'Function' dropdown before saving the new trigger.
This will result in a daily trigger that fires at 6:15am each morning and will refresh the data in our MongoDB cluster based on the latest invoice data, ensuring our dashboard is always up-to-date. For more details about the format required for the 'CRON schedule' box above see the Wikipedia article on 'cron'.
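For reference, the five fields of the cron expression used above break down as follows:

```
15 6/24 * * *
│  │    │ │ └─ day of week (* = any)
│  │    │ └─── month (* = any)
│  │    └───── day of month (* = any)
│  └────────── hour (6/24: start at 6, step by 24, i.e. 06:00 only)
└───────────── minute (15)
```

The step value of 24 in the hour field simply ensures the trigger fires once per day.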
Concluding Part One
In the first part of this article we looked at how we can use MongoDB Stitch Functions to query the invoice data for a MongoDB Atlas organization using the MongoDB Atlas API and how to store that data in a MongoDB Atlas cluster.
In the second part of this article we will show how to build a Dashboard using MongoDB Charts to visualise this data and provide a live view of the Atlas spend.