Major retail brands have long used various forms of AI, such as statistical analysis and machine learning models, to better serve their customers. But one key channel, with its high barriers to entry, has been slower to embrace the technology. By connecting large and small brands with customers, e-commerce marketplaces such as Amazon, Mercado Libre, and Shopify are among the fastest-growing retail routes to market. Since 2016, GoBots has been working to extend the benefits of AI to any retailer on any marketplace. It uses AI, analytics, and MongoDB Atlas to make e-commerce easier, more convenient, and smarter for brands serving Latin America.
“We are building an AI-driven customer service platform that revolutionizes e-commerce experiences,” says Victor Hochgreb, Co-Founder and CEO of GoBots. “Our solution makes the benefits of AI available to any retailer, whether large or small. With our GoBots natural language understanding (NLU) model, retailers automate customer interactions such as answering questions and resolving issues through intelligent assistants. At the same time, they leverage data analytics to offer personalized customer experiences.”
Hochgreb goes on to say, “GoBots increases engagement and conversion rates for over 600 clients across Latin America, including Adidas, Bosch, Canon, Chevrolet, Dell, Electrolux, Hering, HP, Nike, and Samsung.”
Check out our AI resource page to learn more about building AI-powered apps with MongoDB.
Exploring GoBots' AI stack
GoBots’ custom NLU models are built using the Rasa framework. Hochgreb says, “We have a neural network trained on over 150 million question-answer examples and more than 50 bots — specialists in different segments — to understand more specific questions.”
Models are fine-tuned with data from each retailer's own product catalog and website corpus. The model runtime is powered by a PyTorch microservice on Google Cloud. The larger GoBots platform is built with Kotlin and orchestrated by Kubernetes, giving the company cloud freedom as its business expands and evolves.
The GoBots AI assistants kick into action as soon as a customer asks a question on the marketplace site, with the questions stored in MongoDB Atlas. GoBots’ natural language models are programmatically called via a REST API to perform tasks like named entity recognition (NER), user intent detection, and question-answer generation, with all inferences also stored in MongoDB. If the models can generate an answer with high confidence, the GoBots service responds directly to the customer in real time. For low-confidence responses, the question is flagged to a customer service representative, who receives a pre-generated suggested response that they can verify before replying to the customer.
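To make that flow concrete, here is a minimal Kotlin sketch of the confidence-based routing step, written against the MongoDB Kotlin coroutine driver. The NLU endpoint URL, database and collection names, sample question, and the 0.85 confidence threshold are illustrative assumptions for the sketch, not GoBots' actual code.

```kotlin
import com.mongodb.kotlin.client.coroutine.MongoClient
import kotlinx.coroutines.runBlocking
import org.bson.Document
import org.bson.types.ObjectId
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

const val CONFIDENCE_THRESHOLD = 0.85  // assumed cut-off for automatic replies

fun main() = runBlocking {
    val http = HttpClient.newHttpClient()
    val mongo = MongoClient.create(System.getenv("MONGODB_ATLAS_URI"))
    val questions = mongo.getDatabase("gobots").getCollection<Document>("questions")

    // 1. A question arrives from the marketplace and is stored in Atlas.
    val question = Document("_id", ObjectId())
        .append("marketplace", "mercadolibre")
        .append("text", "Does this notebook ship with a Portuguese keyboard?")
    questions.insertOne(question)

    // 2. Call the NLU model runtime over REST (hypothetical endpoint) for NER,
    //    intent detection, and question-answer generation.
    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://nlu.internal.example/v1/answer"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(question.toJson()))
        .build()
    val inference = Document.parse(http.send(request, HttpResponse.BodyHandlers.ofString()).body())

    // 3. Persist the inference alongside the question, then route on confidence.
    questions.updateOne(
        Document("_id", question["_id"]),
        Document("\$set", Document("inference", inference))
    )
    if ((inference.getDouble("confidence") ?: 0.0) >= CONFIDENCE_THRESHOLD) {
        println("Auto-reply sent: ${inference.getString("answer")}")
    } else {
        println("Flagged for a rep with suggested reply: ${inference.getString("answer")}")
    }
}
```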
Increasingly, the company’s engineers are also evaluating the capabilities of large language models (LLMs) to respond to customer questions, testing both commercial models from OpenAI and open source models such as Llama-2 and Mixtral hosted on Hugging Face.
With all question-answer pairs from the different models written to the MongoDB Atlas database, the data is used to further tune the natural language models while also guiding model evaluations. The company has also recently started using Atlas Vector Search to identify and retrieve semantically similar answers to past questions. The search results power a co-pilot-like experience for customer service representatives and provide in-context training to its fleet of LLMs.
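The retrieval step can be sketched with an Atlas Vector Search aggregation. In the hedged Kotlin example below, the "qa_vector_index" index, the "embedding" field, the collection name, and the stand-in embedQuestion() helper are assumptions for illustration; in production the embedding would come from the same model used to index past answers.

```kotlin
import com.mongodb.kotlin.client.coroutine.MongoClient
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking
import org.bson.Document

// Stand-in embedding: a real system would call the embedding model that was used
// to vectorize past question-answer pairs.
fun embedQuestion(text: String): List<Double> =
    List(384) { i -> ((text.hashCode() * (i + 1)) % 1000) / 1000.0 }

fun main() = runBlocking {
    val mongo = MongoClient.create(System.getenv("MONGODB_ATLAS_URI"))
    val answers = mongo.getDatabase("gobots").getCollection<Document>("answered_questions")

    val queryVector = embedQuestion("Does this notebook ship with a Portuguese keyboard?")

    // Retrieve the five most semantically similar past answers with Atlas Vector Search.
    val pipeline = listOf(
        Document("\$vectorSearch", Document()
            .append("index", "qa_vector_index")
            .append("path", "embedding")
            .append("queryVector", queryVector)
            .append("numCandidates", 100)
            .append("limit", 5)),
        Document("\$project", Document()
            .append("question", 1)
            .append("answer", 1)
            .append("score", Document("\$meta", "vectorSearchScore")))
    )

    val similar = answers.aggregate(pipeline).toList()
    similar.forEach { println("${it.getDouble("score")}  ${it.getString("answer")}") }
}
```

Results like these can then be surfaced to customer service representatives as suggested replies, or injected as in-context examples when prompting an LLM.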
Having our source data and metadata stored and synced side by side with our vector embeddings dramatically accelerates how quickly my developers build with AI. It also improves the quality of the outputs we return to customers, driving higher conversions and customer satisfaction.
Victor Hochgreb, Co-Founder and CEO of GoBots
Why MongoDB?
With the power of MongoDB’s developer data platform and the flexibility of its document model, GoBots builds higher-performing AI-powered applications faster:
- MongoDB Atlas provides a single data platform that serves multiple operational and AI use cases. This includes user data and product catalogs, a store for AI model inferences and for the outputs of multiple AI models used in experimentation and evaluation, a data source for fine-tuning models, and vector search. The company is evaluating the use of Atlas Triggers for invoking AI model API calls in an event-driven manner as the underlying data changes.
- The field of natural language processing is rapidly progressing, with new AI models released all the time. Finding the right AI model for a use case that balances the performance-price tradeoff requires experimentation on historical data. The flexibility provided by MongoDB’s document model allows the development team to continually enrich historical questions with outputs generated by different models and compare the results. This means that they are not blocked behind complex schema changes that would otherwise slow down the pace of harnessing new data in their models for training and inference.
- The question-answer pairs output by the company’s NLU models and LLMs are complex data structures with many nested entities and arrays. Being able to persist these directly to the database without first having to transform them into a tabular structure improves developer productivity and reduces application latency, as the sketch after this list illustrates.
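As a rough illustration of the last two points, the sketch below models a question with nested model outputs as Kotlin data classes, persists it to Atlas as-is, and later enriches the same document with a new model's output for side-by-side evaluation. Collection names, field names, and model identifiers are assumptions for the example, not GoBots' actual schema.

```kotlin
import com.mongodb.client.model.Filters.eq
import com.mongodb.client.model.Updates.push
import com.mongodb.kotlin.client.coroutine.MongoClient
import kotlinx.coroutines.runBlocking
import org.bson.codecs.pojo.annotations.BsonId
import org.bson.types.ObjectId

// Nested entities and arrays map directly onto a single document: no tabular flattening.
data class Entity(val label: String, val value: String)
data class ModelOutput(
    val model: String,                 // e.g. "gobots-nlu-v5", "gpt-4", "llama-2-70b" (illustrative)
    val answer: String,
    val confidence: Double,
    val entities: List<Entity> = emptyList()
)
data class Question(
    @BsonId val id: ObjectId = ObjectId(),
    val marketplace: String,
    val text: String,
    val outputs: List<ModelOutput> = emptyList()   // grows as new models are evaluated
)

fun main() = runBlocking {
    val mongo = MongoClient.create(System.getenv("MONGODB_ATLAS_URI"))
    val questions = mongo.getDatabase("gobots").getCollection<Question>("questions")

    // Persist the nested structure exactly as the application sees it.
    val q = Question(
        marketplace = "mercadolibre",
        text = "Is this blender compatible with 220V?",
        outputs = listOf(ModelOutput("gobots-nlu-v5", "Yes, it supports 110V and 220V.", 0.93))
    )
    questions.insertOne(q)

    // Later, append a new model's output to the same document for comparison --
    // no schema migration required.
    questions.updateOne(
        eq("_id", q.id),
        push(Question::outputs.name, ModelOutput("llama-2-70b", "Yes, it is bivolt.", 0.88))
    )
}
```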
This flexibility was behind the decision to use MongoDB from the outset. “We were building fast, continually testing new features to scale what worked and kill what didn’t,” says Hochgreb. “Only MongoDB provided the developer ease of use and flexibility to meet my time-to-market demands.”
The company initially ran MongoDB itself before upgrading to MongoDB Atlas in 2019. “The company was growing fast and I wanted to focus my engineering team on building, not operating. That is exactly what Atlas and its managed service enabled us to do,” says Hochgreb. “With Atlas we were able to maintain high uptime in the face of constant service scaling, with deep monitoring and observability into our platform. In the first year of running in MongoDB Atlas we were able to avoid hiring a full-time infrastructure engineer, and instead redirected the resource into my development team, building new customer features.”
GoBots has been able to expand MongoDB usage to deliver even higher value features in its platform over time. It uses MongoDB’s app-driven intelligence to power dashboards that help retailers track questions and complaints, identify opportunities, measure marketing activities, and optimize the customer journey across the marketplace. Its adoption of Atlas Vector Search is the latest example of how the company is expanding application functionality without losing the benefits of building and running on the single, unified Atlas developer data platform.
The results and what's next
By working with hundreds of customers running on Latin America’s largest marketplaces, GoBots has built a compelling track record of achievement:
By using GoBots AI for e-commerce with MongoDB Atlas, customers have grown sales conversions by 40% and reduced customer response times by 72%.
Looking forward, GoBots’ adoption of generative AI and vector search will further drive results across the retail marketplace experience. Being part of MongoDB’s AI Innovators Program provides GoBots with free Atlas credits along with access to live technical reviews, helping the company de-risk AI developments.
If you are building your own AI-powered apps, apply for the program and take MongoDB Atlas for a spin. It's the quickest way to see why retailers around the world use MongoDB.