Welcome to MongoDB.local NYC 2024!

AI promises to upend how enterprises operate and reach customers… if only they could first find the "On" button. Despite the tremendous promise of AI, most companies still find themselves in the experimentation phase, working through proofs of concept, hampered by unfamiliar technologies that don't work well together. But MongoDB is uniquely positioned to help developers turn all this AI noise into "signal" that benefits customers.

This week at MongoDB.local NYC, thousands of developers and executives—representing Fortune 500 companies and cutting-edge startups—have gathered to discuss and demonstrate the real-world successes they've had building on MongoDB's developer data platform. MongoDB is fast becoming the industry’s go-to database for memory in retrieval-augmented generation (RAG) and agentic systems, offering a unified data model across the entire AI stack. But this isn’t just a technology story, as important as that is. MongoDB now also offers essential programs and services to make AI much more accessible.

In short, MongoDB is taking developers from experimentation to impact, and advancing our long-standing mission of making it easy to work with data.

Demystifying AI

Businesses are eager to adopt generative AI, but they don’t know where to start. The AI landscape is incredibly complex—and seems to get more so by the minute. This complexity, coupled with limited in-house AI expertise and concerns about the performance and security risks of integrating disparate technologies, is keeping too many organizations on the sidelines.

MongoDB can help. To get organizations started, we’re announcing the MongoDB AI Applications Program (MAAP). With MAAP, we give customers blueprints and reference architectures that make it easy to understand how to build AI applications. We also take on the heavy lifting of integrating MongoDB's developer data platform with leading AI partners like Anthropic, Cohere, Fireworks AI, LangChain, LlamaIndex, Nomic, Anyscale, Credal.ai, and Together AI, all running on the cloud provider of your choice. MAAP will be available to customers in early access starting in July.

In addition to MAAP, we’re also introducing two new professional services engagements to help you build AI-powered apps quickly, safely, and cost-effectively:

  • An AI Strategy service that pairs customers with experts to identify the highest-impact AI opportunities and create concrete plans to pursue them.

  • For customers who have already identified use cases to pursue, an AI Accelerator service that brings expert consulting—from solution design through prototyping—to help them take their AI application roadmap from idea to production.

Once developers get to building AI apps, they’ll find that MongoDB allows them to speak the data “language” of AI. Our developer data platform unifies the data types AI applications depend on—source data, vector embeddings, metadata, and generated data—alongside your real-time operational data, and supports a broad range of use cases.
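
To make that concrete, here is a minimal sketch—assuming PyMongo, a placeholder connection string, and hypothetical database, collection, and field names—of how source data, metadata, and a vector embedding can live side by side in a single document:

```python
# Minimal sketch: storing source data, metadata, and a vector embedding together.
# The connection string, names, and embedding values are illustrative placeholders;
# in practice the embedding comes from your embedding model of choice.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
collection = client["support"]["articles"]

collection.insert_one({
    "title": "Resetting your password",           # operational/source data
    "body": "To reset your password, open ...",   # raw text used for RAG
    "metadata": {"product": "web-app", "locale": "en", "updated": "2024-05-01"},
    "embedding": [0.021, -0.107, 0.344],          # vector produced by your embedding model
})
```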

Not only do we give developers the most intuitive way to work with their data, we also keep improving where they can do so. Many developers first experience MongoDB in a local environment before moving to a fully managed cloud service like MongoDB Atlas. So I'm excited to share that we will be introducing full-text search and vector search in MongoDB Community Edition later this year. This will make it even easier for developers to quickly experiment with new features and will streamline end-to-end software development workflows when building AI applications. These new capabilities also support customers who want to run AI-powered apps on devices or on-premises.
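
On the query side, the sketch below—again with hypothetical names, and assuming an Atlas Vector Search index called "vector_index" on the "embedding" field from the previous snippet—shows the aggregation shape used to retrieve the closest documents today; the Community Edition capabilities described above are intended to make this kind of experimentation just as easy in a local environment:

```python
# Rough sketch of a vector search query with Atlas Vector Search, assuming the
# collection and index described above. numCandidates sets the approximate-search
# candidate pool; limit is the number of documents returned.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
collection = client["support"]["articles"]

query_vector = [0.018, -0.092, 0.351]  # embed the user's question with the same model

results = collection.aggregate([
    {"$vectorSearch": {
        "index": "vector_index",
        "path": "embedding",
        "queryVector": query_vector,
        "numCandidates": 100,
        "limit": 5,
    }},
    {"$project": {"title": 1, "body": 1, "score": {"$meta": "vectorSearchScore"}}},
])

for doc in results:
    print(doc["title"], doc["score"])
```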

As customers begin to mature these applications, cost becomes an important consideration. Last year, we introduced dedicated nodes for Atlas Search on AWS. Using dedicated nodes, customers can isolate their vector search workloads and scale them up or down independently of operational workloads, improving performance and ensuring high availability. Because Search Nodes provide workload isolation without data isolation, customers can manage resources efficiently without additional complexity. Today, we’re announcing Atlas Search Nodes on all three cloud providers, which customers can configure programmatically using the Atlas CLI or our Infrastructure-as-Code integrations.

Real-time and highly performant

Though AI rightly claims center stage at MongoDB.local NYC this week, it's not the only way we're helping developers. From real-time fraud detection to predictive maintenance to content summarization, customers need to efficiently process large volumes of high-velocity data from multiple sources. Today, we’re also announcing the general availability of Atlas Stream Processing, the public preview of Atlas Edge Server, and improved performance for time series workloads in MongoDB 8.0. Together, these capabilities enable customers to design applications that solve virtually any business challenge.
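
As one illustration of the kind of work Atlas Stream Processing handles, here is a sketch—with made-up connection, topic, and field names—of an aggregation-style pipeline that reads sensor events from Kafka, computes one-minute averages per machine, and merges the results into an Atlas collection. The pipeline is registered with a stream processing instance rather than executed by this snippet, and exact stage options may differ from your setup:

```python
# Illustrative pipeline shape only; connection names, the Kafka topic, and the
# field names are placeholders.
pipeline = [
    # Continuously read events from a configured Kafka connection
    {"$source": {"connectionName": "factoryKafka", "topic": "sensor-readings"}},
    # Aggregate events in fixed one-minute windows
    {"$tumblingWindow": {
        "interval": {"size": 60, "unit": "second"},
        "pipeline": [
            {"$group": {"_id": "$machineId", "avgTemp": {"$avg": "$temperature"}}}
        ],
    }},
    # Write each window's results into an Atlas collection
    {"$merge": {"into": {"connectionName": "atlasCluster", "db": "iot", "coll": "rollups"}}},
]
```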

These are just a few of the things we're announcing this week. Whether you’re just dipping your toes into the world of generative AI or are well on your way, MongoDB’s developer data platform, strong and diverse network of partners, and proven industry solutions will give you a competitive edge in a fast-moving market. Please take a minute to see what we've built for you, so that you can more easily build for your customers.

Enjoy the conference, and we hope to see you soon!

To see more announcements and get the latest product updates, visit our What’s New page. And head to the MongoDB.local hub to see where we’re stopping along our 2024 world tour.

