Elasticsearch, once the default engine for search and log analytics, is beginning to lose favour. As modern workloads demand real-time insights and AI-native architectures, Elasticsearch’s indexing-heavy, denormalised model is showing its limits.

Initially adopted as a flexible full-text search solution, Elasticsearch is now being reconsidered in favour of systems that offer better write performance, integrated vector search, and lower operational complexity.

From large-scale observability to semantic search and AI-driven retrieval, companies are increasingly turning to platforms like Apache Pinot, PostgreSQL, and MongoDB’s vector engine. The demand for scalable, low-latency, and AI-ready infrastructure is driving this shift.

Beyond Keyword Matching: What Companies Are Building Instead

Darwinbox, the HR tech unicorn serving over 3.3 million users across more than 1,000 enterprise clients, has moved entirely to MongoDB for transactional data, analytics, and search. With AI now central to its platform, the company transitioned from Elasticsearch to MongoDB’s vector search capabilities to support intelligent agents and semantic services at scale.

Speaking at the MongoDB.local event in Bengaluru, Prithvi Raju Alluri, vice president of engineering at Darwinbox, said, “Traditionally, we used to use Elasticsearch for search. That was an index-based search. We have switched to MongoDB, and anything that is there in the system is available as an embedding.”

This change enabled features such as semantic stack ranking, AI-assisted candidate shortlisting, and generative job descriptions. Darwinbox uses MongoDB’s native vector indexing to support agent memory and semantic retrieval—resulting in a search experience that understands user intent, not just keywords.

“When MongoDB launched embeddings, we thought it was an obvious choice. Why? Because 100% of the data is sitting in MongoDB,” Alluri said. “Why should I create a pipeline, create embeddings of that, store it somewhere and process it?”

Darwinbox has since built a pipeline where any data ingested into MongoDB is immediately available as an embedding in MongoDB Vector. Within six months, the company developed over 48 AI agents to automate HR tasks such as interview scheduling and leave planning.
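Darwinbox has not published its implementation, but MongoDB Atlas Vector Search queries follow a documented shape: an aggregation pipeline whose first stage is `$vectorSearch`. A minimal sketch of the kind of semantic retrieval described above; the collection field names, index name, and toy 3-dimensional query vector are illustrative assumptions, not Darwinbox's actual schema:

```python
# Sketch of a semantic retrieval query for MongoDB Atlas Vector Search.
# Index/field names and the tiny query vector are illustrative only; real
# embeddings from a model would typically have hundreds of dimensions.

def build_vector_search_pipeline(query_vector, limit=5):
    """Build an aggregation pipeline using the $vectorSearch stage."""
    return [
        {
            "$vectorSearch": {
                "index": "candidate_embeddings_idx",  # assumed index name
                "path": "embedding",                  # field holding the vector
                "queryVector": query_vector,
                "numCandidates": limit * 20,          # oversample for recall
                "limit": limit,
            }
        },
        # Return each match with its similarity score.
        {"$project": {"name": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

pipeline = build_vector_search_pipeline([0.12, -0.08, 0.44])
# With pymongo this would execute as: db.candidates.aggregate(pipeline)
```

Because the embeddings live beside the operational data, the same collection serves both transactional reads and semantic retrieval, which is the simplification Alluri describes.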

“We serve 1,100 customers, not one. Everything we build has to scale to thousands of tenants, and MongoDB has helped us keep that simplicity while delivering intelligence,” Alluri said.

Uber also moved away from Elasticsearch in 2023 when building Healthline, its crash analytics platform. Designed to process 1,500 mobile crashes per second and handle 36 terabytes of data daily, the platform required fast root-cause aggregation across multiple dimensions. Uber replaced Elasticsearch with Apache Pinot, supported by Kafka, Flink, and Spark, due to Pinot’s lower latency and stronger performance across long time ranges.

Pinot’s hybrid table format and real-time filtering proved ideal for large-scale diagnostics, although Uber had to develop custom tools for handling nested data and replication.
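Uber's internal schema is not public, but the root-cause aggregation described above maps naturally onto Pinot's SQL dialect, which clients submit to a broker's `/query/sql` endpoint as a JSON body. A hedged sketch; the table and column names are assumptions for illustration:

```python
import json

# Hypothetical crash-analytics rollup; table and column names are
# illustrative, not Uber's actual Healthline schema.
def build_crash_rollup_query(app_version, start_ms, end_ms):
    """Aggregate crash counts by device and OS version over a time range."""
    sql = (
        "SELECT deviceModel, osVersion, COUNT(*) AS crashes "
        "FROM mobile_crashes "
        f"WHERE appVersion = '{app_version}' "
        f"AND crashTimeMs BETWEEN {start_ms} AND {end_ms} "
        "GROUP BY deviceModel, osVersion "
        "ORDER BY crashes DESC LIMIT 10"
    )
    # Pinot brokers accept queries as POST bodies of the form {"sql": ...}.
    return json.dumps({"sql": sql})

payload = build_crash_rollup_query("8.4.1", 1700000000000, 1700086400000)
```

Pinot's hybrid tables let a query like this span both a real-time segment fed from Kafka and historical offline segments, which is what makes long-time-range diagnostics fast.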

Instacart, on the other hand, faced challenges related to frequent updates in inventory and pricing. Its Elasticsearch-based system struggled with high-frequency write loads and suffered from the overhead of a denormalised model.

The company migrated to a sharded PostgreSQL setup with pgvector to support both keyword and semantic search. “A normalised data model allowed us to have a 10x reduction in write workload,” Instacart said.

With vector embeddings stored directly in SQL tables, Instacart integrated machine learning models more efficiently. The result was a faster, more cost-effective search experience.

In A/B tests comparing the previous FAISS-based semantic layer with pgvector, Instacart recorded a 6% drop in zero-result searches and an uplift in revenue. The new setup also reduced the operational burden of maintaining separate systems for vector and keyword queries.
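Instacart has not published its exact queries, but pgvector's appeal is that keyword and semantic conditions can share one SQL statement, with the `<=>` operator ranking rows by cosine distance to a query embedding. A minimal sketch under assumed table and column names:

```python
# Hypothetical hybrid search over a normalised products table that has a
# pgvector "embedding" column. Table/column names are illustrative; "<=>"
# is pgvector's cosine-distance operator.

def build_hybrid_search_sql():
    """One statement combining a lexical filter with semantic ranking."""
    return (
        "SELECT id, name "
        "FROM products "
        "WHERE name ILIKE %(keyword)s "          # keyword filter
        "ORDER BY embedding <=> %(query_vec)s "  # vector similarity ranking
        "LIMIT 20"
    )

sql = build_hybrid_search_sql()
# With psycopg this would run roughly as:
#   cur.execute(sql, {"keyword": "%oat milk%", "query_vec": vec})
```

Running both retrieval modes in one engine is what removed the separate FAISS layer, and with it the duplicated infrastructure for vector and keyword queries.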

Search Is Becoming an AI Problem

These transitions reflect a broader trend: search is no longer just a standalone function. It now underpins how platforms deliver recommendations, extract insights, and power user interactions. Traditional engines, built for logs and static indexes, often fall short when paired with AI workflows.

For many organisations, Elasticsearch is no longer suited to AI-centric tasks. Tools like MongoDB with embedding-aware search, PostgreSQL with normalised data models, and Pinot with fast analytical capabilities are emerging as better-aligned alternatives.

The post Why Companies Are Moving Away from Elasticsearch appeared first on Analytics India Magazine.