Observability Tools can Now Monitor LLMs Along with DevOps Environments
For cloud-native businesses, observability platforms are essential for gaining insights into application performance. DevOps practices thrive on the foundation of observability, empowering teams to iterate rapidly and deliver high-quality applications.
The Indian startup ecosystem is rich with cloud-native businesses. New Relic, one of the leading observability platforms in the country, serves close to 12,000 customers there.
“I’ve visited India multiple times, and the fact that you can get things delivered in under 10 minutes is unparalleled. Trust me, I’ve been to many places, and this does not happen anywhere.
“Therefore, the demand for applications and performance skyrockets here. A lot of these companies – the Swiggies and Ola Cabs of the world – use our observability platform,” Ashan Willy, CEO at New Relic, told AIM in an exclusive interview.
New Relic, with over 80,000 customers worldwide, caters to startups as well as large enterprises. Willy believes AI will bring a transformational change in the observability space. “I think at every inflexion point, observability took a step forward, and it will be the same with AI,” he said.
LLM Observability is Here
New Relic’s comprehensive platform offers over 30 capabilities, delivering a seamless and connected experience throughout the various layers of the technology stack at every stage of the software lifecycle.
The company is now adding generative AI to the mix.
Last year, the company introduced New Relic Grok, which, according to the company, is the world’s first generative AI assistant designed specifically for observability.
Willy believes that most companies today leverage LLMs in some way, and hence, “Observability offers the chance to monitor applications comprehensively, providing insights into both the application’s end-to-end performance as well as the LLMs.”
For cloud-native businesses, LLMs are often integrated with their existing operations. New Relic AI monitors AI-specific metrics like token usage, costs, and response quality, as well as integrates with traditional application performance monitoring.
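The AI-specific metrics described above can be illustrated with a small sketch. The `LLMUsageTracker` class and the per-token rates below are hypothetical, for illustration only, not New Relic's API: the point is the raw numbers — tokens and estimated cost per call — that an observability backend would collect and aggregate.

```python
from dataclasses import dataclass, field

@dataclass
class LLMUsageTracker:
    """Toy tracker for LLM usage metrics (hypothetical, not New Relic's API)."""
    # Hypothetical per-1k-token rates in USD; real rates vary by model.
    prompt_rate: float = 0.01
    completion_rate: float = 0.03
    records: list = field(default_factory=list)

    def record(self, model: str, prompt_tokens: int, completion_tokens: int) -> float:
        """Record one LLM call and return its estimated cost."""
        cost = (prompt_tokens * self.prompt_rate
                + completion_tokens * self.completion_rate) / 1000
        self.records.append({"model": model,
                             "prompt_tokens": prompt_tokens,
                             "completion_tokens": completion_tokens,
                             "cost": cost})
        return cost

    def totals(self) -> dict:
        """Aggregate token and cost totals across all recorded calls."""
        return {
            "calls": len(self.records),
            "tokens": sum(r["prompt_tokens"] + r["completion_tokens"]
                          for r in self.records),
            "cost": sum(r["cost"] for r in self.records),
        }

tracker = LLMUsageTracker()
tracker.record("example-model", prompt_tokens=500, completion_tokens=200)
tracker.record("example-model", prompt_tokens=1200, completion_tokens=800)
print(tracker.totals())
```

A real platform would export these counters to a backend and correlate them with request traces, which is what ties LLM monitoring back into traditional APM.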
Other players in the space, such as Datadog and Dynatrace, have also introduced LLM observability solutions as part of their platforms.
Moreover, generative AI will enhance predictive analysis and reduce false positives, leading to more accurate and actionable insights.
“One common challenge with AI/ML is false positives, leading to unwanted alerts and actions based on inaccurate information. However, with generative AI providing contextual insights, predictive analysis becomes more feasible.
“Today, nobody can drive responses on top of the AI information yet, but that’s going to change.”
AI is Democratising Observability
New Relic Grok allows users to ask questions about their systems in natural language and get insights. Willy believes this will democratise observability in a big way.
“Why shouldn’t you have more of a Google-like UI where you come in and ask a question in plain English rather than learning some sort of crazy language, right?
“So I think the first thing it does is democratise observability. Under the New Relic AI umbrella, we released something called Grok, an AI assistant before [Elon] Musk stole the name,” Willy laughed.
Moreover, AI also presents a huge opportunity for companies in this space to grow. “If there are roughly 30 million developers in the world, only about 2 million of them deal with observability on a daily basis. This presents us with a big opportunity,” he said.
“How is Jensen Huang going to make his next trillion dollars? It’s not by selling chips to the data centre, the hyperscalers, because they are going to white label that. So, he’s got to move that stuff to the edge. You want to move it close to the edge, which means there’s more things to observe,” Willy pointed out.
The Industry is Asking for a Consumption Pricing Model
In recent years, as cloud costs rose, so did the expenses associated with observability, presenting a significant challenge for numerous cloud-native enterprises.
To address this, New Relic has introduced a consumption-based (compute) pricing model, under which customers pay only for what they consume.
“We’ve introduced things like telemetry data, pipelines, all of that, and cost management tools. I think that’s a big thing. So how do I observe everything I have in a way that makes it effective for me?
“Hence, we have introduced the consumption-based pricing model which customers love, but the investors, not so much. But I think over time, if customers like it, investors will come to like it as well.”
Willy also points out that New Relic is the only company in the core observability space providing a consumption-based pricing model.
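The mechanics of "pay only for what you consume" are simple to sketch. The function below is a hypothetical illustration — the rates, free tier, and billing dimensions are invented for the example, not New Relic's actual price list — showing a bill driven by data ingested and active users rather than a flat per-host licence.

```python
def consumption_bill(gb_ingested: float, full_users: int,
                     rate_per_gb: float = 0.30,      # hypothetical $/GB
                     rate_per_user: float = 99.0,    # hypothetical $/user/month
                     free_gb: float = 100.0) -> float:
    """Return a monthly bill under a simple consumption pricing model."""
    # Only ingest beyond the free tier is billable; users are billed flat.
    billable_gb = max(0.0, gb_ingested - free_gb)
    return billable_gb * rate_per_gb + full_users * rate_per_user

print(consumption_bill(gb_ingested=600, full_users=3))
```

Under this kind of model the bill scales with usage, which is why cost-management and telemetry-pipeline tooling (to control what gets ingested) matters as much as the rate itself.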
New Relic is Embracing Open Source
Open source continues to hold sway across many domains today, including observability.
New Relic, too, is embracing open source. “A lot of our customers are asking us to embrace open source and open telemetry, which is a portion of open source that is very relevant for the observability space. We’re embracing that in a big way,” Willy said.
Modern cloud-native applications are distributed, making the capture and export of telemetry data complicated. OpenTelemetry standardises telemetry data generation, collection, and exportation.
With vendor-neutral APIs, SDKs, and tools, this open-source framework enables organisations to instrument applications universally, promoting data delivery to diverse observability backends.
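That vendor-neutral design — instrument once, export anywhere — can be sketched without the real SDK. The classes below are a conceptual illustration of the pattern, not the actual OpenTelemetry API: application code creates spans through a neutral `Tracer`, and any backend plugs in by implementing a single `export()` method.

```python
import time
from typing import Protocol

class SpanExporter(Protocol):
    """Vendor-neutral exporter interface: any backend implements export()."""
    def export(self, span: dict) -> None: ...

class ConsoleExporter:
    """Stand-in backend that collects and prints spans; a vendor would ship its own."""
    def __init__(self):
        self.exported = []
    def export(self, span: dict) -> None:
        self.exported.append(span)
        print(f"{span['name']}: {span['duration_ms']:.1f} ms")

class _Span:
    """A timed unit of work, exported on exit."""
    def __init__(self, name: str, exporter: SpanExporter):
        self.name, self.exporter = name, exporter
    def __enter__(self):
        self.start = time.perf_counter()
        return self
    def __exit__(self, *exc):
        self.exporter.export({
            "name": self.name,
            "duration_ms": (time.perf_counter() - self.start) * 1000,
        })

class Tracer:
    """Instrumentation side: code creates spans without knowing the backend."""
    def __init__(self, exporter: SpanExporter):
        self.exporter = exporter
    def span(self, name: str) -> _Span:
        return _Span(name, self.exporter)

exporter = ConsoleExporter()
tracer = Tracer(exporter)
with tracer.span("checkout"):
    time.sleep(0.01)  # simulated application work
```

Swapping `ConsoleExporter` for a vendor's exporter changes where the data goes without touching the instrumented code — the portability that makes the standard attractive to customers.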
Willy’s statement holds true: New Relic is one of the major contributors to the OpenTelemetry project, and the company has open-sourced many of its agents, integrations, and SDKs under open-source licences.