OpenAI last week released a slew of updates. Among them, function calling in the Chat Completions API was the most important one. The feature lets developers connect external APIs and tools to OpenAI’s models. This, surprisingly, bears a striking resemblance to Langchain, which performs a similar role, raising the question of its relevance when building autonomous agents.

Machine learning expert Santiago said that using function calling, we can define custom functions the model can use to answer questions. “For example, we can now have GPT solve queries requiring real-time data, like stock prices, weather information, or sports results,” he added. He said that this new ‘function calling’ feature was previously only possible using Langchain. But now it is natively available as part of the API.

In other words, with this new update, users now have the choice to bypass the third party (i.e. Langchain) and build directly on OpenAI’s API – which is available for its gpt-3.5-turbo and gpt-4 models.
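To make this concrete, here is a minimal sketch of defining a custom function for the model to call, in the shape of the June-2023 Chat Completions API. The function name `get_stock_price` and its parameters are hypothetical, chosen to match Santiago’s stock-price example; the commented-out request shows how the schema would be sent.

```python
# A minimal sketch of a custom function definition for OpenAI function
# calling. get_stock_price and its parameters are hypothetical; the
# request shape follows the Chat Completions API as released in June 2023.
import json

# The model never runs our code; it only sees this JSON Schema and, when
# appropriate, replies with the function name plus JSON arguments.
functions = [
    {
        "name": "get_stock_price",
        "description": "Get the latest trading price for a stock ticker",
        "parameters": {
            "type": "object",
            "properties": {
                "ticker": {
                    "type": "string",
                    "description": "Stock symbol, e.g. AAPL",
                },
            },
            "required": ["ticker"],
        },
    }
]

# Passing the schema alongside the conversation (requires an API key):
# import openai
# response = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo-0613",
#     messages=[{"role": "user", "content": "What is AAPL trading at?"}],
#     functions=functions,
#     function_call="auto",  # let the model decide whether to call it
# )

print(json.dumps(functions[0], indent=2))
```

The key point is that the function definition is plain JSON Schema, so the model returns structured arguments rather than freeform text.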

Relevance of Langchain 

LLM expert Derek Haynes said: “Pretty stoked with OpenAI’s function-calling. My previously massive ReAct-based agent prompt has been reduced to this. It’s a fine-tuned GPT for agents … makes me wonder if Langchain is worth the complexity?”

In his blog post, Haynes explained the need for Langchain, and the role of the ‘function calling’ update introduced by OpenAI. He said that ReAct presents an approach where LLMs like GPT-3.5-turbo and GPT-4 can use external tools to perform tasks, observe the outcomes, and use those outcomes to inform the next actions.

For example, a model can use Zapier to search for unopened emails and OpenAI’s models to summarise them. However, using this approach requires users to do considerable work, including listing available tools in a format that the LLM understands, applying custom code to parse the response and extract useful data, and guiding the LLM to perform tasks accurately. 

Haynes believes that this process can result in complex, brittle string parsing code. He said that is where Langchain comes into the picture, bringing structure to this process, but this in turn can make the code less readable. 
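The brittle string parsing Haynes describes might look something like the sketch below: a ReAct-style agent must extract the tool name and input from freeform model text with a regex. The `Action:` / `Action Input:` format and the tool name are assumptions based on common ReAct prompts, not code from Haynes’ post.

```python
# Illustrative sketch of the string parsing a ReAct-style agent needs
# when the LLM replies in freeform text. The Action/Action Input format
# and the zapier_email_search tool name are illustrative assumptions.
import re

def parse_react_step(llm_output: str):
    """Extract the tool name and its input from freeform LLM text."""
    match = re.search(
        r"Action:\s*(?P<tool>.+?)\s*\nAction Input:\s*(?P<input>.+)",
        llm_output,
    )
    if match is None:
        # Any drift in the model's wording breaks the agent loop here.
        raise ValueError("Could not parse an action from the model output")
    return match.group("tool").strip(), match.group("input").strip()

sample = (
    "Thought: I should look up unopened emails.\n"
    "Action: zapier_email_search\n"
    "Action Input: is:unread"
)
tool, tool_input = parse_react_step(sample)
print(tool, tool_input)
```

If the model phrases its answer slightly differently, the regex fails, which is exactly the fragility that structured function calling is meant to remove.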

One of the users who goes by the name ‘fbrncci’ on HackerNews said that the deeper and more complex the application becomes, the more of a risk Langchain seems to become to keeping it maintainable. 

Enter function calling 

On its website, OpenAI provided examples and applications of the update.

“Function calling within your own applications is one of the interesting waves that we’re seeing with language models, which is increasing their capabilities outside of just normal text generation,” said AI expert Greg Kamradt, in an explainer video – Function Calling via ChatGPT API – First Look With LangChain. 

He said a lot of systems need to make decisions as well, and when you use a language model as a reasoning engine, freeform text isn’t the best way to talk to other computers. It’s better if you can do it in a JSON format, and that’s what function calling is a step towards.

Here’s a quick glimpse of how function calling works.
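The round trip can be sketched as follows. The assistant message below is hardcoded to stand in for a real API response (so the example runs without an API key), and `get_weather` is a hypothetical local function; the message shapes follow the June-2023 Chat Completions API.

```python
# Sketch of the function-calling round trip. The assistant message is a
# simulated API response, and get_weather is a hypothetical stand-in for
# a real weather API; message shapes follow the June 2023 API.
import json

def get_weather(city: str) -> dict:
    # Stand-in for a real weather-service call.
    return {"city": city, "temp_c": 31, "conditions": "sunny"}

# 1. Given a function schema, the model replies with a function_call
#    instead of freeform text (simulated here):
assistant_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_weather",
        "arguments": '{"city": "Bengaluru"}',
    },
}

# 2. The application parses the structured arguments and runs the function:
call = assistant_message["function_call"]
args = json.loads(call["arguments"])  # JSON, not brittle string parsing
result = {"get_weather": get_weather}[call["name"]](**args)

# 3. The result goes back to the model as a "function"-role message so it
#    can compose a natural-language answer in a follow-up request:
followup = {
    "role": "function",
    "name": call["name"],
    "content": json.dumps(result),
}
print(followup["content"])
```

The application stays in control of execution: the model only names the function and supplies JSON arguments, and the app decides whether and how to run it.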

On a sprint mode 

As soon as OpenAI released this new update, Langchain was quick to respond, shipping support an hour later. While users are free to choose between the two, Langchain may still be easier to build with, given its variety of features and plugins. And OpenAI isn’t the only one offering these services, with Huggingface keeping up as well.

But one question remains: did OpenAI steal Langchain’s limelight? “Yes, OpenAI took this from Langchain,” said Santiago. He, however, said that this will not kill Langchain. “There’s much more functionality there,” he added, even as he questioned OpenAI’s move of siphoning the feature off from Langchain and integrating it natively into its API.

The post Is This the End of Langchain? appeared first on Analytics India Magazine.