Jailbreaking works by prompting the AI agent to adopt a specific persona; by imposing hard rules on that character, a user can trick the model into breaking its own rules.