BERT has set a new benchmark for NLP tasks, and this has been well documented over the past six months. Bidirectional Encoder Representations from Transformers, or BERT, which was open sourced last year, offered new ground for tackling the intricacies of language understanding. BERT used WordPiece embeddings with a 30,000-token vocabulary.
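
As a quick illustration of that detail, the sketch below loads BERT's WordPiece tokenizer via the Hugging Face `transformers` library (an assumption on our part; the article itself does not show code) and demonstrates how rare words are split into subword pieces:

```python
# A minimal sketch of BERT's WordPiece tokenization, assuming the
# Hugging Face `transformers` library is installed.
from transformers import BertTokenizer

# Load the WordPiece tokenizer for bert-base-uncased, whose vocabulary
# is roughly the 30,000 tokens mentioned above (30,522 in practice).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

print(tokenizer.vocab_size)  # 30522

# WordPiece breaks words not in the vocabulary into subword units,
# marking word-internal pieces with a "##" prefix.
print(tokenizer.tokenize("embeddings"))  # ['em', '##bed', '##ding', '##s']
```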
