
01 Oct

Google’s NLP-Powered Pretraining Method ALBERT Is Leaner & Meaner


Natural Language Processing (NLP) is one of the most diverse domains in emerging tech. Last year, search engine giant Google open-sourced an NLP pre-training technique known as Bidirectional Encoder Representations from Transformers (BERT). The model enabled researchers to train a number of state-of-the-art models in about 30 minutes on a single Cloud TPU,…

The post Google’s NLP-Powered Pretraining Method ALBERT Is Leaner & Meaner appeared first on Analytics India Magazine.