Google AI has recently open-sourced a new pre-training technique for natural language processing (NLP): BERT (Bidirectional Encoder Representations from Transformers). The technique helps bridge the gap between the small labeled datasets available for most NLP tasks and the much larger amounts of data generally needed to train modern deep learning-based NLP models.
- Pre-trains on millions or billions of words of unannotated text from the web
- Reduces the need for human-labeled training data
- Helps close the gap in available data for NLP tasks
- Allows for improved accuracy with deep learning-based NLP models
BERT pre-trains a deep bidirectional Transformer by masking out words in unlabeled text and training the model to predict them from the context on both sides. The pre-trained model can then be fine-tuned with a relatively small amount of labeled data, yielding more accurate systems for tasks such as question answering and sentence classification.
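To make the masking step concrete, here is a minimal, dependency-free sketch of BERT-style token masking as described in the BERT paper: roughly 15% of tokens are selected for prediction, and of those, 80% are replaced with a `[MASK]` token, 10% with a random token, and 10% are left unchanged. The function name and the toy vocabulary handling are illustrative, not part of any released API.

```python
import random

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style masking sketch: select ~mask_prob of tokens for
    prediction; of those, 80% become [MASK], 10% become a random
    token, 10% stay unchanged. Returns (masked_tokens, labels),
    where labels hold the original token at selected positions and
    None elsewhere (so the loss is computed only at masked spots)."""
    rng = random.Random(seed)
    vocab = sorted(set(tokens))  # toy vocabulary for the random-replacement case
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # the model must predict the original token here
            roll = rng.random()
            if roll < 0.8:
                masked.append("[MASK]")        # 80%: replace with mask token
            elif roll < 0.9:
                masked.append(rng.choice(vocab))  # 10%: replace with random token
            else:
                masked.append(tok)             # 10%: keep the original token
        else:
            labels.append(None)
            masked.append(tok)
    return masked, labels
```

Keeping some selected tokens unchanged (and substituting random ones for others) prevents the model from only learning useful representations at `[MASK]` positions, since `[MASK]` never appears at fine-tuning time.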