123b is a distinctive approach to language modeling. The system uses a transformer-based architecture to generate coherent, grammatical text. Engineers at Google DeepMind designed 123b as a robust resource for a wide range of NLP tasks. Use cases for 123b include text summarization, while adapting it to new domains requires extensive datasets. A demo showcases the performance of 123b.
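
To make the summarization use case concrete, here is a minimal sketch using the Hugging Face transformers summarization pipeline. The article does not give a public checkpoint identifier for 123b, so the example loads a generic summarization model (facebook/bart-large-cnn) as a stand-in; a hypothetical 123b model ID could be swapped in if one were released.

```python
# Minimal summarization sketch with the Hugging Face transformers pipeline.
# "facebook/bart-large-cnn" is a stand-in checkpoint; a 123b checkpoint ID
# (not published in this article) would replace it if available.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "Transformer-based language models are trained on large text corpora "
    "and can be adapted to downstream tasks such as summarization, "
    "translation, and question answering."
)

# max_length / min_length bound the generated summary; do_sample=False
# makes the output deterministic (greedy/beam decoding only).
summary = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

Adapting a model like this to a new domain would follow the same pattern, except the base checkpoint is fine-tuned on a task-specific dataset before being loaded into the pipeline.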