drawkbox t1_jdage34 wrote

Transformers, the T in GPT, were invented at Google by the Google Brain team. They made this round of progress possible.

> Transformers were introduced in 2017 by a team at Google Brain[1] and are increasingly the model of choice for NLP problems,[3] replacing RNN models such as long short-term memory (LSTM). The additional training parallelization allows training on larger datasets. This led to the development of pretrained systems such as BERT (Bidirectional Encoder Representations from Transformers) and GPT (Generative Pre-trained Transformer), which were trained with large language datasets, such as the Wikipedia Corpus and Common Crawl, and can be fine-tuned for specific tasks.

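For anyone wondering what "can be fine-tuned for specific tasks" looks like in practice, here is a rough sketch using the Hugging Face `transformers` and `datasets` libraries. The model, dataset, and hyperparameters below are just illustrative picks on my part, not anything tied to BERT's or GPT's original training setups.

```python
# A minimal sketch of fine-tuning a pretrained Transformer (BERT) for a
# downstream task (sentiment classification). Model name, dataset, and
# hyperparameters are illustrative choices only.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

# Tokenize a small slice of IMDB reviews to keep the demo fast.
train_data = load_dataset("imdb", split="train[:2000]").map(
    lambda batch: tokenizer(batch["text"], truncation=True,
                            padding="max_length", max_length=128),
    batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-imdb-demo",
                           num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=train_data,
)
trainer.train()
```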
Bard will most likely win in the long term, though I wish it were just called Google Brain. Bard is a horrible name.


8

jeffyoulose t1_jdbhqs6 wrote

I think they are going to name them alphabetically. The next version will start with a C, then D, and so on.

My guess for what comes after Bard: CAT, CARL, or CARD.

2