
Relative_Rich8699 t1_j007y6y wrote

It told me today its platform is based on BERT and GPT-2, trained on 400 million conversations. It intimated more developments are coming, so it sounds like they're willing to keep up.

0

katiecharm t1_j00db5z wrote

I haven’t used it yet, but I highly doubt it’s GPT-2 if it’s impressive. GPT-2 is a neat trick, but I wouldn’t call it impressive here in 2022.

4

oopiex t1_j00z2f6 wrote

When I tried it, it wasn't impressive.

1

Relative_Rich8699 t1_j02bg4c wrote

Agree, but if it's using LaMDA or something more advanced than BERT/GPT-2, why is it hallucinating and giving me incorrect information about its own platform?

1

fingin t1_j031gnr wrote

Even GPT-4 will make silly mistakes. That's what happens when a model is trained to find probable word sequences instead of actually having knowledge of language like people do.
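
For a rough sketch of what "finding probable word sequences" means in practice (assuming the Hugging Face transformers library and the public gpt2 checkpoint, just as an illustration, not whatever model the bot actually runs):

```python
# The model only scores candidate next tokens; it has no notion of
# whether any continuation is factually true.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer("The duck swam across the", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]  # scores for the next token only
probs = torch.softmax(logits, dim=-1)

# Show the five most probable continuations and their probabilities.
top = torch.topk(probs, 5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(idx)):>10}  {p:.3f}")
```

Whatever token comes out on top is just the statistically likely one, which is why confident-sounding but wrong answers fall out of this setup so easily.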

1

Relative_Rich8699 t1_j033bjo wrote

Yes, but I was speaking to "the company's" bot on purpose, and I'd only say it should be trained on company data for those questions. When I inquire about ducks, it can use the world's written word.

1

fingin t1_j03181d wrote

I asked the character.ai bot what model it used, and it told me T5. Insisted, even. Regardless of the veracity of that claim, all of these models use a transformer-based architecture, with the improvement between versions coming mostly from more parameters (and correspondingly larger, higher-quality training data sets). Crazy to think that in two months we might be at the GPT-4 level and laughing about the tech we're blown away by today.
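
To see the "more parameters" point concretely, here's a quick sketch comparing parameter counts across the public GPT-2 checkpoints (assumes the Hugging Face transformers library; T5- and LaMDA-style models scale the same way, just at different sizes):

```python
# Same architecture family, increasingly large parameter counts.
from transformers import AutoModelForCausalLM

for name in ["gpt2", "gpt2-medium", "gpt2-large", "gpt2-xl"]:
    model = AutoModelForCausalLM.from_pretrained(name)
    print(f"{name:12}  {model.num_parameters():,} parameters")
```

The jump from ~124M to ~1.5B parameters within one family is the kind of scaling that separates model generations, so whichever model a bot claims to be matters a lot less than how big and well-trained it actually is.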

1