
martianunlimited t1_jegwj6v wrote

ChatGPT (and other GPT-3.5-based transformer models) has 175 billion parameters and wouldn't even fit on a dedicated RTX 4090. And before you ask "why not just run a smaller model": ChatGPT's performance is highly dependent on its size (which is why people outside the machine learning community don't hear much about GPT-1 and GPT-2). While there are efforts to make models smaller (see Alpaca), you would still need a top-of-the-line GPU to fit even these smaller models, taking VRAM away from what players care about most: the graphics.
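To see why 175 billion parameters won't fit, here's a rough sketch of the arithmetic, assuming fp16 weights (2 bytes per parameter) and ignoring the extra memory inference needs for activations and the KV cache:

```python
# Back-of-envelope VRAM estimate for a 175B-parameter model.
# Assumption: fp16 weights at 2 bytes per parameter; real inference
# needs even more memory on top of this (activations, KV cache).
def model_vram_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Return the weight footprint in GiB."""
    return n_params * bytes_per_param / 1024**3

weights_gib = model_vram_gib(175e9)  # roughly 326 GiB just for the weights
rtx4090_gib = 24                     # an RTX 4090 has 24 GiB of VRAM

print(f"weights: {weights_gib:.0f} GiB vs. {rtx4090_gib} GiB of VRAM")
```

So even with aggressive quantization you'd be an order of magnitude short of fitting the full model on a single consumer card.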

So the practical implementation of incorporating ChatGPT into games would be to send the chat prompt to a server and accept a whole lot of latency on the response. It's possible, but it wouldn't be a good gaming experience. Wait (at least) 10 years, until consumer-grade hardware has the capacity of today's datacenter-grade hardware (assuming we don't hit the end of Moore's law first), and you might find it more commonplace.
