
WonderFactory t1_jeaotp1 wrote

I'm actually adding ChatGPT NPCs to the Unreal Engine 5 game I'm developing at the moment. It's a roguelike set in a post-singularity world, and the gameplay is similar to Hades, so there's plenty of dialogue in the game. At the minute it's difficult to get a model to run on the local PC, so I'm using OpenAI's API. There are challenges like latency while you're waiting for the API to return, and it's also quite expensive, so releasing a free demo of the game is out of the question. It could potentially cost several dollars per user in API fees over the life of the game, which will of course limit your pricing flexibility: you can only reduce the price so much in Steam sales etc. I'm hoping though that inference costs will come down by the time the game is finished.

I haven't posted any footage with the GPT dialogue added to the game but I might post it here in a couple of weeks.
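For context, the "several dollars per user" concern above can be sketched as a back-of-envelope calculation. This is a rough sketch with illustrative figures: the per-token price matches gpt-3.5-turbo's launch pricing ($0.002 per 1K tokens in early 2023), but the dialogue volume and tokens-per-exchange numbers are assumptions, not figures from the comment.

```python
# Rough per-user API cost estimate for LLM-driven NPC dialogue.
# Gameplay figures below are assumptions for illustration only.

PRICE_PER_1K_TOKENS = 0.002  # gpt-3.5-turbo launch pricing, USD (early 2023)

def estimated_cost_per_user(exchanges: int, tokens_per_exchange: int) -> float:
    """Cost in USD for a player who triggers `exchanges` dialogue calls,
    each consuming roughly `tokens_per_exchange` prompt + completion tokens."""
    total_tokens = exchanges * tokens_per_exchange
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS

# e.g. 2,000 dialogue exchanges over the life of the game,
# ~700 tokens each (context + reply):
cost = estimated_cost_per_user(2000, 700)
print(f"${cost:.2f}")  # prints $2.80
```

At those assumed numbers the cost really does land in the "several dollars per user" range, which is why it eats into pricing flexibility on a one-time purchase.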

28

sumane12 t1_jeb1wh1 wrote

>I'm hoping though that inference costs will come down by the time the game is finished.

Genius. This is exactly what will happen, and I'm glad someone has the forethought to develop the product before the underlying technology is ready, because it will be there.

13

psdwizzard t1_jebn0zq wrote

You may be able to run a custom version of LLaMA instead of using GPT. You could even train it on GPT-3.5: just tell GPT it's a character from the game, build your dataset from its answers, and use it the way Alpaca does.
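A minimal sketch of the Alpaca-style data collection described above: prompt GPT-3.5 to answer in character, then store the pairs as instruction/input/output records in Alpaca's JSON format for later fine-tuning. The character name, questions, and the `ask_gpt` stub are all placeholders; a real pipeline would make an OpenAI API call where the stub is.

```python
import json

def ask_gpt(system_prompt: str, question: str) -> str:
    """Placeholder for an OpenAI chat call. A real pipeline would send
    `system_prompt` + `question` to gpt-3.5-turbo and return the reply."""
    return f"[in-character reply to: {question}]"

# Hypothetical NPC persona used as the instruction for every record.
SYSTEM = ("You are Vex, a sardonic merchant NPC in a post-singularity "
          "roguelike. Always stay in character.")

questions = [
    "What are you selling?",
    "What happened to this city?",
]

# Alpaca-style records: instruction / input / output triples.
dataset = [
    {"instruction": SYSTEM, "input": q, "output": ask_gpt(SYSTEM, q)}
    for q in questions
]

with open("npc_dataset.json", "w") as f:
    json.dump(dataset, f, indent=2)
```

A fine-tune of LLaMA on a few thousand such records is essentially what the Alpaca project did, just with a game persona instead of general instructions.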

5

maven_666 t1_jed51is wrote

Licensing issues with LLaMA seem to make this unworkable, but I expect a fully open version soon.

4

alexiuss t1_jeb63mr wrote

Why not release it so users can enter their own API key to make it work? I'm super interested in helping you develop this stuff; PM me.

2

2Punx2Furious t1_jecz2b2 wrote

I think you could get around the latency issue by having the generated dialogue arrive in the form of letters that you receive in-game, which would feel a lot more natural than a slow conversation. Or have some cutscenes between the prompt and the answers. As for the price, it should probably be an optional setting, and maybe the cost should be offset by a subscription or ads. As much as I hate them, in this case it would be difficult to do otherwise, unless you plan to foot the bill for your users forever.

1

FoniksMunkee t1_jed4826 wrote

How do you restrict the AI's answers to, say, further a plot point or stay in character?

1