FaceDeer t1_ja23lku wrote

I'd be happy with it just running on my home computer's GPU, I could use my phone as a dumb terminal to talk with it.

This is amazing. I keep telling myself I shouldn't underestimate AI's breakneck development pace, and I keep being surprised anyway.

60
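
The "phone as dumb terminal" idea above can be sketched as a thin client that just POSTs prompts to a self-hosted inference server on the home LAN. This is a minimal illustration, not any particular product's API: the address, endpoint path, and JSON schema below are all assumptions.

```python
import json
import urllib.request

# Hypothetical address of a self-hosted LLM server on the home network;
# the host, port, path, and payload schema are illustrative assumptions.
SERVER_URL = "http://192.168.1.50:8080/generate"

def build_request(prompt: str, max_tokens: int = 256) -> urllib.request.Request:
    """Package a prompt as a JSON POST for the local inference server."""
    payload = json.dumps({"prompt": prompt, "max_tokens": max_tokens}).encode("utf-8")
    return urllib.request.Request(
        SERVER_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def ask(prompt: str) -> str:
    """Send the prompt and return the generated text (blocking call)."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["text"]

if __name__ == "__main__":
    # The phone-side app would do nothing heavier than this round trip;
    # all GPU work happens on the server.
    print(ask("Hello from my phone!"))
```

The point of the split is exactly what the comment describes: the phone stays a dumb terminal, and swapping the model or GPU only touches the server side.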

Z1BattleBoy21 t1_ja2jtse wrote

I think LLMs running on a phone would be really interesting for assistants; AFAIK Siri is required to run on-device only.

12

duffmanhb t1_ja2nzfa wrote

Siri was exclusively cloud-based for the longest time. They've only brought basic functions over to local hardware.

11

NoidoDev t1_ja5sjcu wrote

>I'd be happy with it just running on my home computer's GPU

This, but as a separate server or rig for security reasons. As an external brain for your robowaifus and maybe other devices at home, like housekeeping robots.

1