[D] Is there an affordable way to host a diffusers Stable Diffusion model publicly on the Internet for "real-time" inference? (CPU or serverless GPU?) Submitted by OkOkPlayer (t3_zdfrnw) on December 5, 2022 at 6:52 PM in r/MachineLearning · 13 comments · 8 points
machineko (t1_iz7x0mh) wrote on December 7, 2022 at 3:05 AM: How "cheap" does it have to be? The cheapest option would be to deploy it yourself using https://github.com/stochasticai/x-stable-diffusion. Let me know if you need more help with real-time inference. 1 point
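Since the question hinges on affordability, a quick back-of-envelope comparison can clarify the trade-off between a cheap CPU box, a pay-per-second serverless GPU, and an always-on dedicated GPU. All prices and latencies below are illustrative assumptions for the sketch, not quotes from any provider:

```python
# Back-of-envelope cost comparison for hosting Stable Diffusion inference.
# NOTE: every price and latency here is a HYPOTHETICAL ASSUMPTION chosen
# only to illustrate the arithmetic; real provider pricing varies.

def cost_per_image(price_per_second: float, seconds_per_image: float) -> float:
    """Compute cost of generating one image at a given per-second price."""
    return price_per_second * seconds_per_image

# Assumed scenarios: (compute price in $/second, generation latency in s/image)
scenarios = {
    "cpu_vps":        (0.10 / 3600, 120.0),  # ~$0.10/hour VPS, ~2 min per image
    "serverless_gpu": (0.0006, 5.0),         # pay-per-second GPU, ~5 s per image
    "dedicated_gpu":  (0.50 / 3600, 5.0),    # always-on GPU at ~$0.50/hour
}

for name, (price, latency) in scenarios.items():
    print(f"{name}: ${cost_per_image(price, latency):.4f} per image")
```

Note that a dedicated GPU only reaches its low per-image cost when it is kept busy; at low traffic you still pay for idle hours, which is why serverless GPUs (billed per second) are often the affordable choice for intermittent "real-time" workloads.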