Submitted by gyurisc t3_z9kt6f in MachineLearning

I am looking for ML cloud providers for my hobby projects. I found replicate dot com but I would like to try other providers. What are the best / most used/stable providers out there? I am not looking for free options and am happy to pay.

13

Comments


techmavengeospatial t1_iyhb01b wrote

Have you thought about buying your own hardware?

For $700-900 you can get a recertified Dell R820 server: 40 cores/80 threads, 256 GB RAM, with RAID 10 SAS drives (8 or 16).

Or if you need a GPU, about $1,100 for an HP Z840 workstation: 24 cores/48 threads, 128 GB RAM, and an 8 GB workstation GPU. (We got lucky and found one with 512 GB RAM and dual GPUs.)

You can pick these up from multiple places, including Amazon, which offers 12 months of no-interest, equal-payment financing on its credit card. They come with a 90-day warranty, but you can buy an extended one.

This will pay for itself if you are actively training models and working with big data, since clouds charge for bandwidth in and out of their systems and, of course, for storage.

−4

machineko t1_iyisdnp wrote

It would be nice if you could try out stochastic.ai and provide suggestions on how to improve it. I'd be happy to explain how to build ML cloud infrastructure yourself too.

2

thundergolfer t1_iyk3df7 wrote

Try modal.com.

Modal is an ML-focused serverless cloud, and much more general than replicate.com which just allows you to deploy ML model endpoints. But still extremely easy to use.

It's the platform that this openai/whisper podcast transcriber is built on: /r/MachineLearning/comments/ynz4m1/p_transcribe_any_podcast_episode_in_just_1_minute/.

Or here's an example of doing serverless batch inference: modal.com/docs/guide/ex/batch_inference_using_huggingface.
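If it helps, here's a rough sketch of what that batch inference pattern looks like on Modal. This is illustrative only, written against the Stub-based Python API; the app name, model, and function names are placeholders, and the linked guide is the maintained, authoritative version.

```python
# Minimal sketch of serverless batch inference on Modal (illustrative only;
# see the linked guide for the real, maintained example).
import modal

stub = modal.Stub("batch-inference-sketch")

# Remote container image with the inference dependencies installed.
image = modal.Image.debian_slim().pip_install("transformers", "torch")


@stub.function(image=image)
def classify(text: str) -> dict:
    # Imports run inside the remote container, not on your laptop.
    from transformers import pipeline

    clf = pipeline("sentiment-analysis")
    return clf(text)[0]


@stub.local_entrypoint()
def main():
    reviews = ["Loved it.", "Terrible plot.", "It was fine."]
    # .map() fans the calls out across serverless containers in parallel.
    for result in classify.map(reviews):
        print(result)
```

You run it with `modal run <file>.py` and Modal handles building the container, scaling out the calls, and tearing everything down.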

This example from Charles Frye runs Stable Diffusion Dream Studio on Modal: twitter.com/charles_irl/status/1594732453809340416

2

Flag_Red t1_iyviyh0 wrote

A 3090 VM costs around $0.50 an hour on a lot of providers (less if you look around). If you're a hobbyist experimenting 6 hours a day for a week, that's $21.00.

Compare to $1100 for the machine you quoted.
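Rough back-of-the-envelope in Python, using the figures from this thread (electricity, storage, and resale value ignored):

```python
# Rough cost comparison using the figures quoted in this thread.
gpu_rate_per_hour = 0.50        # rented 3090 VM
hobby_hours = 6 * 7             # 6 hours a day for a week
weekly_rental = gpu_rate_per_hour * hobby_hours          # $21.00

workstation_cost = 1100         # the Z840 quoted above
break_even_hours = workstation_cost / gpu_rate_per_hour  # 2,200 GPU-hours

print(f"Week of renting: ${weekly_rental:.2f}")
print(f"Hours of renting to match ${workstation_cost}: {break_even_hours:.0f}")
```

At 6 hours a day, that's over a year of use before buying breaks even.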

2

gyurisc OP t1_izmxidw wrote

>It would be nice if you could try out stochastic.ai and provide suggestions on how to improve it. I'd be happy to explain how to build ML cloud infrastructure yourself too.

Thanks for your reply. I would be interested in learning more about building my own infrastructure.

1

gyurisc OP t1_izmxw61 wrote

>Try modal.com.
>
>Modal is an ML-focused serverless cloud, and much more general than replicate.com which just allows you to deploy ML model endpoints. But still extremely easy to use.
>
>It's the platform that this openai/whisper podcast transcriber is built on: /r/MachineLearning/comments/ynz4m1/p_transcribe_any_podcast_episode_in_just_1_minute/.
>
>Or here's an example of doing serverless batch inference: modal.com/docs/guide/ex/batch_inference_using_huggingface.
>
>This example from Charles Frye runs Stable Diffusion Dream Studio on Modal: twitter.com/charles_irl/status/1594732453809340416

This looks really nice. I will give it a try.

2

Flag_Red t1_izn20hg wrote

Typically secure cloud if it's available, community cloud if not. Have a look at "browse servers" for the community cloud instances; their specs can range quite a bit, so make sure to get one that fits your use case.

2