Submitted by gyurisc t3_z9kt6f in MachineLearning

I am looking for ML cloud providers for my hobby projects. I found replicate.com, but I would like to try other providers. What are the best, most used, and most stable providers out there? I am not looking for free options and am happy to pay.

13

Comments


Cheap_Meeting t1_iyhla19 wrote

Colab

8

thundergolfer t1_iyk3y2p wrote

You can't deploy Colab; you can only link people to the notebook, right?

3

gyurisc OP t1_izi8zpm wrote

So I cannot upload a notebook and call it from my service, right?

1

thundergolfer t1_izj8k0i wrote

no, you won't be able to make requests to the notebook.

2

gyurisc OP t1_izmyns5 wrote

Thanks for the clarification. So this one is of limited use if you want to build a website and use it as an API.

1

machineko t1_iyisdnp wrote

It would be nice if you could try out stochastic.ai and provide suggestions on how to improve it. I'd be happy to explain how to build ML cloud infrastructure yourself too.

2

gyurisc OP t1_izmxidw wrote

>It would be nice if you could try out stochastic.ai and provide suggestions on how to improve it. I'd be happy to explain how to build ML cloud infrastructure yourself too.

Thanks for your reply. I would be interested in learning more about building my own infrastructure.

1

thundergolfer t1_iyk3df7 wrote

Try modal.com.

Modal is an ML-focused serverless cloud, and much more general than replicate.com, which only lets you deploy ML model endpoints. But it's still extremely easy to use.

It's the platform that this openai/whisper podcast transcriber is built on: /r/MachineLearning/comments/ynz4m1/p_transcribe_any_podcast_episode_in_just_1_minute/.

Or here's an example of doing serverless batch inference: modal.com/docs/guide/ex/batch_inference_using_huggingface.

This example from Charles Frye runs Stable Diffusion Dream Studio on Modal: twitter.com/charles_irl/status/1594732453809340416

2

gyurisc OP t1_izmxw61 wrote

>Try modal.com.

This looks really nice. I will give it a try.

2

thundergolfer t1_izol4cw wrote

Please do! DM me your email and I'll approve your account.

1

techmavengeospatial t1_iyhb01b wrote

Have you thought about buying your own hardware?

For $700-900 you can get a recertified Dell R820 server: 40 cores/80 threads, 256 GB RAM, with RAID 10 SAS drives (8 or 16).

Or if you need a GPU, $1,100 for an HP Z840 workstation: 24 cores/48 threads, 128 GB RAM, and an 8 GB workstation GPU. (We got lucky and found one with 512 GB RAM and dual GPUs.)

You can pick these up in multiple places, including Amazon, which offers no-interest financing of 12 equal monthly payments on the Amazon credit card. They come with a 90-day warranty, but you can buy an extended one.

This will pay for itself if you are actively training models and working with big data, since clouds charge for bandwidth in and out of their systems and, of course, for storage.
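A rough break-even sketch for the "pays for itself" claim, assuming a cloud GPU rate of $0.50/hour (an illustrative figure, not a quote from any provider):

```python
# Back-of-the-envelope break-even for buying hardware vs. renting cloud GPUs.
# Both figures are illustrative assumptions.
hardware_cost = 1100.00   # used HP Z840 workstation, per the numbers above
cloud_rate = 0.50         # assumed $/hour for a comparable cloud GPU

break_even_hours = hardware_cost / cloud_rate
print(f"Break-even after ~{break_even_hours:.0f} GPU-hours")  # ~2200 GPU-hours
```

At 6 hours of use per day, that's roughly a year of steady use before the hardware wins, not counting bandwidth, storage, and electricity.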

−4

Damitrix t1_iylp66l wrote

I don't think it's remotely feasible for most people to just buy a server for their projects, even before adding a GPU on top.

3

techmavengeospatial t1_iym0rxh wrote

The same thing can be said of someone who has never stood up a server on AWS, GCP, Azure, a VPS, or DigitalOcean.

I don't get your point.

1

Flag_Red t1_iyviyh0 wrote

A 3090 VM costs around $0.50 an hour on a lot of providers (less if you look around). If you're a hobbyist experimenting 6 hours a day for a week, that's $21.00.

Compare that to the $1,100 machine you quoted.
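The arithmetic above, spelled out with the rates as stated in the comment:

```python
# Weekly cost of renting a 3090 VM at the quoted rate.
rate_per_hour = 0.50   # $/hour, as quoted above
hours_per_day = 6      # hobbyist usage assumption
days = 7               # one week

weekly_cost = rate_per_hour * hours_per_day * days
print(f"${weekly_cost:.2f}")  # $21.00
```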

2

techmavengeospatial t1_iyvmnho wrote

Your cost did not account for storage and bandwidth

1

Flag_Red t1_iyvpa6i wrote

Using RunPod as an example: bandwidth is free and storage is $0.0013 per GB-hour.

If you use a 100 GB disk, that's an extra $2.87.

2

gyurisc OP t1_izmy3n7 wrote

Is the 3090 VM on AWS? Sounds really interesting. I might give it a try

1

Flag_Red t1_izmzog4 wrote

I use RunPod.

2

gyurisc OP t1_izn1gln wrote

Thanks, I signed up. Are you using the community cloud or the secure one?

1

Flag_Red t1_izn20hg wrote

Typically Secure Cloud if it's available, Community Cloud if not. Have a look at "Browse Servers" for the Community Cloud instances; their specs can vary quite a bit, so make sure to get one that fits your use case.

2

gyurisc OP t1_izmy6am wrote

Interesting idea. Where do you look for these used computers? Are you adding GPUs to the machines or just using them as they are?

1