Submitted by alla_n_barakat t3_ztkv5a in deeplearning

Dear redditors

I have a problem.

I'm trying to run big models such as CodeParrot on Google Colab, but the free version doesn't have enough RAM, and I need to run tests on it and some other models for my thesis (I'm a graduate student at the University of Benghazi in Libya).

How do I solve this problem?

Update:
For those wondering why I haven't asked my university for help, here is the answer:
"Unfortunately, my university doesn't provide anything to students. It is as if we live in the stone age here. We don't have servers or free internet access. The university will never provide any solution that might help me."


Comments


ReallySeriousFrog t1_j1dzzvg wrote

You could try asking your university if they have a computer cluster/server for these things, or alternatively ask if they would reimburse the cost of Colab resources.


alla_n_barakat OP t1_j1e0gh6 wrote

>alternatively

Unfortunately, my university doesn't provide anything to students.
It is as if we live in the stone age here. We don't have servers or free internet access.

The university will never provide any solution that might help me.


ReallySeriousFrog t1_j1e12c3 wrote

Oh man, I wasn't aware. How do your supervisor and the professors and postdocs in the AI department use large models? Maybe they can share a bit of their compute with you?

I am not aware of any free server space with enough GPU/RAM, sadly.

Alternatively, is it maybe an option to use a smaller version of the model? Maybe there is a variant with trimmed weights?
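For example, something along these lines might work. This is just a sketch, assuming the Hugging Face transformers library and the codeparrot/codeparrot-small checkpoint (a much smaller variant of the full model):

```python
# Sketch: load a smaller CodeParrot checkpoint instead of the full model.
# Assumes the Hugging Face transformers library is installed and that
# codeparrot/codeparrot-small fits in Colab's free-tier RAM.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "codeparrot/codeparrot-small"  # smaller variant of the full model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0]))
```

Not sure how the results compare quality-wise to the full model, but it should at least run on the free tier.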


deepneuralnetwork t1_j1g6h4o wrote

Short of acquiring your own DL rig, your main option is to pay for a GPU VM instance on AWS, Azure, or GCP.


stuv_x t1_j1gnjaf wrote

Downsize your inputs. Or try a free GPU on Paperspace.
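A minimal sketch of what I mean by downsizing inputs, assuming a Hugging Face tokenizer (the checkpoint name and the 256-token cap are just illustrations):

```python
# Sketch: cap sequence length at tokenization time so activations
# (which grow with sequence length) fit in Colab's limited RAM.
# Assumes the Hugging Face transformers tokenizer API.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("codeparrot/codeparrot-small")
long_snippet = "def train(model, data):\n    ...\n" * 200  # stand-in for a long file

inputs = tokenizer(
    long_snippet,
    truncation=True,   # drop tokens beyond max_length
    max_length=256,    # shorter sequences -> much less memory
    return_tensors="pt",
)
print(inputs["input_ids"].shape)  # at most (1, 256)
```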


chengstark t1_j1gqy3z wrote

Look up some model compression techniques, use smaller batch sizes, etc. Sorry about your situation; it is very hard to do proper work without the proper tools.
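For example, a rough sketch of two cheap memory reductions, assuming PyTorch and the Hugging Face transformers library on a GPU runtime (the checkpoint name is just an illustration): load the weights in fp16 and keep the batch size at 1.

```python
# Sketch: reduce memory by loading weights in half precision and using batch size 1.
# Assumes PyTorch + Hugging Face transformers on a GPU runtime; the checkpoint
# name is illustrative. 8-bit loading via bitsandbytes (load_in_8bit=True) is
# another option if that library is available.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "codeparrot/codeparrot-small"  # illustrative checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # roughly halves weight memory vs fp32
).to("cuda")

batch = tokenizer(["def add(a, b):"], return_tensors="pt").to("cuda")  # batch size 1
with torch.no_grad():
    out = model.generate(**batch, max_new_tokens=32)
print(tokenizer.decode(out[0]))
```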
