
ggf31416 t1_jac61sd wrote

The RTX 2060 has 6 GB of VRAM, right?

It should be possible to train with that amount using the memory-saving techniques described here: https://huggingface.co/docs/transformers/perf_train_gpu_one#optimizer
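
That page covers gradient accumulation, gradient checkpointing, mixed precision, and memory-light optimizers. A minimal sketch of what those options look like with the `Trainer` API, assuming a recent `transformers` version; the batch sizes and output directory are placeholders, not values from this thread:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="out",
    per_device_train_batch_size=4,  # small per-step batch to fit in 6 GB
    gradient_accumulation_steps=8,  # effective batch size of 32
    fp16=True,                      # mixed precision roughly halves activation memory
    gradient_checkpointing=True,    # recompute activations instead of storing them
    optim="adafactor",              # much smaller optimizer state than AdamW
)
```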

If you need to train from scratch (most people will just fine-tune), it will take a while: the original training took 90 hours on 8×V100 GPUs, each of which should be faster than your GPU. https://www.arxiv-vanity.com/papers/1910.01108/


ahiddenmessi2 OP t1_jaciwqg wrote

Thanks for your reply. My goal is to train the transformer to read a specific programming language, so I guess there is no pre-trained model available. It seems I have to train it from scratch on my laptop GPU :(

Edit: and yes, it has only 6 GB
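
For anyone attempting this, a rough sketch of from-scratch masked-LM pretraining on a code corpus, following the standard Hugging Face "train a language model from scratch" recipe. The file `code_corpus.txt`, the small RoBERTa-style config, and all hyperparameters are placeholders chosen to fit in ~6 GB, not values from this thread:

```python
from datasets import load_dataset
from tokenizers import ByteLevelBPETokenizer
from transformers import (
    DataCollatorForLanguageModeling,
    RobertaConfig,
    RobertaForMaskedLM,
    RobertaTokenizerFast,
    Trainer,
    TrainingArguments,
)

# 1. Learn a byte-level BPE vocabulary from the raw source files.
bpe = ByteLevelBPETokenizer()
bpe.train(files=["code_corpus.txt"], vocab_size=30_000, min_frequency=2,
          special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"])
bpe.save_model("code-tokenizer")  # writes vocab.json and merges.txt

tokenizer = RobertaTokenizerFast.from_pretrained("code-tokenizer",
                                                 model_max_length=256)

# 2. Tokenize the corpus (one code snippet per line).
dataset = load_dataset("text", data_files={"train": "code_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

dataset = dataset.map(tokenize, batched=True, remove_columns=["text"])

# 3. A deliberately small model so training fits in 6 GB of VRAM.
config = RobertaConfig(
    vocab_size=30_000,
    max_position_embeddings=258,  # max_length + 2 for RoBERTa's offset
    num_hidden_layers=6,
    num_attention_heads=8,
    hidden_size=512,
)
model = RobertaForMaskedLM(config)

# 4. Masked-language-model pretraining with dynamic masking.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True,
                                           mlm_probability=0.15)
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=8,
                           gradient_accumulation_steps=8, fp16=True,
                           num_train_epochs=1),
    data_collator=collator,
    train_dataset=dataset["train"],
)
trainer.train()
```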


ggf31416 t1_jacq8pl wrote

For reference, an RTX 3090 can be rented for as little as ~$0.25/hour on vast.ai with just a credit card if you are in a hurry (AWS and GCP require a quota increase before you can use GPUs), or you may be able to get free credits for research from the major cloud providers.
