incrediblediy t1_ir4ssxg wrote
Reply to [R] Google Colab alternative by Zatania
What is the "sequence length" in BERT?
incrediblediy t1_iqp8nmt wrote
> Price: $3,800
You could still get multiple RTX 3090 24 GB cards at this price; I have even seen brand-new cards going for US$900.
> Is 64 GB of RAM really needed for most moderate deep/machine learning projects or should 32 GB with a reduced batch size work fine?
You will be training on the GPU, so batch size is limited by the 16 GB of VRAM, not by system RAM.
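
If you do run out of VRAM, the usual workaround is to shrink the per-step batch and use gradient accumulation to keep the same effective batch size. A minimal sketch in plain PyTorch, with a toy model and dataset standing in for the real ones:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-ins so the sketch runs end to end; swap in a real model/dataset.
model = nn.Linear(16, 2)
loss_fn = nn.CrossEntropyLoss()
dataset = TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))
train_loader = DataLoader(dataset, batch_size=4)  # small per-step batch that fits in VRAM

accumulation_steps = 4  # effective batch size = 4 * 4 = 16
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
optimizer.zero_grad()

for step, (inputs, labels) in enumerate(train_loader):
    loss = loss_fn(model(inputs), labels)
    (loss / accumulation_steps).backward()  # scale so accumulated gradients average out
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()        # weight update only every N micro-batches
        optimizer.zero_grad()
```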
incrediblediy t1_ir7bljy wrote
Reply to comment by minimaxir in [R] Google Colab alternative by Zatania
Yeah! I mean, what is the seq_length used by OP? :) Also the batch size :) I have managed seq_length = 300, but only with a small batch size in Colab, especially with AdamW instead of Adam.
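
For what it's worth, a minimal sketch of that kind of setup (the model name, learning rate, weight decay, batch of 8 placeholder sentences, and binary labels are my assumptions, not what OP used):

```python
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

# Assumed setup: bert-base-uncased with placeholder data and binary labels.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

texts = ["a placeholder sentence"] * 8   # stand-in data
labels = torch.tensor([0, 1] * 4)

batch = tokenizer(
    texts,
    padding="max_length",
    truncation=True,
    max_length=300,          # seq_length = 300, as mentioned above
    return_tensors="pt",
)

# AdamW (decoupled weight decay) rather than plain Adam
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5, weight_decay=0.01)

model.train()
outputs = model(**batch, labels=labels)  # keep the batch small so it fits in Colab's VRAM
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```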