Submitted by macORnvidia t3_zmwxkh in deeplearning

Laptop for Data Science and Scientific Computing: ProArt vs Legion 7i vs ThinkPad P16/P1 Gen 5

I'm looking at four laptops for DS. Not really interested in gaming, just the GPU, a good CPU, and massive RAM, which more or less pushes me into the gaming-laptop segment.

Main uses:

  • Data preprocessing, prototyping CUDA, RAPIDS for accelerating classical data science and machine learning, DL inferencing, building conda-enabled containers, 3D modeling/rendering and simulations in Python, NLP, OpenCV, PyTorch
  1. ThinkPad P16: $4,200/$3,900 (64 GB vs 32 GB RAM)

64 GB/32 GB DDR5, i9-12900HX, RTX A4500 16 GB VRAM, 1 TB, 3840 x 2400, 230 W power adapter

  2. ThinkPad P1 Gen 5: $3,900

32 GB DDR5, i9-12900H vPro, RTX 3080 Ti 16 GB VRAM, 1 TB, 2560 x 1600, 230 W power adapter

  3. Asus ProArt Studiobook: $2,999

32 GB DDR5, i7-12700H, RTX 3080 Ti 16 GB VRAM, 2 TB, 3840 x 2400 4K OLED, 330 W power adapter

  4. Legion 7i: $3,500

32 GB DDR5, i9-12900HX, RTX 3080 Ti 16 GB VRAM, 2 TB, 2560 x 1600 165 Hz, 300 W power adapter

I love how beautiful and robust the Legion 7i is, but given the price difference I'm also leaning towards the Asus ProArt, assuming the 12th-gen i7 isn't too bad to work with.

1

Comments


lazazael t1_j0e2bfs wrote

A laptop is either a MacBook or a ThinkPad for me. I'd buy for portability and use cloud resources/university servers for the actual computation.

3

lazazael t1_j0fr6zz wrote

If not the cloud, because you don't want an ongoing payment, then remote compute on a desktop with 256 GB RAM and a 4090 heating the office; an instance like that is a beast for ML compared to these... slim contenders. University professors usually do that: they buy a heavy lifter for a few of them to use freely from their MacBook Airs or ThinkPads.

1

elbiot t1_j0nnc82 wrote

Yeah, I bought a used gaming desktop from Facebook and kept my 6-year-old laptop. Crazy specs for the price, and it came with a 3060 with 12 GB VRAM. I recommend a GPU with more VRAM over one that's "faster", because it won't be fast if it can't load the model at all.
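A quick back-of-the-envelope sketch of that point: weight memory alone rules a model in or out, independent of GPU speed. The fp16 byte count and the 7B-parameter example below are illustrative assumptions, not figures from the thread.

```python
import torch

def weights_vram_gb(model: torch.nn.Module, bytes_per_param: int = 2) -> float:
    """Lower bound on VRAM just to hold the weights (fp16 = 2 bytes/param);
    activations, optimizer state, and CUDA overhead all come on top."""
    return sum(p.numel() for p in model.parameters()) * bytes_per_param / 1024**3

# Illustrative: a 7B-parameter model needs ~13 GB in fp16 for weights alone,
# so it simply won't load on a 12 GB card, no matter how fast the card is.
print(f"7B params in fp16 ~ {7e9 * 2 / 1024**3:.1f} GB")
```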

1

sayoonarachu t1_j0wjj7v wrote

You could probably look at the 11th-gen Legion 7i, which is cheaper than the new 12th-gen ones. They're not 3080 Tis, but the difference between the 3080 and the 3080 Ti, last I checked, was very minimal, something like a 5% performance difference.

I personally have the 11th-gen version, after comparing a bunch of gaming laptops, and use it for programming in Unreal Engine, deep learning, and playing with Stable Diffusion, etc. Main pro? Like you said, the looks. I love the simple, minimal, non-gaming-laptop appeal of the Legions. 😅

Also, you'd probably want to research whether all the laptops you've listed can actually run their 3080s at the max rating of 150 W (previously branded Max-Q, I believe). Some OEMs won't advertise it. The Legion 7i 3080s do, though.
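One way to check the board power limit on a machine you already have in hand is through NVML; a minimal sketch, assuming the official Python bindings (nvidia-ml-py) are installed:

```python
# pip install nvidia-ml-py  (official NVML bindings; exposed as the pynvml module)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
name = pynvml.nvmlDeviceGetName(handle)        # bytes in older binding versions
limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000  # mW -> W
print(f"{name}: board power limit {limit_w:.0f} W")
pynvml.nvmlShutdown()
```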

1

sayoonarachu t1_j0yn75o wrote

I've only started learning DL a month ago, so I've mostly been doing simple ANNs. But inferencing larger-param NLP models, GANs, diffusion models, etc. is fine. It's no desktop 3090 or enterprise-grade GPU, but for a laptop it's by far the best on the market. For example, the largest Parquet file I've cleaned in pandas was about 7 million rows and about 10 GB of just text. It can run queries through it in a few seconds.
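For scale, the kind of pandas-on-Parquet pass described above looks something like this; the file and column names are hypothetical stand-ins for the prompt dump, not anything from the thread:

```python
import pandas as pd

# "prompts.parquet" / "text" are hypothetical stand-ins for the ~10 GB dump.
df = pd.read_parquet("prompts.parquet", columns=["text"])  # load only the needed column

# A typical "query": substring filter across millions of rows.
hits = df[df["text"].str.contains("portrait", case=False, na=False)]
print(f"{len(hits):,} matching rows out of {len(df):,}")
```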

Guess it depends on what kind of data science or DL you're looking to do. The 3080 probably won't be able to fine-tune something like the BLOOM model, but it can fine-tune Stable Diffusion models with enough optimizations.
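"Enough optimizations" usually means things like mixed precision and gradient accumulation to squeeze a training run into 16 GB. A generic sketch of those two techniques follows; the tiny model and loader are placeholders, not an actual Stable Diffusion fine-tune:

```python
import torch

# Placeholder model/data; stands in for whatever is being fine-tuned.
model = torch.nn.Linear(512, 512).cuda()
loader = [(torch.randn(32, 512), torch.randn(32, 512)) for _ in range(32)]
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()
accum_steps = 8  # simulate an 8x larger batch without the VRAM cost

for step, (x, y) in enumerate(loader):
    with torch.cuda.amp.autocast():  # fp16 roughly halves activation memory
        loss = torch.nn.functional.mse_loss(model(x.cuda()), y.cuda())
    scaler.scale(loss / accum_steps).backward()
    if (step + 1) % accum_steps == 0:  # optimizer step every accum_steps batches
        scaler.step(optimizer)
        scaler.update()
        optimizer.zero_grad(set_to_none=True)
```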

For modeling in Blender or procedural generation in something like Houdini, I haven't had issues. I've made procedurally generated 20 km height maps in Houdini to export to Unreal Engine, and it was not a problem.

1

macORnvidia OP t1_j0z24ie wrote

> For example, the largest Parquet file I've cleaned in pandas was about 7 million rows and about 10 GB of just text. It can run queries through it in a few seconds.

Using RAPIDS? Like cuDF?

1

sayoonarachu t1_j0zlw4i wrote

No, I was just using pandas (CPU) for simple quick regex work and removing/replacing text rows. It was just for a hobby project. The data was scraped from the Midjourney and Stable Diffusion Discords, so there were millions of rows of duplicate and poor-quality prompts, which I had pandas delete; in the end, the number of unique rows with more than 50 characters came to about 700k, which was then used to train GPT-Neo 125M.

I didn't know about cuDF. Thanks 😅
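For reference, the cleanup described above maps onto a few pandas one-liners, and cuDF mirrors enough of the pandas API that a similar pass can often run on the GPU with a one-line import swap. The file name, column name, and mention-stripping regex here are hypothetical:

```python
import pandas as pd
# import cudf as pd  # cuDF mirrors much of the pandas API, so on a CUDA GPU
#                    # this one-line swap can run a similar pass on the 3080 Ti.

# Hypothetical file/column names standing in for the scraped Discord dump.
df = pd.read_parquet("scraped_prompts.parquet")
df["prompt"] = df["prompt"].str.replace(r"<@\d+>", "", regex=True)  # drop user mentions
df = df.drop_duplicates(subset="prompt")       # millions of duplicate prompts
df = df[df["prompt"].str.len() > 50]           # keep unique rows over 50 characters
df.to_parquet("cleaned_prompts.parquet")
```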

1