Submitted by MyActualUserName99 t3_xt80zc in deeplearning

I've been looking into getting a new laptop for personal use as well as for deep/machine learning. I ran across the Lambda Tensorbook (https://lambdalabs.com/deep-learning/laptops/tensorbook/customize), a laptop designed specifically for deep learning, with the following specs:

Lambda Tensorbook

CPU: 14-core Intel i7

GPU: NVIDIA RTX 3080 Ti w/ 16 GB VRAM

RAM: 64 GB 4800 MHz

Storage: 2 TB NVMe Gen4 SSD

Display: 15.6" 1440p 240 Hz

System: Ubuntu 22.04 + Windows 10 Pro with TensorFlow, CUDA, PyTorch, etc. pre-installed

Price: $5,000

At first glance, it looks really good, but the price is a bit high. I started looking at gaming laptops and ran across an Alienware x15 R2 (https://www.dell.com/en-us/shop/dell-laptops/alienware-x15-r2-gaming-laptop/spd/alienware-x15-r2-laptop/wnr2x15cto10ssb) with the following specs:

Alienware x15 R2

CPU: 14-core Intel i9

GPU: NVIDIA RTX 3080 Ti w/ 16 GB VRAM

RAM: 32 GB 5200 MHz

Storage: 2 TB NVMe Gen4 SSD

Display: 15.6" 1080p 165 Hz

System: Windows 10 Pro

Price: $3,800

Compared head to head, the Lambda has a better display (which I don't care much about) and 32 GB more RAM, but it's $1,200 more expensive, while the Alienware also has the Intel i9 CPU. Although the pre-installed software is nice, I can always install TensorFlow, PyTorch, and CUDA manually on the Alienware. I personally don't think I'll need the extra 32 GB of RAM, since I can always use cloud services for very large projects, so the Alienware seems to best fit my needs.
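For reference, verifying a manual install afterwards is quick. Here's a rough sketch (just an assumption of a standard GPU-enabled PyTorch and TensorFlow setup, nothing specific to either laptop) that confirms the frameworks actually see the GPU:

```python
# Quick sanity check after a manual TensorFlow / PyTorch / CUDA install
# (a minimal sketch; assumes GPU-enabled builds of both frameworks).
import tensorflow as tf
import torch

# PyTorch: confirm CUDA sees the laptop GPU and report its VRAM.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"PyTorch sees {props.name} with {props.total_memory / 1024**3:.1f} GB VRAM")
else:
    print("PyTorch: no CUDA device found")

# TensorFlow: list the GPUs it can use.
print("TensorFlow GPUs:", tf.config.list_physical_devices("GPU"))
```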

Are there any downsides to using a gaming laptop for deep learning rather than a purpose-built deep learning laptop? Is 64 GB of RAM really needed for most moderate deep/machine learning projects, or will 32 GB with a reduced batch size work fine?

0

Comments


cma_4204 t1_iqogqbg wrote

I have an MSI laptop with an RTX 3070 and 32 GB of RAM. It's plenty for experimenting and training small models/datasets. Once I have something working and need more power, I rent GPUs by the hour from Lambda Labs or RunPod. For $1-2/hr I'm able to get around 80 GB of GPU memory. Long story short, I would go for the cheaper one and use the cloud when needed.

9

obsoletelearner t1_iqorxoe wrote

If you're seriously considering doing deep learning, build yourself a server; with the budget you specified you can build a decent machine.

Also, if I had to choose one of the two, I'd go with the Alienware.

3

BobDope t1_iqowbfc wrote

Chromebook and log in to your cloud provider.

0

yannbouteiller t1_iqozhsf wrote

Alienware. Get the minimum RAM/SSD and replace them yourself (you can do this on the 17" version; double-check that this is also true for the 15" version, as I think I remember some issue with the x15 like the RAM being soldered). You get the Dell on-site warranty, and the machine probably runs cooler, in both senses.

The real issue with both machines is that the GPU is soldered to the motherboard, so when it fails it will likely take the whole laptop with it.

Also, good to know before you buy: I got myself an AW x17 R2 for prototyping and gaming, and I realized that the built-in speakers make the chassis vibrate and create a terrible crackling noise if you use them at mid to high volume. This defect seems to be present on the whole series. Also, the webcam is crap, and the battery doesn't last long. Not sure if the Lambda laptop is any better in these regards, though.

A better bet might be the MSI Raider GE76 (if they have a 15-inch equivalent), but it looks a bit flashier / less professional, you don't get on-site repairs, and the power supply is less portable, I think.

1

incrediblediy t1_iqp8nmt wrote

> Price: $3,800

You can still get multiple RTX 3090 24 GB cards at this price; I have even seen brand-new cards going for US$900.

> Is 64 GB of RAM really needed for most moderate deep/machine learning projects or should 32 GB with a reduced batch size work fine?

You will be training on the GPU, so batch size is limited by the 16 GB of VRAM, not system RAM.
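For example, here's a rough sketch of that (assuming PyTorch with CUDA; the toy model and batch sizes are made up for illustration): peak VRAM usage, not system RAM, is what grows with batch size.

```python
# Illustrative sketch (hypothetical model and batch sizes): VRAM, not system RAM,
# is what caps the batch size when training on the GPU.
import torch

device = torch.device("cuda")
props = torch.cuda.get_device_properties(device)
print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM")

# A toy model; parameters, activations, and gradients all live in VRAM.
model = torch.nn.Sequential(
    torch.nn.Linear(4096, 4096),
    torch.nn.ReLU(),
    torch.nn.Linear(4096, 4096),
).to(device)

for batch_size in (256, 1024, 4096):
    torch.cuda.reset_peak_memory_stats(device)
    x = torch.randn(batch_size, 4096, device=device)
    loss = model(x).sum()
    loss.backward()  # saved activations and gradients also consume VRAM
    peak = torch.cuda.max_memory_allocated(device) / 1024**3
    print(f"batch {batch_size}: peak VRAM {peak:.2f} GB")
    model.zero_grad(set_to_none=True)
```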

3

Chigaijin t1_iqp975j wrote

The Tensorbook is only $3,500 unless you're looking at the dual boot model. I bought one and I've had no regrets; it's an incredible machine. Maybe check out Sentdex's review on YouTube; I thought it was pretty good.

1

Icy-Put177 t1_iqpgnpl wrote

How's the price for the MSI setup? Does it heat up much when the GPU runs? I've never used a laptop with a GPU; I just want to get one for hands-on small experiments.

1

Icy-Put177 t1_iqpgyi6 wrote

Who makes these laptops? Are they from a recognized brand? What type of warranty did you get? And where do you take them for tech support if there's an issue?

Thanks in advance.

2

Final-Rush759 t1_iqpujzr wrote

A desktop with a 3090 is much better and cheaper. Buy a big case with good airflow. Or go for a 4090.

3

majh27 t1_iqpw8pw wrote

Build a desktop or pay for hosting, and get a MacBook Pro or one of those Pop!_OS laptops.

1

nutpeabutter t1_iqpxj3a wrote

Kinda frustrating how half of the help posts here are requests for laptops. Like, have they not bothered to do even the tiniest bit of research?? At the same price you could get an equivalently specced desktop/server AND an additional laptop, with the added bonus of being able to run long training sessions without needing to disrupt them.

2

cyberpunkstrategy t1_iqq2vh4 wrote

The fact that laptops are the suboptimal choice is exactly why people ask for advice when they face situational constraints.

Take me: I miss my home-built tower terribly, but due to a change in life situation, something immobile became untenable. It's not the life situation I would choose, nor the setup, but it's the setup the situation requires.

2

tired_fella t1_iqqejcl wrote

Give the Oryx Pro a thought too! For similar specs, you can get a similar price to the Alienware with Ubuntu preinstalled. Sure, it's based on Clevo laptops, but they also did some customizations specifically to make things work perfectly with Ubuntu.

But in general, desktop PCs are preferred for deep learning. In my case, I have an ITX machine for side projects. Sadly, the GPU in it is outdated...

1

Best_Definition_4385 t1_iqr6aca wrote

>The Tensorbook is only $3500 unless you're looking at the dual boot model

This is not a comment about you, it's just a general observation. Considering that setting up a dual-boot system takes minimal time and expertise, if someone decides to spend $500 extra for a dual boot model, do you really think they have the computer skills that would need a powerful laptop? I mean, they can't figure out how to dual boot, but they want to do deep learning? lmao

3

Chigaijin t1_iqu05wy wrote

Razer makes them with input from Lambda Labs, a deep learning company that hosts cloud GPUs and supports on-site builds as well. Lambda provides support and has been very helpful the few times I've needed to reach them. The base model (Linux) is $3.5k with a one-year warranty, $4.1k for the same with a two-year warranty, and $5k for a dual Linux/Windows machine with a three-year warranty. All machines have the same specs, so it's really the support/warranty you're paying for.

2