Comments

Thud2 t1_iyity3k wrote

The big question is how they handle dust

116

jumpsteadeh t1_iyle3dq wrote

They have little fans on them

24

unrealcyberfly t1_iylo7sx wrote

Blowing out dust isn't that much of a problem if you can open the laptop with ease. But that is a big if.

2

Thud2 t1_iyn0ytq wrote

>Blowing out dust isn't that much of a problem if you can open the laptop with ease.

This is true with normal fans and heat sinks, but how do you know that's the case with this type of cooler?

1

Dash_Lambda t1_iyiwqho wrote

Hmm. Looks like it might work on a similar principle to those piezoelectric fans LTT did a video on a while back. I'd be very interested in seeing how they arranged the oscillators to pull air through like that.

There's a lot of technical information about the design that the article doesn't go into. I'm also interested in the heat exchanger, since I would imagine they're generating comparatively lower airflow while also making everything very compact.
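
Back-of-the-envelope (my numbers, not the article's): the heat a given airflow can carry away is roughly

$$\dot{Q} = \rho \dot{V} c_p \Delta T$$

where $\rho \approx 1.2\ \mathrm{kg/m^3}$ is the density of air and $c_p \approx 1005\ \mathrm{J/(kg \cdot K)}$ its specific heat. Moving 20 W with a 20 K air temperature rise works out to only about 0.8 L/s of airflow, so the hard part in a package this thin is likely the heat exchanger that gets the heat into that small airflow in the first place.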

Kind of excited to see if this goes anywhere.

65

Thud2 t1_iyq8x4e wrote

> Looks like it might work on a similar principle to those piezoelectric fans

That was my first thought too.

Edit: Actually, I'm pretty sure it operates on the same principle.

1

[deleted] t1_iyj1gki wrote

[removed]

−60

CoalFries t1_iyj5b0k wrote

You're thinking of Peltier coolers. The piezoelectric fans they're referring to are the ones that vibrate, like you described.

37

Neo_Techni t1_iykc9h5 wrote

> You're thinking of Peltier coolers

Ah, probably. I can't recall names worth a damn.

6

marcocom t1_iyju4g5 wrote

Remember Peltiers!? And how we used to have to Scotchgard the entire mobo to avoid condensation issues?

We’ve come a long way baby

5

ChronWeasely t1_iyj6vqa wrote

"No."

My least favorite way to start a reply. Especially when wrong. Which you are.

Also, electricity converted directly into cold? Sounds like a violation of the first law of thermodynamics. Some heat must be dumped somewhere.

35

Samarium149 t1_iyj9bpn wrote

I mean, technically you can convert electricity into "cold" by using extremely precise laser frequencies and atomic resonances to carry thermal energy away as light.

Cooling a CPU using this method is probably overkill.
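
For the curious, the underlying result (textbook atomic physics, nothing from the article): laser cooling works because atoms preferentially absorb photons opposing their motion and re-emit them at slightly higher average energy, so net thermal energy leaves as light. The floor for this technique is the Doppler limit,

$$T_D = \frac{\hbar \Gamma}{2 k_B}$$

where $\Gamma$ is the natural linewidth of the atomic transition. For typical transitions that's on the order of a hundred microkelvin, which is indeed a touch of overkill for a CPU.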

8

CygnusX-1-2112b t1_iyjddvd wrote

Unless you're trying to overclock a Pentium dual-core to 10 GHz without giving our solar system a second star.

12

flyingtrucky t1_iykhsjv wrote

Peltier coolers have been a thing in missiles since the '70s. They aren't particularly efficient, but they're lightweight.

1

Neo_Techni t1_iykcg21 wrote

> Sounds like a violation of the first law of thermodynamics. Some heat must be dumped somewhere.

No. I thought of that too, but that comes from the generation/transmission of electricity.

Also, I had it mixed up with Peltier coolers:
https://lairdthermal.com/products/thermoelectric-cooler-modules

−2

Dash_Lambda t1_iykmp4a wrote

Are you saying Peltier coolers remove heat by generating electricity? Because they absolutely can if you apply a temperature difference across them, but to force the heat to move they actually have to convert some electricity into heat in the process.

That's the difference between thermoelectric coolers and thermoelectric generators, one uses electrical energy to move heat and the other turns the movement of heat into electrical energy.
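
To put that in equations (basic thermodynamics, not anything from the article): a thermoelectric cooler pumping heat $\dot{Q}_c$ out of the cold side while drawing electrical power $P_{el}$ must reject

$$\dot{Q}_h = \dot{Q}_c + P_{el}$$

at the hot side, so it always dumps more heat than it removes; its efficiency is the coefficient of performance $\mathrm{COP} = \dot{Q}_c / P_{el}$. A thermoelectric generator is the same junction run in reverse: a heat flow driven by an externally applied temperature difference is partly converted into electrical power.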

4

Neo_Techni t1_iylhnjd wrote

> Are you saying Peltier coolers remove heat by generating electricity?

No, I said the opposite: that pumping electricity into them removes heat.

−1

Jazzlike-Patience-15 t1_iyj6ska wrote

You have thermoelectric and piezoelectric mixed up.

5

FireteamAccount t1_iyjapoq wrote

It creates a gradient. One side hot, one cold. It helps to move heat away from the cold side, but overall it's a net generator of heat.

3

Dash_Lambda t1_iyjqdx7 wrote

I actually wasn't aware LTT did a video on Peltier coolers.

Just to clarify: Peltier coolers are heat pumps, they move heat from one side of the plate to the other. This means that like any other cooling system they don't generate cold, they remove heat. That must then be dissipated into the air by a more traditional cooler.

They're more interesting, I think, for power generation. The same principle is used to power the Mars rovers with a brick of plutonium (in a "radioisotope thermoelectric generator," or RTG). It's really cool.

LTT more recently did a video on a piezoelectric fan. If you haven't seen it you should check it out, it's also cool.

4

wateryparsley_18 t1_iyj07r6 wrote

Laptops can get pretty hot these days. However, reducing CPU usage through the power settings helped massively: going from 100% usage to 99% brought temps down from 101°C at max fan speed to 78-79°C on the default profile, and as low as 69-70°C at max fan speed.

35

Golden_Lynel t1_iyj690j wrote

>from 100% usage to 99%

Isn't that just disabling turbo/boost clocks?

22

SoarinPastTheMoon t1_iyjp4gd wrote

I know this sub has a huge anti-Apple bias, but their processors compete with the highest-end CPUs while having a TDP of 10-40 watts and barely ever turning on the fans. That's probably a better approach than redesigning fans.

37

Hattix t1_iyk3kz7 wrote

It's not going to happen, and the reason why comes down to a fundamental difference between AMD64 (x86) and ARMv7/v8/v9.

ARM gets its parallelism at the instruction level (ILP): it has small, simple instructions, very well suited to out-of-order execution. A single thread can expose a lot of parallelism, so the CPU doesn't need a lot of overhead to wrestle the ILP bear. The ease with which ARM (and some other RISC archs, like POWER) gets ILP is part of the reason why mandated parallelism in things like VLIW and EPIC didn't do very well: it just wasn't necessary.

AMD64 is very different: it has all that x86 baggage on it. Instructions have all kinds of modes, dependencies, tags, etc., and this makes them a lot more interdependent than ARM instructions are. So pulling ILP out of AMD64 is a lot more difficult than it is on ARM, and the CPU has to spend a lot more resources doing it. Even then, it doesn't get the same degree of ILP ARM can achieve.

AMD64 gets more of its parallelism from TLP: thread-level parallelism. There's a reason all performance AMD64 processors from AMD and Intel support simultaneous multithreading (SMT/HyperThreading^(tm)): this is where most of their parallelism comes from. SMT shows a significant performance improvement in almost all cases, meaning execution slots are going spare when SMT isn't in use, which further means there isn't enough ILP to saturate the processor's capability.

This isn't usually the case on ARM; most ARM cores are designed to "race to sleep" and fill as much execution resource as possible. If the CPU's awake and clocked up, it darned well better use that time as productively as possible; there's a power budget to worry about!

So, for as long as we're using AMD64, which will be more or less forever as far as the immediate future is concerned, similar performance on AMD64 will always need more power than it does on ARM.
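
To make the ILP point concrete, here's a toy C sketch (mine, not Hattix's): both functions do the same number of floating-point operations, but the first is a single serial dependency chain, while the second gives an out-of-order core four independent chains to keep in flight at once. On a wide core the second typically runs several times faster. Compile with `-O2` but without `-ffast-math`, which would let the compiler re-associate the math and erase the difference.

```c
#include <stdio.h>
#include <time.h>

#define N 400000000L  /* total floating-point operations per test */

/* One serial chain: every multiply-add needs the previous result,
 * so the operations retire one after another no matter how wide
 * the core is -- latency-bound, no ILP to extract. */
static double chain_dependent(void) {
    double x = 1.0;
    for (long i = 0; i < N; i++)
        x = x * 1.0000001 + 1e-9;
    return x;
}

/* Four independent chains: the out-of-order scheduler can overlap
 * them, so throughput approaches 4x the single-chain case. */
static double chain_independent(void) {
    double x0 = 1.0, x1 = 1.0, x2 = 1.0, x3 = 1.0;
    for (long i = 0; i < N; i += 4) {
        x0 = x0 * 1.0000001 + 1e-9;
        x1 = x1 * 1.0000001 + 1e-9;
        x2 = x2 * 1.0000001 + 1e-9;
        x3 = x3 * 1.0000001 + 1e-9;
    }
    return x0 + x1 + x2 + x3;
}

int main(void) {
    clock_t t0 = clock();
    double a = chain_dependent();
    clock_t t1 = clock();
    double b = chain_independent();
    clock_t t2 = clock();

    printf("dependent:   %f  (%.2f s)\n", a, (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("independent: %f  (%.2f s)\n", b, (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}
```

The gap you measure between the two is, roughly, the ILP headroom Hattix is describing; the harder it is for hardware to find independent work like this on its own, the more the architecture has to lean on SMT and other TLP instead.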

33

volcano_margin_call t1_iyknwxw wrote

You’re the guy who created the silicon chip at Apple lurking on Reddit, aren’t you?

16

magick_68 t1_iylogdi wrote

Sorry for my ignorance, but I read that as "we will never switch to ARM because ARM is better". Maybe I need an ELI5?

6

Hattix t1_iym2a26 wrote

ARM is better for low power and performance efficiency, and that's exactly why. It gets more ILP, and ILP is more efficient than TLP (which is more efficient than PLP, but yeah).

AMD64 is entrenched. It's what Windows works on. It's what the entire PC ecosystem works on. The PlayStation and Xbox run on AMD64. Whether we like it or not, it is here to stay.

I deliberately didn't discuss any switch to ARM, as it's almost certainly not going to happen and I was responding to someone who was saying that making more efficient CPUs was a better idea than better fans.

It is, of course, but it isn't going to happen to the level ARM allows it to.

5

Golden_Lynel t1_iyjqf0u wrote

Yeah, I wish we led more with the idea of efficiency in engineering CPUs

3

BusinessBear53 t1_iykhia2 wrote

My guess is it's just the way the companies have headed due to their main products.

Apple mainly deals in phones and tablets while still making some computers on the side, so naturally they'd want to invest more into their mobile platforms, where cooling is passive. Thermal and power efficiency are the primary focus, while trying to bump up speed at the same time.

Intel, on the other hand, deals with systems that have dedicated cooling. Server and desktop coolers get pretty beefy, so they can handle a lot. Raw performance wins, and thermals are slightly less important. Sure, they have CPUs for laptops and NUCs, which lack good cooling, but those tend to get lower-specced CPUs.

I built my new PC last week and saw that the trend in CPUs has done a 180. My first PC was from 2013, when buying an unlocked CPU to overclock for better performance was the norm. Power draw and thermals be damned.

These days the unlocked CPUs boost themselves to a preset limit but pull big power and generate heaps of heat. Now some people undervolt to try to keep thermals under control without sacrificing too much performance.

7

SoarinPastTheMoon t1_iyjqn45 wrote

I really hope so, man. Been running Windows on ARM on my new MacBook, and since I have programs that are x86-dependent, the virtualization layer in Windows on ARM is pretty inefficient compared to Rosetta. Even then, my fans don't turn on.

4

Golden_Lynel t1_iyjqvj3 wrote

Linux

(I know it's a meme, but I use Gentoo btw)

1

SoarinPastTheMoon t1_iyjr3b1 wrote

I would if I didn't rely on Windows-only Excel extensions and school-specific Windows programs, but it looks like we're getting GPU acceleration with Asahi very soon.

2

swisstraeng t1_iyk4au1 wrote

It's not about efficiency; it's a different architecture. The main issue is that all software made for Windows wouldn't be compatible if you changed the CPU architecture.

Yeah, all PC CPUs use the x86 architecture, or more modern versions of it that are backward compatible.

Apple's M1 and newer CPUs don't use x86; instead they use the same architecture as our smartphones, which is more energy efficient but costs more to produce if you want the same computing power.

1

gredr t1_iyk6cuv wrote

If only the companies who spend (and make) billions designing and producing CPUs realized that we would like more efficient CPUs!

−1

SvenTropics t1_iylc7t8 wrote

You can't compare the power usage of an x86 core and an ARM core, because one was designed for high performance and the other for low power consumption. It's also why every cell phone uses an ARM processor.

It's like comparing the fuel economy of a Ford F-350 and a Prius. Sure, if all you want to do is drive around town, the Prius will get you there just as fast, and it'll do it for 1/3 the gas.

2

Disastrous-Spell-135 t1_iym3n7b wrote

> You can't compare the power usage of an x86 core and an ARM core, because one was designed for high performance and the other for low power consumption. It's also why every cell phone uses an ARM processor.

Why not, when ARM chips get similar or better performance? When the output is the same and one consumes far less power for it, why shouldn’t you compare the power usage?

1

rakehellion t1_iz6lfro wrote

So you make the CPU cooler by making it slower? Yeah, that's how they work.

1

ofimmsl t1_iyjl512 wrote

I'm not a fan

12

weedsport69 t1_iykfdot wrote

Personally, I'm blown away by it

5

timg528 t1_iykjjmz wrote

I can't imagine that this kind of cooling solution would be anywhere near cost effective.

Didn't LTT do a video on a hand-sized one that cost over $1k? Imagine trying to miniaturize that and the associated cost increase.

7

Andreiy31 t1_iyklkjv wrote

Pretty sure it only cost that much because of very low supply and low demand. The cost will come down over time if they mass-manufacture it.

2

timg528 t1_iykmmsy wrote

Yes, the problem is getting to that point. You might have a few very expensive laptops as early adopters, but I see it being an uphill battle for them to get the capital to build out the facilities necessary to produce these in quantities large enough to make them affordable.

2

Dash_Lambda t1_iymdmxv wrote

I mean, any new product has that hurdle. The big thing right now isn't how much they have in place; it's how big their potential market is. Those fans LTT covered are made for highly specialized applications with near-nonexistent markets, but these are targeting a wide variety of consumer electronics.

On top of that, if I remember correctly, the article quotes Intel reps saying they're very excited to work with them on these new coolers. If they have Intel's attention, they should be able to expect a certain level of adoption right off the bat.

It happens with lots of things: when there's almost no market, a product is insanely expensive and you have very few choices, but the moment the market opens up, the choices grow and the cost plummets. The products usually get a lot better too, just due to sheer R&D investment.

1

timg528 t1_iymo48t wrote

We'll see.

Personally, I don't think the demand for solid-state fans is enough to overcome the engineering, manufacturing, and marketing challenges.

It certainly wasn't when Purdue tried in 2002, or Thorrn in 2008, or GE in 2012.

2

Dash_Lambda t1_iymrszk wrote

Huh, I didn't know about those previous attempts.

A decade is a long time in consumer electronics, so I don't think attempts from 10-20 years ago mean all that much for a project today, not to mention the importance of marketing, timing, and industry relationships in stuff like this. "Past performance is not a predictor of future results."

That said, I won't claim to have any idea how this particular product will turn out. So yeah, we'll see.

On a side note, I like the more technical information in those sources. I read through the GE one, and the idea of alternating intake/exhaust through one hole makes more sense to me than somehow pulling the air in one direction through a plate, but an arrangement like that seems like it would lose some performance to heat building up at the intake... I'd be very interested to see how a fully developed commercial version works in a laptop.

1

zoinkability t1_iyjt38t wrote

>The cooling system is called the “AirJet” chip, and it comes from Frore Systems, which has begun collaborating on the technology with Intel.

Or you could, you know, use a more efficient architecture so you don't need all that cooling in the first place. For bonus points you get more battery life.

4

g-nice4liief t1_iylwkuw wrote

Create one. And make sure it becomes mainstream and is easy to implement. Maybe RISC-V?

/s

I mean, if it were that easy, I think we would already have a solution or seen the market switch to a different architecture. Apple went from PowerPC -> Intel -> ARM. It ain't easy creating an architecture, let alone maintaining and upgrading it. Even Apple, with all its money, hasn't created its own architecture. So that should say a lot.

1

zoinkability t1_iym82we wrote

If it wasn't obvious, I was referring to ARM.

3

g-nice4liief t1_iymccp6 wrote

ARM won't make the money x86/AMD64 currently makes. As long as ARM can't run legacy code bases or libraries, developers won't make the switch.

Even though the M1 and M2 are great chips, they haven't changed much regarding developers switching or abandoning x86/AMD64.

Even though .NET 7 has native support for ARM, no developer is going to read 200,000+ lines of code to port .NET 3 or .NET 4 functions to .NET 7. That's the sad reality. There needs to be a breakthrough in ARM so it can execute x86/AMD64 code; then we'd have a great future regarding power efficiency.

2

murdok03 t1_iynja00 wrote

You have no idea how much money already goes into fabs and architecture design. You're forgetting the AMD Athlon had a 100W TDP as well.

1

snakebite2017 t1_iyk4jal wrote

Didn't GE have a similar solution?

1

mailslot t1_iylpsbz wrote

Cool. Now every app can be built in Node.js and shipped on Electron.

My fans cry in Slack and VSCode.

1

toshgiles t1_iymput4 wrote

Or design better power management and CPUs.

My M1 Max Mac doesn't have a fan and doesn't get hot even when running Photoshop and Lightroom simultaneously and charging…

1

john-douh t1_iynft8j wrote

Slap this tech on an Nvidia card… voilà! You've got yourself an expensive convection oven! Use it to re-flow circuit boards or even cook some pizza rolls!

1

ethicsg t1_izvnylx wrote

Reminds me of the electrostatic bellows-style one from the 2000s.

1