Comments

AutoModerator t1_j9m89e1 wrote

Welcome to r/science! This is a heavily moderated subreddit in order to keep the discussion on science. However, we recognize that many people want to discuss how they feel the research relates to their own personal lives, so to give people a space to do that, personal anecdotes are allowed as responses to this comment. Any anecdotal comments elsewhere in the discussion will be removed and our normal comment rules apply to all other comments.

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

1

unswsydney OP t1_j9m8a13 wrote

G'day r/science! A team of our researchers, alongside colleagues from ANU and Nottingham Trent University, has developed a proof-of-concept technology that could eventually supersede LCDs and LEDs.

The tech – which has extraordinary light scattering properties – would replace the liquid crystal layer and would not require the polarisers, which are responsible for half of wasted light intensity and energy use in displays.
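To show where that factor of two comes from (a back-of-the-envelope sketch of ideal polariser behaviour, not a calculation from the paper): an ideal linear polariser passes unpolarised light according to Malus's law averaged over all polarisation angles, which comes out to exactly half.

```python
import numpy as np

# Malus's law: an ideal linear polariser transmits cos^2(theta) of
# light polarised at angle theta to its axis. Unpolarised backlight
# is an even mix of all angles, so average over them.
thetas = np.linspace(0, np.pi, 10_000)
transmitted = np.mean(np.cos(thetas) ** 2)
print(f"ideal polariser passes {transmitted:.3f} of unpolarised light")
# -> 0.500, i.e. half the backlight is lost before the LC layer
```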

“Our pixels are made of silicon, which offers a long life span in contrast with organic materials required for other existing alternatives. Moreover, silicon is widely available, CMOS (complementary metal-oxide-semiconductor) compatible with mature technology, and cheap to produce.”

You can take a deep dive into the research paper here: https://www.nature.com/articles/s41377-023-01078-6

69

tornpentacle t1_j9maa6n wrote

Hmm, light scattering? I'm not in this field; does anyone mind explaining what that means in this context? It sounds like it wouldn't yield a clear display, but that doesn't seem to be the case based on the context—hence my curiosity!

14

EMitch02 t1_j9maekl wrote

Would my eyes be able to tell the difference?

24

PPatBoyd t1_j9mdgci wrote

I see these advances as most potentially useful in VR contexts; for normal displays at typical viewing distances, we've basically maxed out useful screen resolution relative to the angular resolution of the human eye.
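To put rough numbers on that (a quick sketch using the common ~1 arcminute acuity figure for 20/20 vision; none of these values come from the article):

```python
import math

# Angular size of one pixel on a 55" 16:9 4K TV viewed from 2.5 m,
# vs. the ~1 arcminute (~60 px/deg) acuity of a 20/20 eye.
# All numbers here are illustrative assumptions.
width_m = 55 * 0.0254 * 16 / math.hypot(16, 9)  # screen width in metres
pixel_m = width_m / 3840                        # width of one pixel
view_m = 2.5                                    # viewing distance
pixel_arcmin = math.degrees(math.atan2(pixel_m, view_m)) * 60
print(f"one pixel subtends {pixel_arcmin:.2f} arcmin (eye resolves ~1)")
```

At that distance a 4K pixel already subtends well under one arcminute, so extra pixels mostly pay off in VR, where the panel sits centimeters from the eye and is stretched across a wide field of view.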

38

u9Nails t1_j9mfpql wrote

When we say 10-fold greater resolution, what are we comparing that with? Do you have a pixel density (PPI) figure that you can measure?

8

jodido999 t1_j9mgxx3 wrote

I wonder how much 8,000k cables will cost?

3

merlinsbeers t1_j9mjcwv wrote

What's "100X thinner?" The switching layer or the whole panel?

Because we already have OLED display panels so thin they can be rolled and folded.

And this doesn't say they eliminated backlighting, just polarization, which is only needed because liquid crystal layers don't block light; they just twist its polarization axis so it's 90 degrees from the polarizing sheet in the next layer.
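A tiny Jones-calculus sketch of that twist mechanism (standard textbook optics, not code from the paper):

```python
import numpy as np

# Jones matrices: polarizer along x, analyzer along y ("crossed"),
# and a 90-degree polarization rotator standing in for the LC layer.
pol_x = np.array([[1, 0], [0, 0]])
pol_y = np.array([[0, 0], [0, 1]])
rot_90 = np.array([[0, -1], [1, 0]])

light = pol_x @ np.array([1.0, 1.0])  # backlight after the first sheet

on = pol_y @ rot_90 @ light   # LC twists the axis -> light gets through
off = pol_y @ light           # no twist -> the analyzer blocks it
print("twist on :", np.sum(np.abs(on) ** 2))   # -> 1.0 (bright pixel)
print("twist off:", np.sum(np.abs(off) ** 2))  # -> 0.0 (dark pixel)
```

The cell only steers polarization; the crossed sheets turn that steering into bright or dark, which is why you can't drop the polarizers without also replacing the LC layer.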

Also, calling something "CMOS compatible" is like calling it "IBM PC compatible." Not the flex it's meant to appear to be.

4

3_50 t1_j9mmd01 wrote

> The tech – which has extraordinary light scattering properties – would replace the liquid crystal layer and would not require the polarisers, which are responsible for half of wasted light intensity and energy use in displays.

5

dingo1018 t1_j9mnb8h wrote

I think it's referring to CMOS manufacturing processes, isn't it? As in, they don't have to sink billions into new fabrication tech; it's materials science that current processes could adapt into the established knowledge base. So they didn't reinvent the wheel, they just made better wheels?

11

mrlolloran t1_j9mqys7 wrote

As a former LED video wall tech: I am so glad I don’t work in that industry. How tf are you supposed to work with something that small in anything resembling a cost-effective manner?

3

cloudsandclouds t1_j9my4lb wrote

> Our metasurfaces are controlled via electrically driven localised transparent heaters that switch the metasurface optical properties by biased voltages <5 V. By applying an asymmetric driving voltage, we achieve flash heating, leading to 625 μs modulation time. It is worth mentioning that such a modulation time is more than 10-fold faster than the detection limit of the human eye (13 ms). Therefore, despite the operational temperature of ~200 °C, it can still be integrated with CMOS devices.

Huh—so does this mean that it’s impractical for something like phone touchscreens? That seems awfully hot.
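A quick sanity check on those numbers (my own arithmetic, not from the paper):

```python
# 625 us modulation time vs. the ~13 ms detection limit quoted above
mod_s = 625e-6
eye_s = 13e-3
print(f"equivalent switching rate: {1 / mod_s:.0f} Hz")      # -> 1600 Hz
print(f"margin over the eye's limit: {eye_s / mod_s:.1f}x")  # -> 20.8x
```

So switching speed looks comfortable; the real question, as you say, is whether a ~200 °C layer can be thermally isolated well enough for something you hold in your hand.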

Really cool work in any case! :)

1

kiwinutsackattack t1_j9myinj wrote

Anytime I see a breakthrough on resolution, all I can picture in my head is two detectives telling a computer guy to keep enhancing grainy security camera footage till they see the reflection of the killer in the side mirror of a car parked across the street.

6

merlinsbeers t1_j9n1yf8 wrote

Yeah. It was a big thing in the 80s. If you could piggyback a CMOS manufacturer's process you could bootstrap a product line easily.

Now it's not that big a deal because the fab equipment manufacturers can deal with exotic processes, and leading edge processes are themselves extremely exotic compared to something generically CMOS.

It's like plugging a full-color display or an automatic transmission as a selling point. Kind of sad.

3

rajrdajr t1_j9nfeol wrote

> calling something “CMOS compatible” is like calling it “IBM PC compatible.” Not the flex it’s meant to appear to be.

CMOS compatibility is quite the flex when considering optical technologies.

6

Wizardof_oz t1_j9nof46 wrote

I’m guessing it works similarly to how a chameleon changes color or how peacock feathers shine blue.

A chameleon’s skin doesn’t show colors through pigments. Rather, it changes its structure to scatter light at different wavelengths, trapping some light while letting some through, which is what produces the colors.

I’m just guessing that’s how this tech works, though. I always wondered why we didn’t go in that direction to create color anyway. Might not be what’s going on here.
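For what it's worth, that structural-color mechanism can be sketched with the usual first-order Bragg condition (generic photonics with made-up numbers, not necessarily what this paper does):

```python
# Structural color: a periodic nanostructure reflects wavelengths that
# interfere constructively, lambda = 2 * n * d at normal incidence
# (first order). Changing the spacing d shifts the reflected color.
n = 1.5                       # assumed effective refractive index
for d_nm in (150, 180, 210):  # hypothetical lattice spacings in nm
    print(f"d = {d_nm} nm -> reflects ~{2 * n * d_nm:.0f} nm")
# -> ~450 nm (blue), ~540 nm (green), ~630 nm (red)
```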

14

zepolen t1_j9ntqsh wrote

What is the heat profile of this? Won't it likely exhibit the same issues as plasma/OLED, i.e. burn-in/burn-out?

1

DigiMagic t1_j9odemj wrote

Your tech does not require polarizers, but it does require heaters. What is the total final efficiency compared to LCDs? And why do you expect that something heated to 200 °C (and constantly re-heated and re-cooled?) will have a long life span?

1

red75prime t1_j9oh8jf wrote

Could you clarify this passage? "We believe it is time for LCD and LED displays to be phased out."

I understand the LCD part, but LED? You still need a light source for your tunable metasurface display.

1

taralundrigan t1_j9oryne wrote

This is what innovation is these days. Lame. Another product. A smaller TV. Who even needs better resolution?

How about products that last? No more planned obsolescence. How about less plastic? How about less consumption in general?

−1

waglawye t1_j9ou0q9 wrote

Make it transparent so we can use windows as TV screens.

1

UniversalMomentum t1_j9ozhgm wrote

I say skip right to holograms. Things like 8k 2D are mostly worthless because our eyes don't care enough about 4k vs 8k. We need other senses to get involved. A TV that produces smells would be a lot more immersive than just 8k. A good sound system will be more immersive than going from 4k to 8k, and the tech looks far more prone to failure.

1

unswsydney OP t1_j9qo157 wrote

Hi there u/tornpentacle, here's a response from Prof Andrey Miroshnichenko, a lead researcher in the Nanophotonics team at UNSW Canberra.

>LCD screens use backlight illumination, and light propagates through a liquid crystal cell before entering our eyes. By changing the properties of the liquid crystal cell, the light can be blocked or not. Here we eliminate the relatively thick liquid crystal cell, maintaining the ability to control light propagation properties, making it thinner and lighter.
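To make the light-budget point concrete (the layer transmittances below are my own illustrative assumptions, not measured values from the paper):

```python
# Rough light budget of a conventional LCD stack: each layer passes
# a fraction of the backlight. All fractions below are assumptions.
lcd_stack = {
    "rear polariser": 0.50,   # unpolarised -> polarised (hard limit)
    "LC cell + TFTs": 0.85,
    "colour filters": 0.30,
    "front polariser": 0.90,
}
t = 1.0
for layer, frac in lcd_stack.items():
    t *= frac
    print(f"after {layer:<15s}: {t:.3f}")
# Only ~11% of the backlight survives; removing the polarisers and
# the LC cell eliminates the biggest fixed losses in that chain.
```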

2

sambes06 t1_j9r40fk wrote

You can almost hear the boners popping at r/virtualreality

1