Comments

unswsydney OP t1_j9m8a13 wrote

G'day r/science! A team of our researchers, alongside colleagues at ANU and Nottingham Trent University, has developed a proof-of-concept technology that could eventually supersede LCD and LED displays.

The tech – which has extraordinary light scattering properties – would replace the liquid crystal layer and would not require the polarisers, which are responsible for half of wasted light intensity and energy use in displays.
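A quick aside with textbook polariser optics (not figures from the paper, just standard idealised numbers): the first polariser in an LCD stack throws away half of the unpolarised backlight before the liquid crystal even sees it, which is where that "half of wasted light" figure comes from. A minimal sketch:

```python
import math

# Idealised LCD light budget (illustrative only, not the paper's measurements).
# An ideal linear polariser passes 50% of unpolarised light; the analyser then
# follows Malus's law: I = I0 * cos^2(theta).

I_backlight = 1.0                  # normalised backlight intensity
I_polarised = 0.5 * I_backlight    # unpolarised -> linearly polarised: half is lost up front

theta_on = math.radians(0)         # "on" pixel: liquid crystal rotates light to match the analyser
theta_off = math.radians(90)       # "off" pixel: light arrives crossed with the analyser

I_on = I_polarised * math.cos(theta_on) ** 2    # ~0.50 of the backlight, at best
I_off = I_polarised * math.cos(theta_off) ** 2  # ~0.00

print(f"on-state transmission:  {I_on:.2f}")
print(f"off-state transmission: {I_off:.2f}")
```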

“Our pixels are made of silicon, which offers a long life span in contrast with organic materials required for other existing alternatives. Moreover, silicon is widely available, CMOS-compatible with mature technology, and cheap to produce.”

You can take a deep dive into the research paper here: https://www.nature.com/articles/s41377-023-01078-6

69

tornpentacle t1_j9maa6n wrote

Hmm, light scattering? I'm not in this field; does anyone mind explaining what that means in this context? It sounds like it wouldn't yield a clear display, but that doesn't seem to be the case based on the context—hence my curiosity!

14

Wizardof_oz t1_j9nof46 wrote

I’m guessing it works similarly to how a chameleon changes color or how peacock feathers shine blue

A chameleon's skin doesn't show colors through pigments. Rather, it changes its structure to scatter light at different wavelengths, trapping some light while letting some of it through, which produces the colors.

I'm just guessing that's how this tech works, though. I've always wondered why we didn't go in that direction to create color anyway. Might not be what's going on here.

14

Fishydeals t1_j9o7cc4 wrote

I think that's what gives this display technology its supposedly excellent viewing angles. Really not sure though.

3

unswsydney OP t1_j9qo157 wrote

Hi there u/tornpentacle, here's a response from Prof Andrey Miroshnichenko, a lead researcher in the Nanophotonics team at UNSW Canberra.

>LCD screens use backlight illumination, and light propagates through a liquid crystal cell before entering our eyes. By changing the properties of the liquid crystal cell, the light can be blocked or transmitted. Here we eliminate the relatively thick liquid crystal cell while maintaining the ability to control light propagation, making the display thinner and lighter.

2

BellerophonM t1_j9nach4 wrote

What's the overall thermal profile of such a device at screen sizes given that it's based on flash heating elements?

2

financialmisconduct t1_j9nq0dd wrote

These are emissive only right?

Could they potentially be worked into a reflective display?

1

DigiMagic t1_j9odemj wrote

Your tech does not require polarizers, but it does require heaters. What is the overall efficiency compared to LCDs? Why do you expect that something heated to 200 °C (and constantly reheated and cooled?) will have a long life span?

1

red75prime t1_j9oh8jf wrote

Could you clarify this passage? "We believe it is time for LCD and LED displays to be phased out"

I understand the LCD part, but LED? You still need a light source for your tunable metasurface display.

1

UniversalMomentum t1_j9oypsw wrote

An "LED" display is an LCD with LED backlighting instead of fluorescent tubes; they're really just LED-lit LCD displays. The LEDs allow more even lighting, more brightness, and a longer lifespan, in theory.

3

hellure t1_jacf8bt wrote

but OLED? Which is better, this or that?

1

EMitch02 t1_j9maekl wrote

Would my eyes be able to tell the difference?

24

PPatBoyd t1_j9mdgci wrote

I see these advances as most potentially useful in VR contexts; for normal displays, we've basically maxed out the relevant screen resolution, given the angular resolution of the human eye.
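Some rough numbers behind that (my own ballpark assumptions for the screen size, viewing distance, and headset specs, not figures from the paper), taking ~60 pixels per degree as the approximate limit of 20/20 vision:

```python
import math

# Back-of-envelope angular pixel density (illustrative assumptions only).
# 20/20 vision resolves roughly 1 arcminute, i.e. about 60 pixels per degree.

def pixels_per_degree(h_pixels: int, screen_width_m: float, distance_m: float) -> float:
    """Angular pixel density at the centre of a flat screen viewed head-on."""
    pixel_pitch = screen_width_m / h_pixels
    pixel_angle_deg = math.degrees(2 * math.atan(pixel_pitch / (2 * distance_m)))
    return 1 / pixel_angle_deg

# Assumed: 65-inch 16:9 TV (~1.43 m wide, 3840 px across) viewed from ~2.5 m.
print(f"4K TV at 2.5 m:   {pixels_per_degree(3840, 1.43, 2.5):.0f} px/deg")  # ~117, past the eye's ~60
# Assumed: current VR headset, ~2000 horizontal px per eye over ~110 degrees of view.
print(f"Typical VR panel: {2000 / 110:.0f} px/deg")                          # ~18, far below it
```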

38

WilfordGrimley t1_j9mli5m wrote

I could see this tech being used in tandem with foveated rendering in VR contexts to deliver extreme detail with efficient GPU usage.

8

3_50 t1_j9mmd01 wrote

> The tech – which has extraordinary light scattering properties – would replace the liquid crystal layer and would not require the polarisers, which are responsible for half of wasted light intensity and energy use in displays.

5

PPatBoyd t1_j9mo9hp wrote

I suppose with regards to OP's question, they would be able to see the difference in a thinner display with lower energy use, totally!

7

unswsydney OP t1_j9qoihl wrote

Here's what Professor Andrey Miroshnichenko had to say, u/PPatBoyd:

>Great suggestion! Indeed, it would be possible to make curved (or even flexible) screens using such devices.

1

Xe6s2 t1_j9mbur4 wrote

No but your wallet will!

8

upx t1_j9q7jz1 wrote

It’s the thinnest and lightest ever. We think you’re gonna love it!

1

unswsydney OP t1_j9qo87w wrote

Thanks for the questions, u/EMitch02. Here's Professor Andrey Miroshnichenko:

>It depends on the size of the screen. Generally, it should enable sharper images even on large screens.

1

u9Nails t1_j9mfpql wrote

When we say a 10-fold greater resolution, what are we comparing that with? Do you have a measured pixel density (ppi) figure?

8

kiwinutsackattack t1_j9myinj wrote

Anytime I see a breakthrough in resolution, all I can picture is two detectives telling a computer guy to keep enhancing grainy security camera footage till they see the reflection of the killer in the side mirror of a car parked across the street.

6

yoda_jedi_council t1_j9mcseb wrote

So, when will I be able to roll my screen into my bag?

4

merlinsbeers t1_j9mjcwv wrote

What's "100X thinner?" The switching layer or the whole panel?

Because we already have OLED display panels so thin they can be rolled and folded.

And this doesn't say they eliminated backlighting, just polarization, which is only needed because liquid crystal layers don't block light; they just twist its polarization axis so it's 90 degrees from the polarizing sheet in the next layer.

Also, calling something "CMOS compatible" is like calling it "IBM PC compatible." Not the flex it's meant to appear to be.

4

dingo1018 t1_j9mnb8h wrote

I think it's referring to CMOS manufacturing processes, isn't it? As in, they don't have to sink billions into new fabrication tech; it's materials science that current processes could fold into the established knowledge base. So, like, they didn't reinvent the wheel, they just made better wheels?

11

merlinsbeers t1_j9n1yf8 wrote

Yeah. It was a big thing in the 80s. If you could piggyback on a CMOS manufacturer's process, you could bootstrap a product line easily.

Now it's not that big a deal, because the fab equipment manufacturers can deal with exotic processes, and leading-edge processes are themselves extremely exotic compared to something generically CMOS.

It's like plugging a full-color display or an automatic transmission. Kind of sad.

3

rajrdajr t1_j9nfeol wrote

> calling something “CMOS compatible” is like calling it “IBM PC compatible.” Not the flex it’s meant to appear to be.

CMOS compatibility is quite the flex when considering optical technologies.

6

jodido999 t1_j9mgxx3 wrote

I wonder how much 8,000k cables will cost?

3

urmomaisjabbathehutt t1_j9oixsd wrote

worth the cost, perfect for displaying the compressed 720p HD programs from cable TV

1

mrlolloran t1_j9mqys7 wrote

As a former LED video wall tech: I am so glad I don't work in that industry. How tf are you supposed to work with something that small in anything resembling a cost-effective manner?

3

lesssthan t1_j9mvv2i wrote

Fuckin TV wallpaper. Finally.

3

waglawye t1_j9otxxk wrote

Next step, transparent on demand.

2

lesssthan t1_j9owx3q wrote

Nice. :-) I see borderless windows at the flip of a switch. The window is real and there, but you can set the rest of the wall to look transparent whenever you want.

1

UniversalMomentum t1_j9ozhgm wrote

I say skip right to holograms. Things like 8K 2D are mostly worthless because our eyes don't care enough about 4K vs 8K. We need other senses to get involved. For example, a TV that produces smells would be a lot more immersive than just 8K. A good sound system will be more immersive than going from 4K to 8K, and this tech looks far more prone to failure.

1

AutoModerator t1_j9m89e1 wrote

Welcome to r/science! This is a heavily moderated subreddit in order to keep the discussion on science. However, we recognize that many people want to discuss how they feel the research relates to their own personal lives, so to give people a space to do that, personal anecdotes are allowed as responses to this comment. Any anecdotal comments elsewhere in the discussion will be removed and our normal comment rules apply to all other comments.

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

1

cloudsandclouds t1_j9my4lb wrote

> Our metasurfaces are controlled via electrically driven localised transparent heaters that switch the metasurface optical properties by biased voltages <5 V. By applying an asymmetric driving voltage, we achieve flash heating, leading to 625 μs modulation time. It is worth mentioning that such a modulation time is more than 10-fold faster than the detection limit of the human eye (13 ms). Therefore, despite the operational temperature of ~200 °C, it can still be integrated with CMOS devices.

Huh—so does this mean that it’s impractical for something like phone touchscreens? That seems awfully hot.

Really cool work in any case! :)

1

zepolen t1_j9ntqsh wrote

What is the heat profile of this? Won't it likely exhibit the same issues as plasma/OLED, i.e. burn-in/burn-out?

1

waglawye t1_j9ou0q9 wrote

Make it transparent so we can use windows as TV screens.

1

sambes06 t1_j9r40fk wrote

You can almost hear the boners popping at r/virtualreality

1

taralundrigan t1_j9oryne wrote

This is what innovation is these days. Lame. Another product. A smaller TV. Who even needs better resolution?

How about products that last? No more planned obsolescence. How about less plastic? How about less consumption in general?

−1