
3SquirrelsinaCoat t1_j9d4l6r wrote

I can imagine scenarios. Say we're building a new jet engine. Prototyping is expensive, so automatically we're iterating with a digital twin. Currently that's done through 2D interfaces, maybe augmented reality at best, and nonstop video conferences. That is ripe for improvement. A jet engine means a large engineering team with global assets, depending on which part of the engine is being developed at any one time. And instead of a bunch of engineers standing over an actual piece of machinery, or using computers and talking over the phone, they're in a perfect duplication of a real-world lab, except when they make a mistake or drop something, it doesn't matter, and it also doesn't matter where in the world anyone is.

That's still a little bit ahead of us, but not by much. It's a valid and valuable use case for, idk, call it next-gen engineering. That's one hypothetical where a "metaverse" (which is just a 3D environment with extra sensors) is useful, bringing together AI, VR, advanced computing, haptics, all of it, into a new way of working. That makes sense to me.

What doesn't make sense is asking someone to pay for the experience. Large companies can afford this shit, and if there are breakthrough innovations, I think they will come from the industrial space, funded entirely by R&D.
