DxLaughRiot t1_ja9dh5c wrote

I’m trying to say we can’t even perfectly figure out something as simple as “don’t sexually abuse children”. That was supposed to be the easy one!

What I’m really trying to say is that it’s naive to think that, with life as messy and complicated as it is, anyone will ever agree on a universal moral framework. Humanity has tried for millennia, and typically what happens when people trot out their new super awesome objective morality is that other people go to war over whether it’s right or not.

DxLaughRiot t1_ja9ar9s wrote

I use trolley problems as an example because they’re un-nuanced, straightforward, and still yield huge differences between supposedly objective systems of ethics. If objective systems can’t agree on something as basic as that - whether the scenario is common in real life or not - how are we supposed to find objectively ethical solutions to even the slightly more nuanced questions in the world?

Even your “we should all agree sexually abusing children is bad” has issues. On the surface, yeah, no duh, people shouldn’t sexually abuse children, but start digging even a little and you start to see cracks in the statement. What constitutes a “child”? What constitutes “abuse”? Ancient Greek philosophers regularly had sex with boys as young as 13 and thought it was ethical as long as both consented. Was that child abuse? The age of consent in Germany is 14 - in parts of Japan it’s 20. Whose legal framework is correct, and why?

If the basis of your ethics is “legal consensus” you’re going to have a hell of a time trying to consolidate a global ethical framework.

DxLaughRiot t1_ja8xz7d wrote

Very fair. Maybe the statement should be that people have lost faith in current tech firms to solve these problems? Or maybe only in some of them, like Meta in particular. I don’t think Meta is about to fix anything, but AI from new tech firms might.

Maybe the better statement is that any tech firm that has arrived at the Desert of the Virtual is on course for its own demise - though I guess that’s pretty circular reasoning. “Any company that isn’t solving real-world problems is going to start failing as a company” seems like a pretty redundant take on capitalism.

DxLaughRiot t1_ja8qrdj wrote

Study philosophy - people have been trying, unsuccessfully, to come up with an objective theory of ethics for thousands of years. We’re not going to suddenly stumble upon one now, especially in a day and age when people can’t even agree that vaccines during a pandemic are “ethically required”.

Just look at how two supposedly objective ethical systems like utilitarianism and deontology answer simple ethical questions like trolley problems. Despite both supposedly being rooted in objectivity, they come up with very different answers to the same dilemmas.

I get that you want to say “education is the answer”, but that just opens up new ethical questions. Who defines what education is “needed”? How does science even play a role in ethics? What happens when there isn’t scientific consensus? And so on.

DxLaughRiot t1_j7r2ndu wrote

If you’re looking for a number, I don’t think anyone can give you something reliable, because the important factors involved here are constantly changing.

What you’re asking is essentially “what is the minimum number of people required to keep society functioning well?” You more or less define “functioning well” as keeping our industries running and progressing, preserving our knowledge, solving major issues like disease and natural disasters, etc. If I bucket each group of people required to tackle these problems into something we call “an industry”, you could naively say the answer is a function of the number of critical industries required for society to operate well (call this NCI, for Number of Critical Industries) multiplied by the minimum number of people required to work those industries (call this MRW, for Minimum Required Workers), and boom, you’re good. The answer is NCI * MRW.

But this doesn’t really work for a number of reasons. Here’s just a few:

  • For NCI, you’d need to explicitly define which industries are “critical”, and that will vary from person to person. The number will probably shift pretty dramatically over time as the world environment changes or new problems crop up
  • The MRW will be different for each critical industry, since every industry requires a different number of people to work it. This means you should calculate it on an industry-by-industry basis
  • The MRW will decrease over time as new technology and practices give way to automation in each industry. So the MRW on an industry-by-industry basis will be changing constantly
  • The size of the required industries will change depending on the total number of people in this society, right? As soon as you decide “ok, we’ll need to add another industry that requires 10000 more people”, those new people now need food, shelter, electricity, and transportation, which means some of those industries now need to grow. Any change affects every other industry.
  • We can never predict what will cause changes to the NCI. You might consider car manufacturing a critical industry, but one day we may develop teleportation and suddenly that industry won’t be needed anymore
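Just to make the naive model above concrete, here’s a back-of-envelope sketch of the NCI * MRW idea, with the per-industry refinement and the feedback loop from the bullets. Every industry name, worker count, and the support fraction are made-up placeholders, not real estimates:

```python
# Hypothetical sketch only - all numbers below are placeholders.

# Minimum required workers (MRW) per critical industry. Calculated
# per industry, since each one needs a different number of people.
mrw_by_industry = {
    "food": 50_000_000,
    "energy": 10_000_000,
    "healthcare": 40_000_000,
    "education": 30_000_000,
}

# Naive first pass: just sum the per-industry minimums
# (equivalent to NCI * MRW if every industry needed the same MRW).
base_workers = sum(mrw_by_industry.values())

# Crude stand-in for the feedback loop: every worker you add needs food,
# shelter, electricity, etc., which grows those industries, which adds more
# workers, and so on. If each person needs roughly `support` worker-fractions
# of service capacity, the series base + base*s + base*s**2 + ...
# converges to base / (1 - s).
support = 0.3  # placeholder fraction
minimum_population = base_workers / (1 - support)

print(round(minimum_population))  # 185714286 with these placeholder numbers
```

Even this toy version shows the problem: pick a different industry list, a different MRW, or a different support fraction and the answer swings by hundreds of millions.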

There are so many factors here that are impossible to account for, as well as totally subjective. I don’t think people can currently estimate the minimum workforce required for even one industry (e.g. how many teachers do we really need? Doctors? Scientists?), let alone make a good estimate on a societal or global scale.

Honestly, whatever question you’re asking seems too abstract and impossible to model to ever really be super helpful.

Given that - idk let’s say 1 billion people

DxLaughRiot t1_iu5dbrh wrote

This is pretty much my take.

I got an Oculus and have been playing with it, and it’s pretty damn fun. There are loads of issues, it’s glitchy af, and right now it’s pretty empty, but overall I see loads of promise. And it’s being built so that others can build on top of Meta’s work and make their own money. They want to build a platform people can make a living off of. It’s huge.

The problem is Zuckerberg has burned all the goodwill people had with him. There’s no way he’s going to build the cool new internet, because his name is just toxic. They really need another face in front of it, or someone who knows how to sell the vision better, because when I hear that lizard robot tell me this is the future, I want to run the other direction.
