UniversalMomentum t1_jackz8p wrote

It's bad news for navigation I suppose, but we have almost no idea if it's bad news for humans. Species have seemingly lived through many polar reversals with no signs of mass extinctions, so probably not too much to worry about.

The polar reversals appear to happen slowly too, so we'd probably have plenty of time to adapt.

I'd say the most likely catastrophe is that the field weakens enough that some electronics get fried and some navigation gets screwed up, but I don't think it happens fast, so it would probably be more like a trickle of problems, not an avalanche of problems.


UniversalMomentum t1_ja9638e wrote

Yes, but it also just so happens that when you're the richest country, you have to outsource labor to make things cheap enough to sell them to almost anyone but yourself.

I think that's the part most people overlook. America could make everything in America, but then not only would we pay more, far fewer places could afford the products and the entire global economy would grow slower.

Before globalism, developing countries' economies were more tightly tied to developed countries' because they were trapped behind the inflated prices of developed nations. Globalism is really just letting them make the stuff themselves so everybody can grow faster, and it's actually rather generous of the developed nations not to lock developing nations into higher costs.

When you get right down to it, globalism has been the biggest wealth redistribution to the developing world in all of human history. The US and EU aren't getting richer relative to developing nations via globalism; they are getting richer more slowly while developing nations grow faster, and for that, developed nations' citizens get some of the cheapest retail buying power in history... on those items. Not so much on healthcare or housing, where globalism doesn't help much.

So you might be barking up the wrong tree when you really stop and think about it all.


UniversalMomentum t1_ja8kgrv wrote

No, it's just like a Rubik's Cube solving program, but fancier. It's just solving puzzles through brute-force data. Realistically, the rate of progress will already boom with just machine learning and human imagination. Real AI doesn't add as much to the equation as you think, OR machine learning adds much, MUCH more than you realize without sentience being even remotely important.

It's like we really just need machine problem solvers, not machines that can argue with us. Humans have more good ideas than we know what to do with, things like automating labor and getting costs down so more ideas become viable is a lot more important to progress than AI will be.

AI modeling the most likely Big Bang sequence or figuring out the true origin of life really isn't super important. Those could stay mysteries forever and we'd be fine; it's the resource management and cost of living that humans need help with, and you need more than brains to fix that... you need LABOR.

It's not like AI is really going to be so smart that it just starts casting spells from inside its datacenter and rewrites the fabric of the universe. You're letting your imagination get the best of you... which is part of the reason our need for AI is somewhat limited.

With an imagination like that, all we have to do is have humans bang out every crazy idea they have, and non-sentient machine learning can puzzle-solve all our bullshit until it eventually makes sense.

We are the AI! The machine learning brute-forces complex puzzles to produce probable answers WITHOUT self-awareness. What more do you need? And good luck investing all that effort into AI just to have it imagine stuff and then use machine learning to brute-force the problem anyway.

AI is when humans get so lazy they don't even want to imagine anymore. Everything else is just robotic automation and better programming. Right now we call better programming machine learning, but at its core it's just better programming that can allow for the inconsistent nature of input in the real world... it can adapt to variations in the data.
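That "adapts to variations in the data" point can be sketched with a toy example. Instead of a program that only handles exact inputs, you fit a trend through noisy measurements so new, slightly-off inputs still get a sensible answer. The numbers below are invented for illustration; this is ordinary least squares, about the simplest possible case of learning from inconsistent input:

```python
# Toy illustration of "programming that tolerates variation": fit a line
# through noisy readings instead of matching inputs exactly.
def fit_line(points):
    # Ordinary least squares for y = a*x + b.
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Noisy readings of what is roughly y = 2x + 1; no single point is "right",
# but the fitted trend recovers the underlying pattern anyway.
noisy = [(0, 1.1), (1, 2.9), (2, 5.2), (3, 6.8), (4, 9.1)]
a, b = fit_line(noisy)
```

Real machine learning does this with millions of parameters instead of two, but the principle is the same: the program absorbs messy data rather than demanding clean input.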

When you do that billions of times per second, cool stuff happens... like video games, or machines that solve puzzles. It's not alive, but it is amazing AT FIRST. After 10-20 years you will think machines that solve puzzles are old news and DUH, that was always going to happen... just like the computer and the internet are just obvious progressions of tech.


UniversalMomentum t1_ja7v6sw wrote

I think for that to be practical you need a lot of robotic automation to raise the standard of living and reduce the need for humans to compete against each other just to survive. Otherwise, what you're saying is pretty much what the UN is already trying to do, but with nowhere near enough resources to make it happen.

We need to lure the global population into such a plan, so we need something to lure them with, and robotic automation lowering the cost of all commodities and labor is the best plan I can think of to reduce greed and give people fewer reasons to fight each other constantly. Otherwise there is constant benefit to screwing each other over, and the more desperate your situation, the bigger the incentive. It's kind of like when we imagine a world where food runs out and law and order falls rapidly with it. That's the wild side of humanity we are dealing with, so we need a way to stabilize people's living conditions so they act sane and predictable rather than desperate and lawless, as we commonly see anytime living conditions deteriorate.


UniversalMomentum t1_ja7tq4t wrote

We don't know how sentience works really. We don't know what animals are thinking. We can barely tell what humans are thinking most of the time!

AI is a process of digital evolution, not hand crafting all the code, so you kind of get what you get. You COULD get an AI that appreciates art but sucks at communication. You could get an AI that just wants to stare at the wall and lick doorknobs. You could get an AI that always invents a new way to get stuck in loops.

It's kind of like throwing a bunch of chemicals into a soup to make life; don't expect to know what you will get once we really reach the point of sentience. Right now I think we are nowhere near that point, and the progress of AI might slow down so much it's not a big deal. We may make great progress in the first 90% and find real sentience is vastly more complex than we thought; we really have no idea at this point. We certainly don't understand how our own brains produce sentience, or even how to define it well, so LOTS of unknowns there.


UniversalMomentum t1_ja7t8m1 wrote

If we program human emotions into a big dataset and keep crunching the algorithms, the result should be something that mimics human emotions so well you can't tell the difference.

We can argue if it really FEELS or not, but from our perspective it should be able to easily mimic all human behavior convincingly. Humans are not THAT complex, rather we tend to all act very similar, so we won't be that hard to mimic.
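A toy sketch of that "crunch the dataset until it mimics" idea. The phrases and emotion labels below are made up for illustration, standing in for the big dataset; real systems use millions of examples and far richer models, but the mechanism is the same pattern-matching with zero feeling involved:

```python
from collections import Counter

# Invented miniature "dataset of human emotions": labeled example phrases.
TRAINING = [
    ("i love this so much", "joy"),
    ("this is wonderful and great", "joy"),
    ("i hate this awful thing", "anger"),
    ("this is terrible and awful", "anger"),
    ("i miss them so much", "sadness"),
    ("this is such a sad loss", "sadness"),
]

def train(examples):
    # Count which words co-occur with which emotion label.
    counts = {}
    for text, label in examples:
        bucket = counts.setdefault(label, Counter())
        bucket.update(text.split())
    return counts

def mimic(model, text):
    # Pick the emotion whose learned vocabulary overlaps the input most.
    words = text.split()
    return max(model, key=lambda label: sum(model[label][w] for w in words))

model = train(TRAINING)
```

Given enough data, the output starts looking like emotional understanding even though it is only word statistics, which is the whole "can't tell the difference" argument.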


UniversalMomentum t1_ja7t1ov wrote

The same way you do everything with machine learning. You provide it with a ridiculously large dataset to build a suitable algorithm from. You don't have to understand every aspect of something, because you're using evolution, not hand-crafting every piece of code. It's just machine-learning digital evolution instead of good old biological evolution.
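A minimal sketch of that digital-evolution loop; the bit-string target, fitness function, and parameters are all invented for illustration. Random variants get scored, the fittest survive, and mutated copies fill the next generation, so the answer is evolved rather than hand-coded:

```python
import random

# Toy "digital evolution": evolve a bit string toward a target pattern.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]

def fitness(candidate):
    # Score = how many bits match the target.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.2):
    # Randomly flip bits; variation is what selection acts on.
    return [1 - b if random.random() < rate else b for b in candidate]

def evolve(pop_size=30, generations=200):
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        best = population[: pop_size // 2]  # selection: keep the fittest half
        population = best + [mutate(random.choice(best)) for _ in best]  # reproduce
        if fitness(population[0]) == len(TARGET):
            break
    return max(population, key=fitness)

winner = evolve()
```

Nobody writes the solution; the loop discovers it, which is why you "kind of get what you get" at real-world scale.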


UniversalMomentum t1_ja7suyc wrote

AI will be evolved through machine learning cycles too, not just hand-made; it will have components and features that were never designed at all. I don't think we will have much certainty about what we are creating at first, and even then we will still lack short- and long-term control over the outcome of this artificial evolution.

More than hand-crafting digital life, we are evolving digital life, which means a lot of it is still kind of out of our direct control and understanding.


UniversalMomentum t1_ja7smoh wrote

If we make enough AIs, then at least one will appreciate humans. One question is how many AIs we will actually make. I think most of you see AI as mass-proliferating. I don't. I think real AI will be few and far between, and not even as useful as plain old machine learning plus robots capable of doing the physical part.

It's really the automation of labor we need, not a brilliant AI to tell us how dumb we are. Knowing things is great, but that doesn't get the actual labor done, and humans are mostly not short on innovation. If anything, our innovation might be killing us. It's really endless cheap labor we need, much more than self-aware AI.

So one question is how many profitable uses many competing AIs will really have. As a consumer, I'm MUCH more interested in Rosie the Robot-level tech with no need for AI. I don't mind fake AI like what Google, Siri, and ChatGPT use to interact with humans more fluidly, but if AI is alive, we can't actually put it into lots of devices.

One scenario that might be common with AI is that you develop it, it shows some promise, and then it devolves into insanity.

There is too much assumption here that AI will be super beneficial soon just because we are making some progress. Often it's the last 10% of any project that takes 90% of the work and time, and I'd say we aren't 90% of the way to AI yet.

That all being said, AI is artificially evolved. This artificial evolution process will create ALL KINDS of different AI types and personalities, and we will mostly not know what we are creating beforehand, because we are using digital evolution and not custom-making every part of the AI.


UniversalMomentum t1_ja7k1hp wrote

No, they can't. I think a fair metric is to judge the rate of robotic progress by the state of robotic vacuum cleaners, and it's not impressive enough that you're going to come anywhere even remotely close to putting 39% of jobs on the chopping block, let alone actually having the robots made in enough surplus in 10 years to threaten those industries as the headline might suggest.

Also, if you're just talking about fake AI/machine learning using code to replace white-collar workers sitting at desks, then you're not talking about robots; you're just talking about better apps.


UniversalMomentum t1_ja50noe wrote

AI isn't what really changes the job market dramatically, though; it's the robotic engineering, because you have to physically be able to do the job.

The brain-power part a lot of the time isn't going to require real AI; it's just a bunch of repetitive actions.

Like, you don't need AI to pick vegetables or pick up trash or do deliveries or mine commodities; you just need endless physical labor.

It's all going to come in waves; industries will adopt automation at different rates, so you really don't have much to worry about anytime soon, and by the time you do have something to worry about, society will probably already be adapting in ways that make your speculation pointless right now.

There's no way you can predict all the new jobs that are created by an emerging technology... and when I say there's no way, I mean you won't even come close, so we can't really speculate about what future job markets hold with enough certainty for it to be anything but misleading.


UniversalMomentum t1_ja24apu wrote

I think it could be good for design in the sense of looking at how a finished product will look in a more human perspective, but for the actual design really all VR does is make your camera angle turn with your head and that's not super useful for most things. Your plain old monitor and mouse will be just as good for design. Better software is really a lot more important than if your interface is monitor/mouse vs VR/VR stuff.

VR is really just a display and input technology. It doesn't add much ability to computers that isn't there with a monitor and mouse. If you had head tracking software that moved the monitor image with your head you would get most of the wow factor of VR just like that.

I worked at an engineering company doing IT, and I don't see how VR would help them. It's a lot of number entry and checking measurements against other records; dual monitors are nice, and many still use paper and do field work, then bring that data into a design tool like AutoCAD. So VR seems like it would get in the way of multitasking so badly it would suck for most real productivity uses. If you think about how most any office works... VR sucks for that.


UniversalMomentum t1_ja1c6j4 wrote

It depends on why you want to learn a language. If you need it just for working a certain job, then yeah, we can remove the need for some jobs to require a 2nd language, but if you want to market your business globally, or live in another country and really get along with people, you probably don't want to rely on just a translator. It would still be very useful, but showing off that you can learn a language or play an instrument builds a kind of confidence in you that using a translator doesn't, so there will probably be value in that forever... also sports... as silly as they are ;)

It's kind of like all humans are in a constant show off contest and that won't ever change much, so some of those benefits will always be marketable and thus in some level of demand.


UniversalMomentum t1_ja0z651 wrote

I think we should lock up violent people much longer so the risk vs reward is much worse for violent behavior across the board.

Gun regulations take forever to have an impact, and you mostly just punish a bunch of people who weren't going to commit crimes while trying to catch the few who might. It's a bit of a sucky strategy with low payoff that causes a lot of pushback. Accountability sounds great, but that's kind of like thinking car insurance would make car accidents rare because you're accountable for your driving. It's easy to say, but how do you make people accountable in a way that changes their behavior BEFORE they do something stupid? That requires them to learn stuff... which means a low probability of success and a very long adoption time to get results.

I'm not against the idea so much as I don't think it will work fast enough to notice much impact, so the investment of effort tends to not produce much result.

I'd rather replace all the CCTV with smart cameras that tie into rapid-response police forces. Basically, if you fuck around in public, there is always a camera that can detect violent behavior or sounds and get the police there, probably eventually with a drone, because that's the fastest.

Camera/mic detects public violence, calls it in, and a drone gets there in like 3 minutes. If that's how it worked, a lot fewer people would be willing to commit crime in public, because a police drone would be on them, recording them and shining lights on them so fast they'd have to stay in the shadows more... which means crime is harder to commit.

We are going to get smart cameras and more cameras anyway, so I don't see any real privacy intrusion issue; just public cameras that automate reporting crime, plus rapid response to get police there so fast it basically scares criminals how quickly they can show up. Once that is set up, it would be a major deterrent for most crime and not cost much at all, AND it works on most crime vs just gun crimes.


UniversalMomentum t1_ja0wx76 wrote

But you're trying to mix terms from now and then, which makes no sense anyway.

I think all these terms are rather generic and don't have precise meanings and never did. A militia and an army can be the same thing and they can also be totally different things opposing each other.

That's what happens when you use terms that don't have much meaning!


UniversalMomentum t1_ja0wm1v wrote

A right is not a declaration of necessity. It's just a limit on legislation. It's not a guaranteed freedom or a duty of citizens. It's ONLY a limit on legislation.

"Congress shall make no law" is the phrase that should ring in your head when thinking about the scope of a Constitutional Right.

Freedom of religion doesn't say religion is necessary; it just tries to equalize the freedom for all who want it. Speech too, really: you aren't required to talk much and exercise your freedom of speech; it's mostly optional. The 2nd Amendment right is the same. It's a limit on legislation to control guns that ensures the OPTION to get a gun remains an option, but still only an option. It doesn't proclaim that all citizens should get guns.


UniversalMomentum t1_ja0o5hm wrote

Most of these licenses are super easy tests, so I doubt their theory.

Even the somewhat serious tests, like those for plumbers and electricians, are open-book. And it's not just about the licensing: that kind of work has to meet code, so you need everybody to be on the same page somehow.

But without a breakdown of which industries they're talking about, the entire concept is basically worthless.


UniversalMomentum t1_ja0nqn1 wrote

I feel like eventually we'll be able to make an archive or copy so precise that it will serve as a backup of your brain, and eventually we will have computers that can render it. Realistically, though, backing up your brain with today's technology is probably not going to create a viable product 100 years or whatever from now, when we actually have the technology.


UniversalMomentum t1_ja0loc7 wrote

I don't see how that is possible or necessary. You have an idealized view of how products are made.

It's more like: someone comes up with an idea and maybe gets it to market, and through many cycles of iteration the product improves, while every new generation of engineers wants their chance to design a product or add new features, and sometimes it works and sometimes it doesn't.

What about the damage it might do to Innovation and Engineering if you plan for your products to essentially never get upgrades?

Sounds like fewer jobs and less innovation to me.

What Is the supposed upside here?

I think if the supposed upside is less waste, you should just plan for robotic automation to clean up most of the waste that your business models can't.

If it's just a way to make better products for consumers, keep in mind a lot of people need that lowest-possible-purchase-price option for the product to be within their comfortable price range.

Just because you make a product that lasts longer doesn't mean people are going to buy it if that also made the product cost more, and normally making a product last longer does cost more.

I can see how there's some motivation to do this, but what you're talking about is a complete logistics nightmare where you also have to take a lot of the decision-making away from the actual companies making the products.

Versus a system where businesses act more independently and make their own decisions within an agreed framework of rules and laws, instead of herding everyone into the same mindset in an attempt to force a result.

So you need a pretty good incentive on these long-lasting products, both for the consumers and for the businesses making them, or else you're essentially asking the government to take over all manufacturing, take the profit out of it, and make products last as long as possible.

Sounds great at first, but keep in mind your innovation cycles would slow and your rate of progress would also go way down when you do that.

Some products are kind of just suited to be disposable because they're changing rapidly or they get used really hard.

Part of the reason we have batteries this good and screens this good is because people bought so many cell phones they theoretically didn't really need.

It improved the innovation cycles, and now we have cheaper screens and batteries for everything else, so the waste did wind up having a payoff that you might be overlooking.