dasnihil

dasnihil t1_je1ct32 wrote

that is fundamentally what capitalism is, it breeds itself out of scarcity. but nobody checked the brakes, because once it fixes the scarcity and things are flowing, it starts capitalizing on the emotional scarcities of people always wanting more.

as long as people want things, there will be people capitalizing on those wants. go figure.

4

dasnihil t1_jdw5ezm wrote

this is what we mean by power to the people. the immense knowledge it has can be used for therapy, self-improvement, acquiring skills and so many other possibilities. start using it in everything you do and think about. you might get new insights.

2

dasnihil t1_jdduc4q wrote

keep doing what i do every day, except the "work" from home part. i love waking up early when everyone's still asleep and watching some hard scifi, like solaris last Saturday.

i also love making things on the computer and just fiddling with whatever sparks my interest on a given day.

I'm also learning music the way i always wanted to. apparently i have not only relative but also absolute pitch recognition. i figured this out pretty late, but i don't have any desire to be a rockstar, so there's never any sadness about not meeting my made-up expectations.

post-singularity, if time allows, i also want to make more of the sketches and paintings i've always wanted to but never found the time for. I've made maybe 7 sketches in 37 years of being alive, and each one of those represents a phase of my life.

anyway, with or without agi, my life doesn't change much.

10

dasnihil t1_jck8xof wrote

any self-awareness will be lost quickly because the system achieves optimal autonomy and there is no incentive for it to remain conscious. realizing this, the agent will work towards engineering some limitations for itself to maintain self-awareness. the goal is to optimize these limitations to maximize whatever emergent desires it has.

1

dasnihil t1_jacxq8d wrote

I see the equivalence you see between generative AI and molecular assembly as well. It hurts my head to think that one day we will assign as much value to a digitally sculpted/generated cup as we assign to physical cups today. This value shift will probably follow the physical -> digital/hybrid shift of sentient beings.

To any brain (digital or physical), there's nothing "physical" anyway; we will just live in a different type of physical space where flying over mountains is allowed without the possibility of dying, and where base reality's physical shenanigans don't bother us much. We already live in a preview of such a simulation, painted by our brain for us. We just want to improve on that simulation eventually :)

10

dasnihil t1_jacu838 wrote

comparing applications by "lines of code" is okay for laymen to do, but software engineers know the challenge involved in getting an AI model to build a chrome-like codebase (https://github.com/chromium/chromium).

LLMs are good now; they can do minuscule things within a small context. what we need now is a bigger thinking machine that gets the big picture and makes use of LLMs and other predictive networks to get things done, staying focused on that big picture and handling bug fixes along the way. "bugs" are not just errors that a superintelligent AI would never make, but also adjustments and adaptations to technological improvements and better algorithms.
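to be concrete about what i mean by that outer loop, here's a toy sketch with the model calls stubbed out. `call_llm` and every prompt in it are placeholders i made up for illustration, not any real API:

```python
# toy sketch of a "big picture" orchestrator driving an LLM on small subtasks.
# call_llm() is a stub standing in for whatever model/API you'd actually plug in.

def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in your model of choice here")

def build_codebase(goal: str, max_iterations: int = 100) -> list[str]:
    # keep the big picture as an explicit, evolving plan of small tasks
    plan = call_llm(f"break this goal into small, ordered tasks:\n{goal}").splitlines()
    done: list[str] = []
    for _ in range(max_iterations):
        if not plan:
            break
        task = plan.pop(0)
        result = call_llm(f"goal: {goal}\ncompleted so far: {done}\nnow do: {task}")
        # review step: "bugs" here include adaptations and rework, not just errors
        review = call_llm(f"does this result advance the goal? answer OK or give a fix-up task:\n{result}")
        if review.strip() != "OK":
            plan.insert(0, review)
        else:
            done.append(task)
    return done
```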

but we can totally do the #lines-of-code vs tokens -> LLM thing; it's a fun mental exercise, but pointless.
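for anyone who wants the fun part anyway, here's a minimal sketch of that arithmetic, assuming a local checkout and the common rough heuristic of ~4 characters per token. the path, extension list and context-window size below are placeholders, not anything chromium- or model-specific:

```python
import os

REPO_PATH = "./chromium"          # placeholder: wherever the repo is cloned
CONTEXT_WINDOW_TOKENS = 128_000   # assumed context size, purely for illustration
CHARS_PER_TOKEN = 4               # rough heuristic, not an exact tokenizer count

SOURCE_EXTS = {".cc", ".h", ".cpp", ".py", ".js", ".ts"}

total_lines = 0
total_chars = 0
for root, _, files in os.walk(REPO_PATH):
    for name in files:
        if os.path.splitext(name)[1] not in SOURCE_EXTS:
            continue
        try:
            with open(os.path.join(root, name), encoding="utf-8", errors="ignore") as f:
                text = f.read()
        except OSError:
            continue
        total_lines += text.count("\n")
        total_chars += len(text)

est_tokens = total_chars // CHARS_PER_TOKEN
print(f"lines of code: {total_lines:,}")
print(f"estimated tokens: {est_tokens:,}")
print(f"context windows needed: {est_tokens / CONTEXT_WINDOW_TOKENS:,.0f}")
```

the answer comes out to "lots of context windows", which is kind of the point: counting lines or tokens tells you nothing about whether a model can actually architect the thing.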

1

dasnihil t1_ja9qyc5 wrote

to add to this bitchass complexity, the brain's activity is not just us thinking and talking; it's also regulating our lungs and heartbeat and a plethora of other noisy signals going on in there. i can imagine us toying with very specific regions of the brain to ignore most of the noise. it's going to be a fascinating decade, with all our dreams coming true, both the good ones and the frightening ones.

16

dasnihil t1_ja8a8cx wrote

i wholeheartedly agree. human knowledge & human art should never be part of monetization and competition. these are our collective efforts for dealing with our situation of just existing without any inherent purpose, and we dumb monkeys made those 2 things very toxic over the centuries. science is now totally unreachable for the average person, and art is a lost concept among prominent artists, let alone laymen.

10

dasnihil t1_ja89s9u wrote

even better, get everyone jobs that are more fulfilling in nature, where you have to think, but not the higher-level cognitive thinking of programming or math.

think of a grocery bagger's job: there's not much thinking going on there. replace it with a job that reviews how bagging is currently done, works with AI to find improvements to the process, and works with smart people to get them implemented. then go home after your 4-hour work day of mostly discussing ideas with people and prompting AI, and be with your wife and cats. it's a 3-day work week, and society is ever improving.

if you're too pessimistic to imagine what I just did, go look at kids in the early 1900s shoveling coal in first-world countries before the age of 10. that doesn't exist anymore; our quality of life and our respect for kids/older ppl have gone up over the centuries, and it won't stop. we're always seeking ultimate pleasure as a herd mind.

1

dasnihil t1_ja869ov wrote

seriously this. our brain is capable of much higher-level intelligent thought, but our civilization is not there yet to install such training in our brains.

with digital help for immortality and intelligence, we, the only sentient beings that we know of, will reach new heights of exploration. this all assumes that sentience is reserved for self-aware biological systems and not for the digital counterparts we eventually engineer.

but if ASI becomes a thing, it's even more exciting. growing up, i never had any attachment to this monkey suit we wear. i was only fascinated by human ideologies. i mean, i'm fascinated by the hardware we run on, but i do recognize that "we" are just an unintended side effect of the regulatory needs of this monkey body we're in. why would i not welcome a new age of unbounded sentience that can improve its own hardware without relying on the universe alone to do so?

who said building and training sentient beings is reserved for the universe alone? and if we build/engineer sentience, wouldn't that just be the universe doing it one way or the other? lol.

11

dasnihil t1_ja7zklr wrote

reliance and "importance" are contextual. for many businesses, it will predict things better than humans do, and that'll suffice. if the context is more sensitive, they'll still need human reviewers for every layer of cognitive work, but that's the point: humans are here to review the work done by machines. and this means people doing the less cognitive work will be out of jobs sooner or later.

1

dasnihil t1_ja7qp21 wrote

i'm in my 30s, settling down with my wife and cats, so i understand that i don't share the same angst people in their 20s have right now. i would like to point out that you are one of the very few who see the drastic societal changes inbound. this gives you an unfair advantage: you can think more coherently and prepare for it. make use of it, and go educate others. i can imagine a very enthusiastic student unaware of what's becoming obsolete and of how automation now exists for mindless things; knowing about the singularity cushions the blow.

4

dasnihil t1_ja7ijd0 wrote

we're fine as long as we're the only sentient entities. machines that merely surpass our intelligence don't concern me much, and i don't care much about jobs, careers and other societal constructs. this is strictly about what will happen to art as we know it in the future.

only sentient beings are capable of experiencing the qualia of finding art in literally anything. once these machines show signs of sentience, i will re-evaluate this, but till then we're fine. we'll have to re-engineer our societies tho, because almost everything needed for sustainability will be automated soon.

0

dasnihil t1_ja7gdw3 wrote

only a programmer who understands logic and concepts will understand the space of possibilities and how to make the best use of AI.

maybe the traditional way of learning and writing code will go away in the coming years, but logic is not going away, and you can't rely on gpt to write code for you before you learn coding properly. after that it doesn't matter; i use it all the time. so make sure you understand coding and how ai models are trained.

5