Saerain

Saerain t1_je65wah wrote

I mean, it's been improving for billions of humans for centuries, and stands to accelerate even more with all the downstream effects of eliminating aging.

Are you thinking mostly that the decline in population growth may reverse? From the traffic remark, that's what I'm assuming.

Don't think that's very clearly true, though. Birth rates have been strongly correlated with death rates over time. There's a lag in birth rates catching up when death rates drop, as we've seen in the booming African nations for instance, but catch up they do, and faster with higher standards of living.

4

Saerain t1_je5qwgg wrote

The pressure for that to drop rapidly is even stronger than usual for emergent tech. Aging is a tremendous burden on the whole of society. Everyone, especially "the rich" from a monetary standpoint, benefits from minimizing it across the whole population as quickly/effectively as possible.

And that's just the pure economic pressure, never mind the ethical or sociopolitical.

1

Saerain t1_jdzpr2o wrote

Speaking of which, does this use of "sentient" generally mean something more like "sapient"? Been trying to get a handle on the way naysayers are talking.

'Cause sentience is just having senses. All of Animalia, at the very least, is sentient.

Inclined to blame pop sci-fi for this misconception.

1

Saerain t1_jdzp73u wrote

What kind of corporate PR has claimed to have AGI?

As for "near", well, yes. It's notable that we already have most human cognitive capabilities in place as narrow AIs, separate from one another, and the remaining challenge, at least for the transformer paradigm, is going sufficiently multi-modal between them.

0

Saerain t1_j92ux4j wrote

I kind of get ya, but amazing feats becoming commonplace enough to be put to frivolous uses demonstrates accelerating returns well, and such jokes are evocative of that IMO.

Akin to computers going from top secret military encryption-breaking to virtual fantasy worlds. Pretty beautiful.

1

Saerain t1_j4i7uem wrote

Unless existence is viewed as necessarily a state of increasing suffering, I don't see how that could be a bad thing.

And I don't ultimately care more about myself. I want people not to be forced to decay and die against their will; it doesn't matter much if that only becomes possible after my lifetime. Humanity deserves it.

1