frequenttimetraveler t1_jdh09hf wrote

Yeah, this is like looking at the Linux kernel binary and seeing cloud patterns in it. It makes zero sense to psychoanalyze a bunch of optimized vectors and pretend to be shamans or medieval alchemists. We'd better stick to scientific arguments about it.

0

frequenttimetraveler t1_jb60ifw wrote

I mean, you left the biggest blocker for last. It's amazing that in 2023 a visit to the doctor still involves measuring blood pressure and 'listening' to your lungs. My guess is the first mass-market medical devices will be pirated from some awkward place because regulators won't approve them for sale. Isn't that the same reason the iPhone can't even measure SpO2?

And then you have the "AI Safety" mob, which will block life-saving devices because they are biased toward the blood samples of rich-country dwellers.

Considering the general lack of progress in how physicians have worked for decades (vs. the progress in drugs and diagnostic devices), it seems these blockers will linger for a while.

Also, consider COVID. Despite billions and billions of cases, relatively few studies have emerged that use the same procedures for measuring indicators, because doctors tend to stick to old, incompatible methods despite the availability of more modern alternatives. Or take long COVID, which, despite billions of cases as well, is relatively understudied: records of cases were not kept, it wasn't even recognized as a condition for a long time, and too many MDs rely on their "hunch".

In short, the medical profession has not embraced AI, which is a requirement.

4

frequenttimetraveler t1_jae8dyo wrote

> Despite AI’s impressive track record, its computational power pales in comparison with a human brain.

This is not true; no human can do what a 1TB model can do. There doesn't seem to be any limit in sight to scaling and extending AI models, as opposed to human and other brains.

Organoids don't have the anatomy, layering and connectivity that our brains do. A giant unstructured clump of neurons is not necessarily smarter (as with elephants or dolphins).

> Dr Brett Kagan of the Cortical Labs

This team used an organoid to 'learn' to play the game of Pong last year, but that paper leaves a lot to be desired. The extent to which it 'learned' (rather than adapted slightly) is debatable, as are the consequences of their findings.

1