Comments


HighTechPipefitter t1_j6xlimi wrote

> raising questions about whether the AI chatbot could one day help write the exam or help students prepare for it.

It's raising a much more important question than that...

46

visarga t1_j6xtpyh wrote

Maybe it can raise research questions and then follow through with experimental testing. That would be neat. Automate science.

14

fuckingcoolshit t1_j6xyj9w wrote

This is where we either lose humanity completely or finally gain it all back. Here's to hoping!

15

povlov0987 t1_j7019wc wrote

I like how they downplay that it will replace all jobs soon

7

JenMacAllister t1_j6x2pso wrote

Do we have to call it Doctor Chat now?

20

crua9 t1_j6xh5er wrote

So here is the thing. There was a report on the news a week ago about how a ton of nurses didn't go to school. They used a degree mill to make it look like they did.

Then during it, the people in charge of the national exam (which you need to pass in order to be licensed; there is no way around it, you have to be legitimately licensed) bragged about how only about 30% of them passed the test and how this proves how hard it is to be a nurse.


But what that told me is that the schooling requirement is set too high as it is and the system is built on pure BS.

18

Carl_The_Sagan t1_j6yh9gy wrote

Why is that your conclusion? Don’t you want your relatives cared for by someone who is educated in the field?

6

crua9 t1_j6yiv4r wrote

Can they do the job, yes or no?

What was said in this report is that there was zero evidence of anyone getting bad treatment.

Anyways, I think in the future when we get humanoid robots we will be able to use them for basic medical care. Maybe even make it so a robot or a more advanced AI can remote in.

4

Carl_The_Sagan t1_j6yot42 wrote

It’s very hard to pin bad outcomes on any one person in a hospital system with team-based care. These systems are very good at obscuring and limiting liability. I personally am not thrilled about people faking credentials when they are directly responsible for others’ health.

6

crua9 t1_j6ytto1 wrote

Keep in mind they are only faking the degree. They still have to be licensed. That is a separate thing and can't be faked. Without it, the person legally can't be a nurse.

I'm not saying training is bad. I just know how backlogged nursing schools are, and nursing degrees go way beyond medicine like any other degree; I couldn't give a flying f if my nurse took an art class or world religion. So it comes back down to: can they do the job, yes or no?

In any case, it is likely that in a number of years this won't be a problem thanks to robotics.

3

em_goldman t1_j6zs1ws wrote

I’m a doctor who works with a ton of nurses -

  1. It’s a vocation that needs to be taught by a human, to a human, mostly in the setting where the work will be done.

  2. It’s important to have a basic understanding of medicine to be a nurse so you don’t accidentally kill someone when something happens out-of-the-algorithm.

Example: someone with too high of a blood sugar needs insulin. So if they’re in a state called diabetic ketoacidosis, they have an incredibly high blood sugar, so you’d think the first step is insulin, right? But insulin causes potassium to move into cells, and people’s total body potassium is super low in DKA because it’s getting peed out, despite their potassium blood levels looking fine. So if you push insulin, you’re going to cause all the potassium to shift into the cells, which can give someone a heart attack.

You can (and do) train people to memorize algorithms like “potassium and fluids, then insulin” (a rough sketch of that ordering is below). But I want the nurses who work with me to be their own smart, critically thinking, educated selves - it’s safer, it’s more rewarding, and your team gives better care.

  3. Most schools nowadays, for anything, are outdated and largely bullshit. My medical school's classroom teaching was disorganized and useless. I learned the material using third-party online resources and Anki. My in-person training is what matters most.
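
To make that ordering concrete, here’s a toy sketch of the “fluids and potassium before insulin” logic in Python. It’s illustrative only, not a clinical protocol; the 3.3 mmol/L cutoff and the function name are invented for the example.

```python
# Toy sketch only, NOT a clinical protocol. The 3.3 mmol/L cutoff and the
# function name are invented purely to illustrate the ordering logic above.

def dka_next_step(fluids_started: bool, serum_potassium_mmol_l: float) -> str:
    """Suggest the next step in a hypothetical DKA checklist."""
    if not fluids_started:
        return "start IV fluids"
    if serum_potassium_mmol_l < 3.3:  # assumed threshold, for illustration
        # Insulin shifts potassium into cells, so a low level (and the hidden
        # total-body deficit) needs repleting before insulin is safe to start.
        return "replete potassium, hold insulin"
    return "start insulin and keep monitoring potassium"


print(dka_next_step(fluids_started=True, serum_potassium_mmol_l=3.0))
# prints: replete potassium, hold insulin
```
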
6

crua9 t1_j70o455 wrote

Based on 3, I'm assuming you agree with me that it is pure BS. The hands-on part is needed. But forcing them to take stupid things like art or whatever else has nothing to do with the job is BS money-making crap with the excuse of being "well rounded".


As for your first part, IDK if AI should teach doctors yet. Eventually I think robots will have to be good enough for the basics. I imagine at some point any humanoid robot you have in your house will double as bottom-level medical care: good for spotting whether there is a problem, dealing with cuts, and so on, but not so much for fixing a broken bone or whatever.

But at some point it will have to be better and adapt. I don't think we are anywhere near this.


Anyways, I can see nurses and doctors at some point being completely replaced by robots in many areas. There are two options.

Option 1: Many hospitals, homes, etc. will have a humanoid robot. A human doctor can remote in through the robot and control it, feel, hear, etc. as if they were there. Maybe even smell, depending on whether the hardware allows for it or the doctor has some kind of brain implant.

Anyways, this makes it so the person can be at home, a hotel, or even the moon, and the doctor can interact with them as if they were there.


Option 2: The AI will keep getting better and better to the point where human doctors will be obsolete.


In both cases, your job is safe for now. But a time will come, just like truck drivers today can see with self-driving around the corner: the ones working today are likely the last generation in that career.

2

monsieurpooh t1_j71aba4 wrote

That is true for a lot of jobs. Go to a software engineering interview at a typical big company. Compare the skills you need for the job vs the ones you're being tested for. Very little overlap.

2

em_goldman t1_j6zt5pf wrote

I’m a doctor and not that surprised; the questions are very algorithmic.

Do I think AI could make medical decisions? Absolutely.

At the risk of comparing myself to Paul Atreides, for which I apologize, being a doctor is like wedding the mentat to the Bene Gesserit - your android side is entangled with your emotion-witch side.

Do I think AI could de-escalate the delirious dude with meningitis who was trying to flee out the door that I saw yesterday? Nope.

AIs would make amazing Matrix doctors for still, sedated patients in tanks.

Do I think that doctors could be replaced by AIs, with the role of physician taken over by a slightly-higher-trained nurse akin to a charge nurse, and all the other nursing staff carrying out the orders that the AI places? I fuckin hope so; then I would have an excuse to retire.

9

monsieurpooh t1_j71a647 wrote

I don't think the AI needs or recognizes the distinction between the emotional and logical side; it's just optimized for a task, and there's no theoretical limit preventing it from being optimized for de-escalation given enough time, training data, and perhaps a robot body if necessary.

2

Carl_The_Sagan t1_j6yh02v wrote

There’s an in-person component to Step 2 which I doubt it can pass without a physical form.

3

em_goldman t1_j6zs9cm wrote

Fuck, is that back?? I thought they got rid of it for good? Any robot with a good-enough skin and an American accent can pass that bullshit thing.

1

Carl_The_Sagan t1_j6zumf1 wrote

I actually have no idea if it's still a thing, but yeah, I agree - mostly an issue of dermal-actuator interfacing.

1

Ale_Alejandro t1_j6z45kl wrote

Weeeeeell… can you really say it did it without cramming when it was trained on all that medical data, plus a heck of a lot more, for who knows how long? Especially since computers process data waaaaaay faster than we do, so in a sense it did a shit ton of cramming.

Good thing it’s not actually sentient or it might have asked for it to stop XD

2