Comments


Reagalan t1_j1m3hux wrote

"I'm sorry. Our ML assistant alerts us that, due to your pre-existing ADHD diagnosis, we cannot prescribe you Schedule II medications, as ADHD carries an elevated risk of developing substance use disorders."

439

carlitospig t1_j1m71az wrote

Or ‘due to your socioeconomic background, you appear to be under elevated levels of stress and are therefore no longer qualified for this type of healthcare. Please pay for your parking on your way out.’

377

EvLokadottr t1_j1pbqpl wrote

"we see that you are a middle-aged female who has survived sexual abuse as a child. Your are at risk of aberrant behavior because of this, so you will have to take ibuprofen and bite a stick whilst the burns on 80% of your body heal."

44

digitelle t1_j1mogcd wrote

Right after you had your wisdom teeth pulled.

Which to be honest was the absolute worst pain. It took a few days for it to even settle in but the sheer throbbing was unbearable.

75

CaptainNoodleArm t1_j1n3mkp wrote

I handled my removal perfectly with ibuprofen (even though the procedure was 2h long). The first night I used a small dose of opioids, just because my doctor had prescribed me only 8 tablets of 200mg ibuprofen (and I'm huge); after my friend upped the dose to 400mg I was pain-free.

16

thebraddestbrad t1_j1obw7e wrote

Not all removals are the same. Mine was so intense I had to be put to sleep and I looked like a squirrel hoarding nuts in my cheeks for a week

26

Mejai91 t1_j1qsycg wrote

Me too, still didn’t take the opioids they gave me because ibuprofen and Tylenol are of a similar efficacy when it comes to pain control, they just don’t get you high

2

thebraddestbrad t1_j1r2cn5 wrote

Not true. Opioids are better at pain control than NSAIDs. There are other options besides OTC NSAIDs and opioids, too, like ketorolac.

This also isn't about who can handle more pain without resorting to narcotics. If you didn't need anything but OTC NSAIDs, don't assume it's because you're more tolerant of pain. You just weren't in as much pain.

1

noiamholmstar t1_j1q3uh8 wrote

And I had 4 impacted wisdom teeth that required surgical removal, and I never took anything but ibuprofen. They prescribed opioids, but I never felt the need to take them. Everyone is different.

1

delirium_red t1_j1pnbr7 wrote

And this is the only thing you get if you are not based in the US. No one is giving you an opioid prescription for teeth removal, and I know it's extremely painful for weeks. Not so painful as to risk addiction, though.

5

CaptainNoodleArm t1_j1pnn32 wrote

Without ibuprofen it was extremely painful, and they were kinda stingy with that. I had to take them for 4 weeks; without them the pain was so nagging that I was sick and couldn't eat all day. But as you say, in America you get opioids like candy.

2

Bruc3w4yn3 t1_j1q46lv wrote

I believe that prescription ibuprofen is the same as the OTC version, just more concentrated per tablet. Basically, OTC is 200mg and prescription strength is 800mg, so taking four OTC tablets is the equivalent of one prescription tablet. The prescription dose has all of the same digestive and kidney complications as taking more than the recommended OTC dosage, but it's generally considered better not to advise people to exceed the labeled dose of easily accessible medications, because I guess they fear that increases the likelihood of future abuse.

1

CaptainNoodleArm t1_j1qakcv wrote

I know, but if I'm in pain it's not working properly. Also, ibuprofen abuse happens, but it's far less dangerous than opioids. And my friend was/is a doctor.

1

Mixedstereotype t1_j1pxwew wrote

I must be a mutant, because I felt no pain after mine, and I had 9 teeth removed in one sitting.

2

BrandynBlaze t1_j1nuqty wrote

To be fair they do that now without ML. I told my doctor I used marijuana in college because I thought it was important to be open with them and then 2 years later they denied me pain medication after an injury because of “substance abuse concerns.”

54

james_d_rustles t1_j1oylby wrote

A lot of people have this issue; most people just know that you can't actually be honest with your doctor anymore. I have friends who told their doctor they smoked some pot in high school, and at 28 they still won't prescribe them pain meds after various surgeries, won't prescribe them ADHD meds even with a diagnosis dating back decades and a long history of similar prescriptions, etc.

Straight up, do not be honest with your doctor unless you genuinely need help with something. Don’t tell them if you smoked weed a few times, or drank more than 5 alcoholic beverages at a party that one time. It’ll only make your life a nightmare years down the line when you need medicine and can’t get any.

24

linksgreyhair t1_j1p87iv wrote

Except: do be honest with your anesthesiologist because not disclosing drug use can cause horrible stuff, like you waking up during surgery.

This is probably obvious but I’m not talking about stuff like smoking weed a few times in college, but be honest about your amount of current drug and alcohol use and any history of very heavy use.

23

james_d_rustles t1_j1pbd3m wrote

Of course. It's really a shame that it's set up like this, because you're right, there are times that your doctor truly needs to know. But it's a shame that it's up to the patient's best judgement what should be disclosed or not, instead of simply being able to tell your doctor the truth all the time without fearing negative future consequences.

We really need a complete overhaul of the way we view drug use and drug addiction; it seems like the standards the medical industry follows were written by 1930s Mennonites. Nobody should have to worry about their ability to receive necessary medications 10 years down the line because they smoked some weed in college or drank too much a handful of times. Past moderate drug/alcohol use should not bar a person from various prescription medications.

8

Mejai91 t1_j1qsjhf wrote

Friendly reminder that adderall and oxy are not necessary medications

−2

delirium_red t1_j1pn6ta wrote

Opioids are prescribed routinely after surgery in the US, much more than actual indications require. I don't think that's the case in other Western countries; they usually wean you off in the hospital and you get something like diclofenac (Voltaren) for at home (unless it's a chronic condition / end of life, of course). So maybe the ones denied are the lucky ones.

0

Mixedstereotype t1_j1pxrl6 wrote

I refused a tramadol injection because I didn't feel I needed it (resetting a broken arm), but the doctor wouldn't relent until I said, "Family history of addiction." So now every time people check my chart they look at me like I'm an addict.

2

Mikey637 t1_j1nqxus wrote

I mean, I get it. Years ago I was spiralling with prescribed codeine use to numb both back pain and ADHD symptoms; now I refuse the prescription and use over-the-counter doses (half strength, mixed with either ibuprofen or paracetamol; the fear of stomach lining/liver damage stops me overdoing it).

Even without my back pain I'd struggle to keep use below once a week; they make me calm and collected and stop most of the weird “possibly more than the ADHD spectrum but further testing required” triggers and other behaviour.

Thankfully I am in the queue for further testing, so I can hopefully stop the poly usage and live a more productive lifestyle. Life sober is difficult, to the point that I'd rather continue being poor with a substance problem, even having spent plenty of time sober.

13

whoamvv t1_j1ozdt1 wrote

Why in the hell were you using codeine to treat ADHD?

How did you get off the codeine? I mean, if you simply pulled yourself off it because you realized it was becoming an issue, I don't think that counts as substance abuse.

6

Mikey637 t1_j1q3oxb wrote

It numbs the symptoms more than it treats them. I pulled myself off while it was becoming a problem, before it actually became one. I didn't want to ruin the only thing that helps my back come out of spasm, but even still it's a constant effort not to overindulge.

Luckily one of my things is having control over myself so I’m constantly making sure I eat healthy food and don’t destroy my body with substances, so many others aren’t so lucky.

2

whoamvv t1_j1q4vob wrote

Damn, I just want to say congrats. Sounds like you are a truly good person, and we definitely don't have enough of those these days.

1

Ok_Dog_4059 t1_j1o82zg wrote

I've had too many times where the doctors and I couldn't explain why something was happening, and the doctors either used their best judgment or had known me long enough and well enough to trust what I was telling them. I can imagine AI just saying "this shouldn't happen, so leave" when it's actually some fluke incident that not even trained professionals can explain.

7

whoamvv t1_j1p1vmv wrote

Yeah, here I was thinking it might actually help us because it would weed out the addicted and dealing NTs and allow those of us who actually need it to get it. But, now I realize that you are right and it's just another way to screw us.

5

hippolover77 t1_j1o5st9 wrote

Except they will prescribe you stimulants. Took me years to realize I was dependent on Adderall.

2

Mr_Venom t1_j1m7qla wrote

Take the model out of that example for a second. Not prescribing drugs to people because of contraindications is a good thing.

−52

Masark t1_j1mc338 wrote

Except the contraindication is the condition being treated.

47

Mr_Venom t1_j1mcqcj wrote

Well, yes. There's lots of treatments that would technically solve a problem but cause too many problems to be viable.

−46

theoccasionalempath t1_j1mwx76 wrote

Every treatment has risk factors, so we're just supposed to let people suffer in pain, even though we have the solution?

15

Mr_Venom t1_j1nea2a wrote

Someone could "cure" my depression with a lobotomy, or a handgun.

1

FailOsprey t1_j1nf8s6 wrote

Unfortunately, opioids are not an effective long-term solution for pain. They feel good, and anyone on them long enough will swear by them, but most objective measures show they create more problems than they fix.

−2

Well_being1 t1_j1nq816 wrote

Tolerance develops to their effects, as with almost any other medication. That they feel good is not a problem.

4

FailOsprey t1_j1nty56 wrote

The euphoria isn't a problem per se, but it will bias the patient in favor of more opioids. Drugs that modify the dopamine system have a tendency to skew opinions in favor of continued use.

For susceptible individuals, these changes can be more or less permanent. The damage was done the minute they filled their first prescription; instead of withholding opioids from those who've already been exposed, it makes more sense to prevent exposure in the first place.

... without meaningful regulations, doctors use these properties to create patients for life. A patient on opioids is much better at scheduling appointments than one on ibuprofen. Given a lack of immediate consequences, even the most well-intentioned doctor is susceptible to large enough sums of cash.

−2

Devil_May_Kare t1_j1phwea wrote

If the level of opioid signaling in an opioid user's brain weren't higher than a non-user's, there'd be no driving force to maintain tolerance.

1

Masterlyn t1_j1nicl4 wrote

So if an AI tool becomes advanced enough to reliably predict that prescribing a patient opioids has a 100% chance of leaving them with substance use disorder, you believe the doctor should just go ahead and prescribe them the drugs?

−2

james_d_rustles t1_j1p044o wrote

Every patient is different, and that’s why we leave these decisions to doctors who know each patient’s specific situation. Sometimes potential substance use disorder is by far the lesser of 2 evils.

Say patient 1 has a 100% chance of developing substance abuse disorder. Patient 1 has also just been hit by a train. They’re on the verge of death, they’re peeing the hospital bed crying, asking for god to put them out of their misery because of the unthinkable amount of pain they’re in. There’s a real chance that they’ll die soon.

Patient 2 also has a 100% chance of developing substance use disorder. Patient 2 says that they have mild lower back pain after they get home from their office job. They have no other medical problems, and they live a normal, well adjusted life.

Using your reasoning, both patients should be denied painkillers. Do you think that is a sound medical decision?

Every case is different, and every medical decision carries various risks and trade offs. It’s between a patient and their doctor to decide which trade offs are worth it, which aren’t. You’d be crazy to say that the amount of harm done by a touch of back pain is greater than the amount of harm done by a long term opiate addiction, but what about patient 1, who was hit by a train? They may or may not live to see the next month - don’t you think that the trade off for that patient when looking at a potential opiate addiction would be a little bit different than the patient with slight back pain?

Every single medical decision is like that, to varying degrees. Some decisions are easier than others, and some carry with them much less risk of harm, but nothing is free of side effects or risks. Opiates are no different. Leave the doctors to make the decisions that they’ve been trained to make, looking at individual patients and circumstances.

3

Devil_May_Kare t1_j1phr7v wrote

I think in cases like this, it should be up to the patient whether the risk is worth it for them.

1

wrathtarw t1_j1m3ij4 wrote

Things like this are not the helpful tools they appear to be. They are only as good as the researchers and data used to train them, and they are significantly biased by both

The opiate crisis is a disaster, but it has also created significant problems for people who have never abused their medicines and need them to be functional.

https://internationalpain.org/how-the-opiate-crisis-has-affected-chronic-pain-sufferers/

https://www.sapiens.org/biology/chronic-pain-opioid-crackdown/

306

carlitospig t1_j1m77qa wrote

Amen. It made a lot of established pain patients reduce or entirely cease their pain management protocols. Still left with their pain, they went to the street for help. And now they're fentanyl addicts.

This whole thing is such a mess.

118

bob0979 t1_j1nfe99 wrote

I was in an inpatient rehab with a D1 college soccer player in the US. He broke his ankle, got on opiates, took them as prescribed, and never made it back on the field because his scripts ran out and he still hurt.

24

carlitospig t1_j1nfwon wrote

Aww poor kid.

5

bob0979 t1_j1ngeio wrote

The worst part is that it's sort of his fault, but that's just about an impossible situation to win. I know I'd have failed there. I don't know many people who'd have been capable of running their lives correctly in that situation. It's not really his fault, at least not entirely. He was failed by society and couldn't fix it himself.

12

carlitospig t1_j1ngmsl wrote

You’d think the uni would bend over backwards to make sure he could get on the field again, the best specialists, etc. That’s really sad. I wonder if it still pains him today.

8

pseudocultist t1_j1ohsgx wrote

I couldn't get even a short course of painkillers when I had a facial abscess last year. I wound up having to take my dog's medications for several days. The dentist was like, "see you didn't need me to prescribe you anything."

I am a recovering addict. I know how to get street drugs. Please, please, please don't leave me hanging when it comes to pain management. I need to be able to trust my doctors to do this for me.

17

Spacefungi t1_j1n8932 wrote

Yeah, if you want to reduce addiction, give everyone good healthcare so people don't self-medicate and give people a good standard of living so they don't turn to substances to forget about their hardships.

30

Puzzleheaded-Ad-5002 t1_j1nskx2 wrote

Agreed. Same with ADHD medications. A young person (even at 35) with a history of ADHD can go to 2-3 doctors and be treated like they're there to get something to complement some IV meth.

I hate that my health records are electronic and shared everywhere. And it's been backed up many times that people whose ADHD goes untreated are much more likely to develop a drug issue or to relapse.

10

juggles_geese4 t1_j1oe2hq wrote

I think I'm lucky that my doctor treats me with respect and not like a drug seeker or addict because I'm on meds for ADHD. It probably helps that I don't ask for them early or anything. She initially sent me to a psychiatrist to help. I thought psychiatrists were the type of doctors you met with for therapy, who prescribed meds as appropriate. No. She only serves the purpose of prescribing me meds.

She tried to prescribe me Klonopin at a high dose, multiple times a day, because I have bad anxiety too. I had actually tried that with my doctor, and all of those types of drugs make me too tired and drunk-feeling, and I'm a funeral director who literally can't take that type of medication every night since I'm on call a few nights a week. She forced me to try them, so I did.

She suggested a different med for ADHD at one point, and I thought she and my doctor were in communication, so I told my doctor her suggestion and she prescribed it, agreeing with the reasoning. The psychiatrist got pissed that I had asked her to prescribe something and then went on a rampage about the other meds I'm on. I'm on a med for restless legs and have been for like a decade. It has the benefit of quieting my brain at night but doesn't make me groggy like Xanax and the like; it's a Parkinson's medication. At that point she decided that my doctor wasn't going to be prescribing any of my meds from now on, that it wasn't the Klonopin making me so tired but the med I take for restless legs, and that I was going to stop it.

Ok, lady. You want to take me off a medication that makes me physically ill when I miss a dose, not to mention makes me go literally insane because my RLS kicks in full force from withdrawal (it's not a controlled substance, but stopping it suddenly is not a good thing, like many meds), to put me on one that is controlled and that I say I can't be taking if I'm going to keep my job? The Klonopin actually made my restless leg syndrome worse, for whatever reason. As someone with ADHD, wouldn't it be best not to needlessly take controlled substances that aren't doing what I need them to do?

I never went back to her. I told my doctor about it and she was really upset that that was her plan. I'm not even sure the psychiatrist can be making medical decisions like taking me off meds that I take not for mental health issues but for my physical health. It was utterly insane. She actually made me feel like a drug seeker, not just for asking my doctor for Concerta but for the RLS meds I took. I was really baffled by her for so many reasons.

Now I just see a therapist and my doctor, and we went through a bunch of trial and error to find the best combo of meds for my ADHD and anxiety until we found what is currently working best. A lot of my anxiety comes from ADHD anyway. Point being, finding a doctor that doesn't judge is most important. You'll have to see other doctors along the way, but their opinions only mean so much.

5

Puzzleheaded-Ad-5002 t1_j1s86gn wrote

You seem to have better communication skills and assertiveness with your doctors than I have had.

I have always had major issues with RLS as well! I’ve been on a pretty high dose of Gabapentin for about 10 years. It helps with insomnia / quieting my brain quite a bit as well.

I used to be on a high dose of Adderall for several years. External events led to me stopping it and switching to kratom. Kratom worked fantastically for me for about a year, but then the quality dropped and I started taking more than a moderate dose. After another 8 months I unfortunately became very physically dependent on it, and when I told my doctors, they cancelled everything except the gabapentin.

I haven't taken kratom for 2 years, but my doctors treat me like I'm trying to become a homeless meth user, rob banks, and lie to doctors to get ADHD meds. I had to “restart” the process of finding a workable med and dose. It took over a year before I was trusted with Concerta. I just switched to Focalin and I've found it to be better for me, with less of the “edginess” I'd sometimes get with Concerta.

I wish you the best on this journey. It sounds like we have shared a lot in common with our experiences with doctors!

1

juggles_geese4 t1_j1sg2x3 wrote

It's interesting that kratom gives doctors such a drug-addict vibe when it's a supplement. That doesn't make it less addictive or okay for you, but it does make it easier to stumble into. You'd think they'd be a little more willing to see that you aren't necessarily an addict just because you're dependent on it.

I’m glad they trust you again with different meds. It’s interesting that you also struggle with RLS. I wonder if that’s common in people with ADHD or what.

2

Puzzleheaded-Ad-5002 t1_j1snzg7 wrote

My relationship with doctors changed significantly for the worse once I opened up about taking Kratom.

I really appreciate hearing someone say they would hope doctors would see someone who started taking kratom because they initially thought it was the healthier option as different from someone taking a drug with a totally different reputation. I wish my doctors felt the same way!

You are spot on about ADHD and RLS. Both RLS and PLM are known to be more common in those with ADHD, and the theory I buy into the most is that it's due to how our brains have a harder time supplying a consistent amount of dopamine.

2

juggles_geese4 t1_j1soxoc wrote

I'm sorry your relationship with your doctors changed so much. That sucks a ton. I have a hard time with new doctors, so I'd struggle big time if something like that happened. I hope things get better, or at least that you don't have too much of a struggle with your ADHD and other health issues, mental or otherwise.

I’m going to have to look more into the connection. I don’t often talk about RLS so I never really heard anyone else mention it.

2

Hydrocoded t1_j1pe10e wrote

The opioid crackdown is one of the most cruel things I have witnessed in my lifetime. I’d rather see heroin legal and unregulated than deal with our current system.

5

pharmaway123 t1_j1ntu65 wrote

can you elaborate a bit? How would biases in the nationally representative claims data (or the researchers) here make this model less useful?

2

wrathtarw t1_j1ocd3y wrote

The same bias that is present in the medical system gets programmed into the algorithm. The way machine learning works is that it essentially condenses information from the source data and then uses it to determine the output. Garbage in, garbage out…

If the source is flawed so too will the algorithm: https://developer.ibm.com/articles/machine-learning-and-bias/

And the source is flawed: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8344207/

5

pharmaway123 t1_j1on8ck wrote

Right, and I'm asking, in this specific instance, given the rank-ordered feature importance from the study: how, concretely, would bias impact the results from this model?

−1

wrathtarw t1_j1onilg wrote

Sorry, Reddit karma doesn't pay enough to do that analysis for you.

3

pharmaway123 t1_j1oqzaz wrote

Yeah, I figured it was just a nice sound bite without any actual thought behind it. Thanks for confirming.

−5

croninsiglos t1_j1lsywg wrote

“… sociodemographic information”

There it is! Then they go on to claim it’s predicting and not labeling.

Yet, if this informs prescribing then you’ve automatically programmed bias and prejudice into the model.

171

fiveswords t1_j1lwknk wrote

I like that it predicted "high-risk" at 86% accuracy. It means absolutely nothing statistically. If someone is high risk and NOT an addict, is it still an accurate prediction because they're only predicting the risk? How could it even be wrong 14% of the time?

59

pharmaway123 t1_j1nubt8 wrote

If you read the paper, you'd see that it predicted the presence of opioid use disorder with 86% balanced accuracy (sensitivity of 93.0% and specificity of 78.9%).

6
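For anyone checking the arithmetic: balanced accuracy is simply the mean of sensitivity and specificity, so the 86% headline follows directly from the two figures quoted above. A minimal sketch in Python, using only the numbers from the comment above:

```python
# Balanced accuracy is the mean of sensitivity (true positive rate)
# and specificity (true negative rate).
sensitivity = 0.930
specificity = 0.789

balanced_accuracy = (sensitivity + specificity) / 2
print(round(balanced_accuracy, 2))  # 0.86 -> the reported "86%"
```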

poo2thegeek t1_j1m1f3k wrote

There are probably definitions for what “high risk” means. Maybe, for example, “high risk” means 90% of people in that group overdose within 6 months. These definitions are obviously decided by the person creating the model, and so should be based on expert opinion. But predicting “high risk” correctly 86% of the time is pretty damn good, and it's definitely a useful tool. However, it probably shouldn't be the only tool. Doctors shouldn't say “the ML model says you're high risk, so no more drugs”; instead, a discussion should be started with the patient at that point, and then the doctor can make a balanced decision based on the ML output as well as the facts they've got from the patient.

−1

Lydiafae t1_j1lybua wrote

Yeah, you'd want a model at least at 95%.

−7

Hsinats t1_j1lzbpr wrote

You wouldn't evaluate the model based on accuracy. If 5% of people became addicts, you could always predict they wouldn't and get 95% accuracy.

17
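A minimal sketch of that point, assuming scikit-learn and a made-up 5% addiction rate (illustrative only):

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.metrics import accuracy_score, recall_score

# Hypothetical population in which 5% of people develop the outcome
rng = np.random.default_rng(0)
y = (rng.random(100_000) < 0.05).astype(int)
X = np.zeros((len(y), 1))  # features are irrelevant for this baseline

# "Always predict the majority class" baseline
baseline = DummyClassifier(strategy="most_frequent").fit(X, y)
y_pred = baseline.predict(X)

print(accuracy_score(y, y_pred))  # ~0.95 accuracy...
print(recall_score(y, y_pred))    # ...with 0.0 sensitivity: every real case is missed
```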

godset t1_j1mdxec wrote

Yeah, these models are evaluated based on sensitivity and specificity, and ideally each would be above 90% for this type of application (making these types of models is my job)

Edit: the question of adding things like gender into predictive models is really interesting. Do you withhold information that legitimately makes it more accurate? The fact that black women have more prenatal complications is a thing - is building that into your model building in bias, or just reflecting bias in the healthcare system accurately? It’s a very interesting debate.

4
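For readers unfamiliar with the terms: sensitivity is the share of true cases the model flags, specificity the share of non-cases it correctly clears. A small sketch with made-up counts (not the study's data), assuming scikit-learn:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical labels and predictions, purely for illustration
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 0, 0, 0, 0, 0, 1, 1])

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)  # real cases that were flagged
specificity = tn / (tn + fp)  # non-cases that were correctly cleared

print(sensitivity, specificity)  # 0.75 and ~0.67 for these toy numbers
```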

Devil_May_Kare t1_j1pkczk wrote

And then no one gets denied medical care on the advice of your software. Which is a significant improvement over the state of the art.

1

andromedex t1_j1lynho wrote

Yeah this is really scary. What's even scarier is to wonder if it's reinforcing the exact biases that it's founded on.

29

carlitospig t1_j1m6oty wrote

Yes. Yes it is, which is why we’ve been screaming about bias for years. Yet they keep not addressing it and instead write articles like ‘look how great this is!’ instead of ‘look at all the power we are giving to our own biases!’

26

andromedex t1_j1m8pv9 wrote

People just think of AI as a magical black box.

12

InTheEndEntropyWins t1_j1lycqt wrote

There was another study that showed that ML can determine race from body scans. People were like, so what, it's not an issue.

The problem is when the ML just determines you are Black from a scan and is then like, no painkillers for you.

20

Azozel t1_j1lyvdz wrote

I don't even know why they thought they needed to do it this way. I recall reading an article a couple of years ago that said they had identified genes that reliably indicated whether a person would be likely to become addicted to opioids.

0

carlitospig t1_j1m6u5g wrote

But even then folks with those genes deserve pain management care too. Needlessly suffering because your grandfather was an alcoholic is just cruelty wrapped in a ‘care’ bow.

19

Azozel t1_j1mx8wc wrote

Of course but then docs would know to monitor you more closely

2

linksgreyhair t1_j1p8stn wrote

I stopped telling my doctors that my mother was an addict for this exact reason. They immediately start side-eying me.

Too bad it’s still somewhere in my electronic records forever so I’m sure the damn algorithm already knows.

2

something-crazier t1_j1lwdnn wrote

I realize ML in healthcare is likely the way of the future, but articles like this one make me really worried about this sort of technology

159

BlishBlash t1_j1mq46q wrote

Don't let anyone gaslight you, it WILL be as bad as you think. Probably worse. Anything to make the most money possible at the expense of patients and medical workers.

36

poo2thegeek t1_j1m10fi wrote

Agreed. ML is the future, but it needs significant legislation to ensure it's safe. ML should probably just be used as an aid, and not as a final truth.

34

UnkleRinkus t1_j1n2xsy wrote

If you think Congress's attempts at regulating social media were disastrous, wait until they try to regulate applied statistics and model fitting. You can't usefully regulate something you don't understand.

19

TurboTurtle- t1_j1p3p4g wrote

Of course. Why try to understand something when it’s so much easier to just accept loads of money from your favorite mega corps?

2

Hydrocoded t1_j1pdxyr wrote

They already regulate the medical system and look how wonderful that has turned out.

Lawmakers ruin everything they touch.

1

Subjective-Suspect t1_j1pndnb wrote

True story: I was threatened with police intervention by my doctor's nurse for trying to get a refill of hydrocodone the day before Thanksgiving.

I had pinched a nerve the previous week and was in substantial pain. I knew I'd run out of meds over the long weekend, so I called. They assumed I was already out of medication and accused me of abusing it. I went by the office with the partially full bottle, to no avail. The nurse and another staffer (a witness) pulled me into a room. They refused to listen or to examine my med bottle. That's when they threatened to call the cops if I didn't leave immediately. I left and went straight to urgent care. Prescription given.

I booked my next—and final—visit to my doctor to tell him how furious I was to be dismissed, threatened, and effectively left in pain for days. I told him I was never coming back and that they were damn lucky that's all I intended to do. He claimed no knowledge of the whole ugly situation. As if.

4

faen_du_sa t1_j1m3qts wrote

Indeed. I'd imagine it would be extremely helpful in pointing to where to look in a lot of cases. It'll probably be a while before we can rely on it exclusively, though; I'd also imagine that it's a territory of responsibility hell. Who gets the blame if someone dies because something wasn't discovered? The software team?

Pretty much all the same problems that arise with automated cars and insurance.

6

poo2thegeek t1_j1m457q wrote

Yeah, it’s certainly difficult. But it’s also complicated. For example, I believe ML models looking at certain cancer scans have higher accuracy than experts looking at the same scans. In this situation, if someone is told they have no cancer (by the scan) but it turns out they do, is the model really at fault?

I think the thing that should be done in the meantime is that models should have better uncertainty calibration (i.e., in the cancer scan example, if it says this person has an 80% chance of cancer, then if you were to take all scans that scored an 80% chance, 80% of them should have cancer and 20% should not), and then there should be a cutoff point at which an expert will double-check the scan (maybe anything with more than a 1% ML output).

11
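That calibration property can be checked directly by binning predicted probabilities and comparing each bin's average prediction with the observed rate. A minimal sketch, assuming scikit-learn and a simulated, perfectly calibrated model (real model outputs would be substituted for y_prob):

```python
import numpy as np
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(0)

# Simulate a perfectly calibrated model: each outcome really does occur
# with exactly the predicted probability.
y_prob = rng.random(50_000)                          # model's predicted risk
y_true = (rng.random(50_000) < y_prob).astype(int)   # actual outcomes

# Bin the predictions and compare observed frequency with mean predicted risk
frac_positive, mean_predicted = calibration_curve(y_true, y_prob, n_bins=10)
for pred, obs in zip(mean_predicted, frac_positive):
    print(f"predicted ~{pred:.2f} -> observed {obs:.2f}")
# For a well-calibrated model the two columns match: scans scored around 0.80
# really do turn out positive about 80% of the time.
```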

DogGetDownFromThere t1_j1mmy4n wrote

> For example, I believe ML models looking at certain cancer scans have higher accuracy than experts looking at the same scans.

Technically true, but not practically. The truth of the statement comes from the fact that you can crank up the sensitivity on a lot of models to flag any remotely suspicious shapes, finding ALL known tumors in the testing/validation set, including those most humans wouldn’t find… at the expense of an absurd number of false positives. Pretty reasonable misunderstanding, because paper authors routinely write about “better than human” results to make their work seem more important than it is to a lay audience. I’ve met extremely few clinicians who are truly bullish on the prospects of CAD (computer-aided detection).

(I work in healthtech R&D; spent several years doing radiology research and prepping data for machine learning models in this vein.)

7
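The arithmetic behind that trade-off is easy to show with made-up numbers: when the finding is rare, a detector tuned to catch everything flags mostly healthy scans.

```python
# Hypothetical screening population (illustrative numbers only)
n_scans = 100_000
prevalence = 0.01      # 1% of scans actually contain a tumor
sensitivity = 1.00     # threshold cranked up: every real tumor is flagged
specificity = 0.90     # ...but 10% of healthy scans get flagged too

true_pos = n_scans * prevalence * sensitivity               # 1,000 real findings
false_pos = n_scans * (1 - prevalence) * (1 - specificity)  # 9,900 false alarms

precision = true_pos / (true_pos + false_pos)
print(f"{false_pos:.0f} false alarms for {true_pos:.0f} real findings "
      f"(precision = {precision:.0%})")  # about 9%: most flags are false positives
```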

UnkleRinkus t1_j1n3903 wrote

You didn't mention the other side, which is false negatives. Who gets sued if the model misses a cancer? Which it inevitably will.

3

Subjective-Suspect t1_j1pod2z wrote

Cancer and other serious conditions get missed and misdiagnosed all the time. No person or test is infallible. However, if you advocate properly for yourself, you'll ask your doctor what other possible conditions you might have and how they arrived at their diagnosis.

Most doctors routinely tell you all this stuff anyway, but if they don't, that's a red flag to me. If that conversation isn't happening, you won't be prompted by their explanation to provide clarity or more useful information you hadn't previously thought important.

1

poo2thegeek t1_j1mqw02 wrote

Very interesting, thanks for the information! Goes to show that scientific papers don’t always mean useable results!

2

isleepinahammock t1_j1ndgzm wrote

I agree. It might be useful as an aid, but not as a final diagnosis. For example, maybe machine learning is able to discover some hitherto-unknown correlation between two seemingly unrelated conditions. That could be used as an aid in diagnosis and treatment.

For example, imagine a machine learning algorithm spat out a conclusion, "male patients of South Asian ancestry with a diagnosis of bipolar disorder have a 50% increased chance of later receiving a diagnosis of testicular cancer."

I chose these criteria off the top of my head, so they're meaningless. But bipolar disorder and testicular cancer are two diagnoses that have seemingly very little connection, and it would be even more counterintuitive if this only significantly affected South Asian men. So it's the kind of correlation that would be very unlikely to be found through any method other than big machine learning studies. But biology is complicated, and sometimes very nonintuitive results do occur.

If this result were produced, and it was later confirmed by follow-up work, then it could be used as a diagnostic tool. Maybe South Asian men who have bipolar disorder need to be checked more often for testicular cancer. But you would be crazy to assume that just because a South Asian man is bipolar, he must automatically also have testicular cancer, or vice versa.

4

Hydrocoded t1_j1pdmpy wrote

Appriss is one of the most evil groups of people in the western world. They should all be jailed for life. What they do is no different than torture. They are sadistic.

There are millions of people who have chronic pain. We had advanced medications to treat their pain. Our lawmakers and companies like Appriss unilaterally decide it's better for millions of people to suffer in agony than to risk a single junkie getting a fix.

Words cannot describe how evil I believe them to be. There are many groups that do awful things in this country, but there are precious few who are so gleefully, self-righteously cruel. They don't just torture the sick, they torture the old. Their victims are our grandparents, our great aunts and uncles. They victimize our most desperate. They ensure that lives are cut short, as the stress of chronic pain leads to depression, cancer, obesity, and heart disease.

We have a treatment for pain, and these monsters want us to refuse it to those who need it.

6

james_d_rustles t1_j1p0zet wrote

That’s the scariest article I’ve read in a while. I actually saw my own “score” looking back. I’m prescribed meds for ADHD, and my doctor was telling me about how they have to follow some “new system” to prevent ODs. He showed me the computer screen, and it was in fact exactly like a credit score. Just some numbers and a few pie chart looking things that had my medical history.

Luckily, I guess my score was low, so I was allowed to continue being prescribed the medicine that I’ve been prescribed for years, but still horrible either way. I can’t even imagine what it feels like being a patient with a “high score” for reasons outside of your control.

5

scrample_egg t1_j1m56un wrote

This is no longer the way of the future; this is just Now. Hope everyone has fun getting charged $500 for 2 aspirin pills to help with their tooth infection. Thanks, Bayer.

3

TurboTurtle- t1_j1p3tmg wrote

Why couldn't they prescribe a different painkiller? Opioids are not the only kind. And why terminate her from the hospital? Even if she was addicted, does that somehow make her medical emergency not matter?

1

Devil_May_Kare t1_j1pk2ul wrote

People think of drug addicts as subhuman. Doctors are people. Therefore, doctors think drug addicts are subhuman.

6

Devil_May_Kare t1_j1pjxdp wrote

I might grow breadseed poppies and extract raw opium sometime, so I can prove to doctors beyond a reasonable doubt that I'm not drug-seeking (if I have opiates at home and I'm at a doctor asking for help instead of at home getting high, obviously I'm not just there to ask for morphine). I mean, they shouldn't deny medical treatment to people they think are drug seeking, but as a stopgap measure this idea appeals to me.

My experience of telling a doctor that I had unauthorized prescription medications has been good so far (it was estradiol and bicalutamide and she was more or less chill about it). So I'm inclined to think similar strategies will go well in the future.

1

modsarefascists42 t1_j1mww02 wrote

Let me guess: everyone, right? Everyone gets banned, no opiates ever.

Been living in chronic pain for over a decade, and it's become very obvious the war on drugs will end up killing people like me, well, more than the ones it already has.

50

bob0979 t1_j1ng5fv wrote

It's still wild to me that the War on Drugs halfway created our opiate issue purely because the government wanted to arrest Black and poor people in the good old days of overt racism. Like, imagine if we'd had good regulation around pain management instead of vilifying everything that's not Percocet.

15

GenderBender3000 t1_j1mxy2t wrote

Fairly sure this will just be used to avoid giving people certain painkillers. No actual help will be given. Just “sorry, the system flagged you as high risk, so you're just going to have to power through this. Here's some regular-strength Tylenol.”

31

strolpol t1_j1mfoog wrote

Yeah this sounds nice until you need the medication but they tell you the robot says you can’t have any

16

naenouk t1_j1mfptw wrote

Can machine learning detect idiots yet?

9

UnkleRinkus t1_j1n3uet wrote

I work adjacent to sales and support for an AI company. We have exactly that: a model to identify, ahem, less skilled customers. Except we call it "low maturity in the technology."

5

trimeta t1_j1n18fx wrote

Forget the issues with using socioeconomic data, which bakes in existing biases: the top 10 features in their model are variations on "past misuse of drugs" or "long-term use of opioids," and #11 is chronic pain. So is this actually telling clinicians anything they don't already know?

9

marketrent OP t1_j1ltdx7 wrote

In The Canadian Journal of Psychiatry, DOI 10.1177/07067437221114094:

>Opioid use disorder (OUD) is a chronic relapsing disorder with a problematic pattern of opioid use, affecting nearly 27 million people worldwide. Machine learning (ML)-based prediction of OUD may lead to early detection and intervention.

>In the current study, we aimed to develop and prospectively validate an ML model that could predict individual OUD cases based on representative large-scale health data.

In the linked release written by Gillian Rutherford, 7 December 2022:

>Opioid use disorder is a treatable, chronic disease in which patients can't control their opioid use, leading to difficulties at work or home, and sometimes even overdose and death, according to the U.S. Centers for Disease Control and Prevention.

>People with opioid use disorder are originally exposed to the drugs either through prescriptions to manage pain or through the illicit drug market.

>“Most of those people have interacted with the health system before their diagnosis, and that provides us with data that could allow us to predict and potentially prevent some of the cases,” says principal investigator Bo Cao, Canada Research Chair in Computational Psychiatry and associate professor of psychiatry.

> 

>The machine learning model analyzed health data from nearly 700,000 Alberta patients who received prescriptions for opioids between 2014 and 2018, cross-referencing 62 factors such as the number of doctor and emergency room visits, diagnoses and sociodemographic information.

>The team found the top risk factors for opioid use disorder included frequency of opioid use, high dosage and a history of other substance use disorders, among others.

>They determined the model predicted high-risk patients with an accuracy of 86 per cent when it was validated against a new sample of 316,000 patients from 2019.

>“It’s important that the model’s prediction of whether someone will develop opioid use disorder is interpreted as a risk instead of a label,” says first author Yang Liu, post-doctoral fellow in psychiatry. “It is information to put into the hands of clinicians, who are actually making the diagnosis.”

University of Alberta

8

shawsome12 t1_j1m54k1 wrote

Why can’t we find better pain management than opioids?

8

carlitospig t1_j1m7ovx wrote

Usually because the folks doing pain management research aren't actually pain management sufferers themselves. I work adjacent to healthcare, reviewing educational protocols for health higher ed. I was shocked to learn that a ‘new’ pain management teaching module did absolutely everything but discuss opiates. Which, fine - for the times it makes a sad kind of sense. But there was literally nothing new in their protocols. And it didn't even address part of the issue, which is that some of the more holistic approaches (yoga, those fancy sea salt soak pods, acupuncture) are terribly expensive and many aren't covered by insurance. So they build up protocols that nobody but the rich can use, and wonder why the poor are relying on opiates to be able to make it through a work day.

TLDR: we are avoiding the money problem in healthcare.

TLDR: we are avoiding the money problem in healthcare.

25

ohhelloperson t1_j1meozs wrote

Massages and acupuncture are the only real alternatives that have helped with my pain (lupus-related). But neither is covered by insurance, while prescriptions are. It's super frustrating to have to regularly explain to my various doctors that I can't afford monthly massages and acupuncture on a nannying salary. Like, obviously I would love to get those treatments as often as necessary. But I'd also like to pay my rent and eat too.

14

Baud_Olofsson t1_j1me8p3 wrote

> And it didn’t even address part of the issues which is that some of the more holistic approaches (yoga, those fancy sea salt soak pods, acupuncture) are terribly expensive and many aren’t covered by insurance.

... because they're placebos.

7

SleekExorcist t1_j1n90iy wrote

Ehhh acupuncture has some clinical support. So does physical therapy, which can and does include massage. But seriously no one wants to cover appropriate PT.

2

Baud_Olofsson t1_j1nj40v wrote

Acupuncture is pure placebo.
One of my favorite studies compared actual acupuncture (needles inserted at extremely specific woo-woo energy sites) to a sham treatment of poking the subjects at random with toothpicks. The real acupuncture and the toothpicks were equally effective at relieving pain.
(And note that unlike poking people with toothpicks, inserting actual needles carries actual risks (mostly infection))

Pain is very, very subjective. It can be greatly relieved by things like simple distraction and personal attention (hell, even swearing has an analgesic effect). Acupuncture provides both personal attention and distraction. So it "works", but it doesn't work better than sham treatment. Meaning it only works as well as a placebo. Meaning it is a placebo.

3

neuro__atypical t1_j1ne3oa wrote

There are novel and highly effective ways to treat pain being developed, like capsaicin injections, but development is moving at a snail's pace. Not enough money being put into it, probably.

There are also less novel treatments for pain like ketamine/esketamine, the latter being a nasal spray. But it isn't allowed to be used this way (regulatory red tape) even though it's safer than opioids. They won't let you take esketamine nasal spray home; it has to be administered at a clinic, and prescribing it for pain is considered off-label use.

5

argv_minus_one t1_j1ozjv9 wrote

Why is there so little money going into it? Chronic pain isn't exactly an uncommon problem; that's why Purdue made so much money selling a drug for it.

2

PMMeYourBoehner t1_j1n7n3u wrote

I understand you're in horrible pain. And yes, I totally could fix it. But the computer noticed you're brown. Best I can do is a children's Tylenol chewable. That will be $10,000.

8

pharmaway123 t1_j1ntpo7 wrote

Health care data scientist here. I'll copy and paste my thoughts from our company slack:

> The model has a shockingly high sensitivity and specificity. Looks very impressive off the bat. But once you dig in, it's a bit more of a "well duh" moment.

> The most predictive factors were opioid-related poisoning diags, long term use of opioids, and high daily MME (milligrams of morphine equivalence, a standardized opioid dosage)

> I think that leads to a couple high level take aways.

  1. Often with claims data, you have highly informative signals. So informative, in fact, that you actually want to train a model to predict that signal instead. Another classic example would be predicting Alzheimer's disease in claims. The strongest predictive signals are diags like "age related cognitive decline" and a bunch of other R41.X codes. If you want to add value, you probably want to use those diags as your outcome variable. Otherwise, you can come up with a nearly perfect model by just ID'ing folks with (e.g.) opioid-related poisoning. But by that point, they've probably already had a very expensive inpatient/ER episode (where the opioid poisoning was coded).

  2. In spite of the above, the initial predictive model like the one in this paper can be super useful. If you ask a clinician how to identify folks who use too many opioids, they are unlikely to deliver a list of diagnosis codes including the opioid poisoning codes. Similarly, it's unrealistic to expect that they would be able to identify a specific cut off for the number of MME's that are sensitive and specific for opioid use disorder. Those initial models can be used with a "human in the loop" mechanism where you review that output with clinicians and refine the inclusion criteria or outcome variable

  3. The last thing I'd highlight is it is rarely ever a good idea to take a predictive model developed in a different data set and apply it to our own data. The population differences, the difference in coding patterns, the difference in temporal utilization all mean you can expect a drastic drop in performance if you take an existing model and apply it to our data. Said differently: medical predictive models generally have poor generalizability. We have the luxury of having lots and lots of data, and we can likely get much better performance by developing models internally.

8
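The first point is easy to demonstrate with a toy dataset: give a model one feature that is effectively a downstream marker of the outcome and it will post impressive metrics while adding little early-warning value. A minimal sketch, assuming scikit-learn and invented numbers (not the paper's data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 50_000
y = (rng.random(n) < 0.05).astype(int)  # toy outcome: later OUD diagnosis

# "Leaky" feature: a poisoning-type code that appears almost only after the
# outcome has effectively already happened, plus one weak genuine predictor.
poisoning_code = np.where(y == 1, rng.random(n) < 0.90, rng.random(n) < 0.01).astype(int)
weak_predictor = y + rng.normal(0, 2, n)

X = np.column_stack([poisoning_code, weak_predictor])
model = LogisticRegression().fit(X, y)

# Evaluated in-sample for brevity; the point is where the signal comes from.
print("AUC:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))
print(dict(zip(["poisoning_code", "weak_predictor"], model.coef_[0])))
# The model leans almost entirely on the code that is itself a consequence of
# the outcome: strong metrics, little value for early identification.
```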

Hydrocoded t1_j1pendj wrote

Appriss and their ilk are no different in my mind from a medieval torturer with a hot iron. The only difference is that instead of the iron they use chronic pain, and instead of a dungeon they use data science and the law.

Anything that prevents a chronic pain patient from receiving adequate opiates is evil. I don’t give a damn if I’m addicted; it’s better than being in pain. Some junkie getting his fix should not prevent me from getting treatment.

0

psychodelephant t1_j1mr9kd wrote

It’s really just two lines of BASIC code:

10 PRINT “EVERYBODY”

20 GOTO 10

6

AldoLagana t1_j1lyt8p wrote

Shouldn't we just give that group all the opioids they want? If you expect to coerce behavior, why do we still have hateful and intolerant humans around? Why have we not behavior-modified that aspect of humanity out of our "system"? Hint: more humans mean more money for the money-grubbing capitalists; their personality and behavior do not matter to capitalists, because every human pays to live.

5

UnkleRinkus t1_j1n4g2f wrote

Until the US passed the Harrison Narcotics Tax Act of 1914, we did exactly that. Opiates were perfectly legal to sell in over-the-counter medications, and there were many morphine addicts in the US. However, they could satisfy their addiction cheaply and legally, and so it was simply tolerated. They held jobs, they lived their lives, they just had to have their patent medicine every day.

The single biggest law that reduced addiction in the US was the 1906 Pure Food and Drug Act. It required labeling the contents, and people started avoiding the products with morphine.

In 1970, Britain did an experiment: they resisted pressure from the United States and made heroin legal for addicts. Marginalized people who had been living on the streets went back to their families and started working, because they didn't need to steal $100 a day to maintain their addiction. Then the US pressured Britain to stop the experiment, which they did, and the positive effects disappeared.

You made a glib statement above, but there is a strong rationale to do exactly that. If we provided legal opiates of a regulated strength, in a regulated manner, to addicts, my belief is that we would see a dramatic drop in crime and a serious reduction in the life problems of addicts.

9

ohhelloperson t1_j1meums wrote

…? What point exactly are you trying to make here?

4

dftba-ftw t1_j1mfprf wrote

I think they're trying to argue that we should just let these people kill themselves so that their genes don't get passed on.

So basically they're shouting "I'm a horrible human being."

−3

Devil_May_Kare t1_j1pkueq wrote

Opioid use rarely kills people with clean drugs of known strength. Opioid use kills people when they don't know how strong their drugs are and take too much by mistake, and when they consume poisons along with the drug, but those risks can be removed without removing the opioid.

So letting people buy opioids over the counter wouldn't actually cause people to kill themselves and fail to pass on their genes.

1

SkuaGoingHome t1_j1o619t wrote

Oh god.. This totally won't be used by health insurance to decide whether or not to refuse you as a client.

4

argv_minus_one t1_j1ozp31 wrote

That's just pre-existing conditions with extra steps, which in the US is already illegal.

2

smoxy t1_j1mmnr9 wrote

An accuracy of 86% is not good enough, especially in the medical field. You need at least 95%

3

trimeta t1_j1n23k4 wrote

Accuracy isn't what's important; precision and recall are. To be usable, the model needs an appropriate threshold, and then you must consider the consequences of that threshold.

5
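What "consider the consequences of that threshold" looks like in practice, as a minimal sketch on a toy imbalanced dataset (assuming scikit-learn; the data is synthetic and only stands in for the real problem):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score

# Synthetic imbalanced problem (~5% positives), purely illustrative
X, y = make_classification(n_samples=20_000, weights=[0.95],
                           class_sep=2.0, random_state=0)
probs = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]

for threshold in (0.2, 0.5, 0.8):
    pred = (probs >= threshold).astype(int)
    print(f"threshold {threshold}: "
          f"precision {precision_score(y, pred):.2f}, "
          f"recall {recall_score(y, pred):.2f}")
# A low threshold catches more true cases (higher recall) but wrongly flags more
# people (lower precision); a high threshold does the opposite. Which error costs
# more, a missed case or a wrongly denied prescription, is a clinical judgment.
```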

Southern_Scholar_243 t1_j1nvalk wrote

As soon as AI starts being used for "prevention" like this, I'm yeeting out of this world. Don't wanna go back to slavery times.

3

gourdgal t1_j1nwlff wrote

Same as being indicted for ‘pre-crime’ or thought crime, before it happens!

3

Not_Larfy t1_j1o2pql wrote

Oh no... This sounds like Sibyl in Psycho-Pass.

3

gettheflyoffmycock t1_j1og78c wrote

As a machine learning engineer, this is the WORST type of problem to apply an ML model to. We are taught in school how easy it is to make a machine learning model that can "detect a terrorist on a plane": just guess “not a terrorist” and you have a 99.9% accurate model.

There are countless records already of ML models like the one in this article ending up racist, wrong, and damaging people's lives. Shame on these ML engineers.

3

Dont_know_nothin0 t1_j1nw12b wrote

Can the ML learn to average the salaries of opioid CEOs down to just $100,000 a year? And use the rest of the earnings for opioid awareness and substance abuse programs? Maybe have the ML give out some of the money for scholarships, after-school programs, and K-12 building remodels?

2

somethingcleverer t1_j1olm9l wrote

This is just going to make poor people not be able to get pain meds.

2

plooptyploots t1_j1owi0v wrote

Yeah.. poor and rural. There you go.

2


[deleted] t1_j1oyba6 wrote

Seeing as they prescribe opioids to known heroin addicts, I doubt it will help.

1

Devil_May_Kare t1_j1pkycq wrote

Prescription opioids with slow pharmacokinetics are actually a safe and effective treatment for heroin addiction. I don't see how that's a problem. If anything, it's a step towards a solution.

1

[deleted] t1_j1qcolc wrote

I'm not talking about Suboxone or whatever you are referring to. I am talking about prescribing oxycodone to people who tell their doctor directly that they are a recovering heroin addict.

1

FOlahey t1_j1p1vu0 wrote

We could also educate people about how the brain actually works and what happens when you take drugs. It's wild how few people understand the neuroscience of the drugs they consume. I realize neuroscience is a hard subject, but it's really not that hard a field once you break it down into the various receptors and the chemicals people are imbibing. We should make education, safe drug use, and safe chemistry accessible, and stop promoting the War on Drugs and scientific illiteracy in the name of social control and profit.

1

Tra_Jake t1_j1pl6nr wrote

Uhhh… is anyone else who’s seen the last few seasons of Westworld nervous?

1

sayingshitudontlike t1_j1pltph wrote

As long as prevention - which implies acting beforehand - is done in a communicative and consensual way.

No one likes being told they're going to be a drug addict, so we better step in before that happens.

The Pygmalion effect applies a bit here, plus the natural tendency to buck expectations, or to lean into them to spite people.

How you use this info is just as important as having it in the first place.

1

dano415 t1_j1pterp wrote

How about just giving anyone who wants opioids a reasonable supply? Kinda like COVID tests.

People are definitely self-medicating because the psychiatric community has really failed.

1

TrevorIRL t1_j1mzlcq wrote

A great tool big pharma can use to find high-CLV targets and market to!

Talk about increasing ROI on marketing!!

−2

enigmaroboto t1_j1mn5pz wrote

Most of the time when my kid was hurt, I told her to get up and stop crying. No bandage for you. Rinse and proceed with the activity.

This is how you harden kids, so that when they are adults they don't expect painkillers for minor issues.

−18