petrichoring t1_jbyr8i8 wrote

Totally! The issue of suicide is such a tricky one, especially when it comes to children/teens. My policy is that I want to be a safe person to discuss suicidal ideation with, so with my teens I make it clear up front when I would need to break confidentiality without the client's consent (unless they tell me they're going to kill themselves when they leave my office and aren't open to safety planning, it stays in the room). With children under 14, it's definitely more of an automatic call to parents if there's SI out of baseline, especially with any sort of implication of plan or intent. Either way, it's important to keep it as trauma-informed and consent-based as possible to avoid damaging trust.

But absolutely, it becomes more of an issue when an adult doesn't have the training or relationship with the client to handle the nuances of SI, given how broad a spectrum it can present as; ethically, safety has to come first. And, like you said, that can then become a huge barrier to seeking support. My fear is that a chatbot can't effectively offer crisis intervention because it is such a delicate art, and we'd end up with dead kids. The possibility for harm outweighs the potential benefits for me as a clinician.

I do recommend crisis lines for support with SI as long as there is informed consent in calling. Many teens I work with (and people in general) are afraid that if they call, a police officer will come to their house and force them into a hospital stay, which is a realistic-ish fear. Best practice is that this only happens if there's imminent risk of harm and no ability to engage in less invasive crisis management (I was a crisis worker before grad school and only had to call for a welfare check without the person's consent as a very last resort, maybe 5% of the time, and it felt awful), but that depends on the individual call taker's training and the protocol of the crisis center they work at. I've heard horror stories of a person calling with passive ideation and still having police sent to their house against their will, and I know that understandably stops many from calling for support. I still recommend crisis lines when there isn't other support available, because I believe in the power of human connection when all else feels lost, with the caveat that callers know their rights and give informed consent to the process.

Ideally we'd offer mental health first aid training to teens across the board so they could provide peer support to their friends and be a first line of defense for suicide risk mitigation without triggering an automatic report. Would that have helped you when you were going through it?

6

petrichoring t1_jbyc42i wrote

As a therapist who works with teens, I feel concerned about a teen talking to an AI about their problems given the lack of crisis assessment and constructive support.

“Adult confidants” need to have safeguards in place like mandated reporting, mental health first aid, and a basic ethics code to avoid the potential for harm (see the abuse perpetrated by the Catholic Church).

If a teen needs adult support, a licensed therapist is a great option. The Youth Line is a good alternative for teens looking for immediate support, since there are of course barriers to seeing a therapist in a moment of need. Most schools now have a school social worker who can provide interim support while a teen is getting connected to resources.

Edit: also wanted to highlight the importance of having non-familial adult support. Having at least two non-parent adults who take a genuine interest in a child's well-being is one of the Positive Childhood Experiences that promote resilience and mitigate the damage caused by Adverse Childhood Experiences. A chatbot is wildly inadequate to provide the kind of support I think you're referring to, OP. It might offer some basic degree of attunement and responsiveness, but the connection is absent. Neurobiologically, we have specific neural systems responsible for social bonds with another human, and I suspect even the most advanced AI would fall short of sufficiently activating those networks. Rather than using AI to address the lack of support and connection that many, if not most, teens experience in their lives, we should be finding ways to increase safe, accessible, organic human support.

65