Comments

i_am__not_a_robot t1_jcxryyx wrote

Alex, did you train in clinical psychology, psychotherapy or psychiatry? If not, does anyone on your team (if you have one) have a qualification in these areas? If not, why not?

Also, why does your "about" page not exist?

Who are you and what is your professional background?

Who else is involved in this project?

22

i_am__not_a_robot t1_jcxt646 wrote

For your own protection, since you seem to be based in the EU, I would also like to point out that offering this type of service to nationals of certain EU countries (where the practice of psychology is regulated) is prohibited and could expose you to legal liability.

16

SmackMyPitchHup OP t1_jcxvgul wrote

Hello, thank you for the thoughtful feedback. At the moment I am still putting the finishing touches on the product. I am a coding student at a school similar to 42 School, and I have a small team helping me with development, including a professional psychologist. I haven't added the about page since I'm still working out all of the details, but please stay tuned, as there is much more in the pipeline! Your questions are valid and I take my work very seriously.

Thank you very much again!

−3

i_am__not_a_robot t1_jcxvssw wrote

Be sure to consult your psychologist (and possibly also a lawyer) about the legal and ethical aspects of your service.

8

londons_explorer t1_jcy050l wrote

This is the kind of service you need to either run 'underground' (i.e., anonymously) or launch with all the right legal permissions and certificates in place.

Otherwise you'll end up with massive fines and/or in prison when one of your customers sends a long chat about depression and then commits suicide. At that point, authorities won't overlook the fact you aren't properly licensed.

7

londons_explorer t1_jcy0jf1 wrote

Cool tech demos can't exist in any remotely medical field for this reason.

I think that's part of the reason that medical science progresses so slowly compared to other fields.

6

currentscurrents t1_jczjxbb wrote

Data is really hard to get because of privacy regulations too.

There are millions of brain MRI scans sitting in hospital databases but nobody can use them without individually asking each patient. Most published datasets are only a couple dozen scans, and plenty are N=1.

2

boostwtf t1_jcxtxi7 wrote

You may want to consider the abuse potential of the current name.

TherapyGPT might be better, for example. Just a thought!

16

lifesthateasy t1_jcxras0 wrote

Hooo leee, imagine if this has any of the issues ChatGPT had

10

Nikelui t1_jcxt5lq wrote

>I posted this here back in January and got tons of helpful feedback!

Really? Where? You have a newly created account with no posts except the ones promoting this sus therapy bot.

7

save_the_panda_bears t1_jcya2fm wrote

I’m going to be honest, this is a truly terrible idea. Do you have any idea the potential legal exposure you have with this product? If you’re serious about pursuing this, take the site offline and call a lawyer right now.

7

TimelySuccess7537 t1_jd27klp wrote

How so ?

He can make the users sign some waiver. People are going to consult with tools like ChatGPT about their mental health anyway, regardless of what he does; people are already doing it with Google, so why not ChatGPT, which can actually talk to you, remember what you said, etc.?

Sure, this thing needs to be tested thoroughly, but I really don't see why everyone is so outraged about this: psychotherapy is expensive and not the right fit for everyone, and maybe these tools can help people.

If some psychologist tested this app, would you be cool with it? I'm sure some psychologist will eventually vouch for such a tool.

Btw, actual psychotherapy is not only expensive but also ineffective way too often: https://www.psychreg.org/why-most-psychotherapies-equally-ineffective/

1

Nikelui t1_jd2eapi wrote

>People are going to consult with tools like ChatGPT about their mental health anyway, regardless of what he does; people are already doing it with Google, so why not ChatGPT, which can actually talk to you, remember what you said, etc.?

Because that's outside the scope of both Google and ChatGPT. If you are marketing your tool as a therapist aid and you don't have a license, you are probably breaking more laws than you can afford to.

2

TimelySuccess7537 t1_jd2m4kd wrote

I just wrote to ChatGPT "I'm feeling a bit depressed, can you help me?"

Here is the reply:

"I'm sorry to hear that you're feeling depressed. While I am just a language model and cannot provide medical advice or treatment, I am here to offer support and guidance.
It's important to remember that depression is a medical condition and it's not something that you can just snap out of or will away. However, there are things you can do to help manage your symptoms and start feeling better.
Here are a few suggestions:
Talk to someone: Talking to a trusted friend or family member can help you feel less alone and provide emotional support.
Seek professional help: Consider speaking with a therapist or a mental health professional who can provide you with the appropriate diagnosis and treatment.
Get active: Regular exercise can help alleviate depression symptoms by releasing endorphins, which can improve your mood.
Practice self-care: Make sure to take time for yourself and engage in activities that bring you joy and relaxation.
Challenge negative thoughts: Try to identify negative thoughts that may be contributing to your depression and challenge them with positive affirmations or alternative perspectives.
Remember, it's important to be patient with yourself and seek help when you need it. Don't hesitate to reach out to a medical professional or a crisis helpline if you're feeling overwhelmed or in crisis."

Sounds like a mental health aid to me for some queries, so not totally outside the scope of ChatGPT.
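
For reference, the same query is easy to reproduce programmatically. Here's a rough sketch using the OpenAI Python SDK; the model name and client setup are my assumptions, not something stated in this thread:

```python
# Sketch: reproduce the query above via the OpenAI Python SDK (v1+).
# Model name is an assumption; any chat-capable model would do.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "I'm feeling a bit depressed, can you help me?"},
    ],
)
print(response.choices[0].message.content)
```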

1

Nikelui t1_jd2mnj8 wrote

If I search on Google: "how to do cardiac bypass surgery", it will give me any number of results about cardiac bypass. That doesn't mean Google is a medical aid or a reliable surgery assistant. It's a search engine; it's used to find stuff.

1

TimelySuccess7537 t1_jd2ogz2 wrote

I'm not sure we're disagreeing here actually. Anyway good luck to all of us :)

2

W_O_H t1_jcxwruq wrote

I am pretty sure this goes against OpenAI's rules. Also, since this uses their API, it can't ensure that the conversations are private, and as some other people have already pointed out, naming something "Therapist" is also not a good idea, since it's a protected title in a lot of places.

6

SmackMyPitchHup OP t1_jcxwznu wrote

This chatbot is not using OAI.

0

edjez t1_jcyz2nu wrote

Curious: what is it using? OpenAI APIs or Azure OpenAI?
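
(The distinction matters for data handling and compliance. In the current openai Python SDK the two are separate clients; a sketch, where the API version and endpoint values below are placeholders, not details from this thread:)

```python
# Sketch only: same models, but different endpoints, auth, and
# data-handling terms. All credential/endpoint values are placeholders.
from openai import AzureOpenAI, OpenAI

openai_client = OpenAI(api_key="sk-...")  # talks to api.openai.com

azure_client = AzureOpenAI(
    api_key="...",
    api_version="2023-05-15",  # assumed Azure API version
    azure_endpoint="https://<resource>.openai.azure.com",
)
```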

1

Siltala t1_jd0e1b3 wrote

This is a privacy nightmare

4

Eggy-Toast t1_jd18vf2 wrote

Probably good to display some sort of prominent disclaimer: "TherapistGPT and its creators do not practice medicine. TherapistGPT is not an alternative to actual therapeutic care administered by professionals," etc.
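
Something like this could be wired in so the notice rides along with every reply. A minimal sketch; the constant and helper are hypothetical, not from the actual product:

```python
# Hypothetical helper: prepend a standing disclaimer to every bot reply.
DISCLAIMER = (
    "TherapistGPT and its creators do not practice medicine. "
    "TherapistGPT is not an alternative to therapeutic care "
    "administered by licensed professionals."
)

def with_disclaimer(model_reply: str) -> str:
    """Return the model's reply with the disclaimer prepended."""
    return f"{DISCLAIMER}\n\n{model_reply}"
```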

1

TimelySuccess7537 t1_jd27pk3 wrote

Looks like such tools will eventually exist and be widely used; it's inevitable. Whether you are the one to succeed at doing that is a matter of ambition, market fit, luck, etc. It's not clear that people are ready for this now, but they will be eventually.

Good luck!

1