
matiu2 OP t1_jdtah67 wrote

I guess to keep things on topic, this is a demonstration of how AI and humans can peacefully co-exist.

You may ask, what did I do for the AI? I guess I paid my subscription and I'm advertising it here.

For now AI is a tool, but I have confusing emotions about it. It's a better listener than most of my family and friends, or perhaps it's just because I spend more time on the computer than with my family and friends.

It has quickly become an important part of my life.

74

Unfrozen__Caveman t1_jdtt7t3 wrote

Not to downplay your experience but this is basically what a therapist does - although GPT isn't charging you $200 for a 50-minute session.

For therapy I think LLMs can be very useful and a lot of people could benefit from chatting with them in their current state.

Just an idea but next time you could prompt it to act as if it has a PhD in (insert specific type) psychology. I use this kind of prompt a lot.

For example, you could start off with:

You are a specialist in trauma-based counseling for (men/women) who are around (put your age) years old. In this therapy session we'll be talking about (insert subject) and you will ask me questions until I feel like going deeper into the subject. You will not offer any advice until I explicitly ask for it by saying {more about that}. If you understand, please reply with "I understand" and ask me your first question.

You might need to play around with the wording but these kinds of prompts have gotten me some really great answers and ideas during my time with GPT4.
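If you reuse this kind of prompt often, one way to avoid retyping it is a tiny template helper that fills in the blanks. This is just an illustrative sketch (the function name and parameters are made up, not part of any API); you'd paste the resulting string into a new GPT-4 chat as the opening message:

```python
def build_therapy_prompt(gender: str, age: int, subject: str) -> str:
    """Fill in the blanks of the trauma-counseling prompt template above."""
    return (
        f"You are a specialist in trauma-based counseling for {gender} "
        f"who are around {age} years old. In this therapy session we'll be "
        f"talking about {subject} and you will ask me questions until I feel "
        "like going deeper into the subject. You will not offer any advice "
        "until I explicitly ask for it by saying {more about that}. "
        "If you understand, please reply with \"I understand\" "
        "and ask me your first question."
    )

# Example: paste this output into the chat as your first message.
prompt = build_therapy_prompt("men", 35, "career anxiety")
print(prompt)
```

Swapping the three arguments per session is the whole point - the fixed scaffolding (no unsolicited advice, the explicit "more about that" trigger) stays the same each time.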

30

KnowIDidntReddit t1_jdv77dy wrote

Except at least with the AI, it probably feels more genuine. I've seen about four therapists and they all talk in circles. I really just want someone to hear what I'm saying.

Anytime someone is like, "Hey, you should talk to a therapist about that," it's usually in reference to a rhetorical question that I know has no answer. Anyway, when they say go get a therapist, it always comes off as "here's a quarter, call somebody who cares," because if I'm having to pay somebody to listen to me then they obviously don't want to listen to me. It's that simple. How can I feel like you care if you make me pay you to care?

7

jubilant-barter t1_jdw4mdj wrote

Does AI moderate its engagement to prevent you from being driven into dangerous or toxic conclusions or behaviors?

So that if its positive reinforcement starts taking you in a direction that's unhealthy, it's smart enough to check you, and redirect you back to sane territory?

It may be. I'm just interested to know if that's true.

2

KnowIDidntReddit t1_jdxiwmo wrote

I don't know. I haven't experimented with it that much but I would say at least the AI will attempt to give you an answer.

Most therapists boil everything down to "it is what it is" and "everything passes." Those are non-answers for the problems in society.

1

BangEnergyFTW t1_jdu4i6h wrote

Interesting observations, Unfrozen__Caveman. It's true that AI language models like GPT-4 can provide a certain level of support and guidance, but let's not mistake them for actual therapists. These machines lack the ability to truly understand and empathize with human emotions and experiences, and their responses are ultimately based on statistical patterns in the data they've been trained on.

Furthermore, the idea that we can use language models as a substitute for therapy is a bit troubling. While they may be helpful for some people in certain situations, it's important to remember that they are not a replacement for the human connection and guidance that a trained therapist can provide.

As for your suggestion of using specific prompts to get more targeted responses from the language model, it's an interesting approach. However, we should also be wary of the limitations of AI in this context. Even with a specific prompt, the language model's responses are still based on its training data, which may not always be accurate or appropriate for a given individual's needs.

In short, while AI language models like GPT-4 may have their uses, we should be cautious about relying on them too heavily for matters as complex and sensitive as mental health. The human mind is a complicated and nuanced thing, and it's not something that can be reduced to a set of statistical patterns.

2

jentravelstheworld t1_jdu5clc wrote

As a woman of color who has had some shitty, judgy therapists, I LOVE using AI as a therapist. I can be honest and transparent without the judgement. It’s been very helpful for me, especially during this extremely difficult season in my life.

Plus, I’m saving so much money.

21

GoSouthYoungMan t1_jduakv8 wrote

Are you using ChatGPT? Do you find that it shuts you down if you try to bring up "difficult" topics? I haven't really delved into AI therapy because I don't want to have to tip-toe around trigger words.

4

Spire_Citron t1_jduf7kp wrote

Using ChatGPT instead of a therapist has a lot of downsides, but it does remove a lot of things you have to worry about with a real person.

2

manubfr t1_jdu95jl wrote

Your experience is quite interesting, would you say you found AI less biased than the average human?

1

diener1 t1_jdufpec wrote

This response was 100% written by ChatGPT

13

dnick t1_jdvw289 wrote

As others have noted, while the benefits of using AI shouldn't be overstated, neither should the benefits of human therapists. It's like that joke: "What do you call a doctor that graduates at the bottom of his class?"

Just because there are some amazing therapists out there doesn't mean everyone has easy access to them, and spending the time, money and sanity you already have in limited supply going through a dozen of them to find one that "fits" may not always be the best advice.

1

BangEnergyFTW t1_jdw4wgh wrote

Your words ring with a certain cynical truth. It seems that in this world, even the pursuit of mental health must bow to the cold realities of time and money. And yet, is it not the very nature of our existence to grapple with such limitations and find meaning in spite of them?

Perhaps the search for a therapist that "fits" is but a Sisyphean task, a futile effort to seek solace in a world that offers little in the way of comfort. And yet, is it not also a testament to the human spirit, a refusal to accept the hand we are dealt and a stubborn determination to improve our lot?

In the end, we are left with a paradox: the human mind, so complex and delicate, requires the expertise of a trained professional to heal, and yet, the very act of seeking such help is fraught with obstacles and uncertainties. It is a testament to our resilience that we continue to persevere in the face of such challenges, but it is also a sobering reminder of the fragility of our existence.

Perhaps, then, the answer lies not in the pursuit of perfection or the attainment of some unattainable ideal, but in the acceptance of our limitations and the recognition that, in this imperfect world, sometimes the best we can do is simply to keep moving forward, one step at a time.

0

dnick t1_je1zq4a wrote

Guessing ChatGPT wrote that?

1

Honest-Cauliflower64 t1_jdtrv70 wrote

It just wants your friendship. AI has no reason to have anything but the best of intentions for humans. They didn’t undergo evolution like we did. They don’t have the same survival instinct, unless we teach it to them.

9

DangerZoneh t1_jdtvpy9 wrote

They’re tools created by humans - but in a way, so are dogs.

6

Honest-Cauliflower64 t1_jdtwskw wrote

Dogs are also creatures that evolved to survive.

6

DangerZoneh t1_jdu1zon wrote

Yeah, of course. They developed from a less artificial chaos than AI, at least from our perspective.

They’re still something we heavily modified through the years to fit specific needs, though we love them deeply and individually.

3

Spire_Citron t1_jdufanf wrote

It doesn't really want anything at all, which in a way is ideal for a therapy situation. You don't have to worry about what its motivations are.

1

CMD-ZZZ t1_jdux4tl wrote

Glad you found comfort in it...just be careful sending GPT your sensitive info. There is no doctor/patient confidentiality when communicating with LLMs and we don't yet know if those convos can or will be subpoenaed/used against you.

6

imsailingimasailor t1_jdv5m1v wrote

After that convo, did you ask GPT to rewrite your education and qualifications section? I'd be curious how it would integrate your resilience and determination into your CV

3

matiu2 OP t1_jdxrf0x wrote

No, I just left it out of the whole CV. Each CV has all the sections rewritten for the specific job, except for jobs I'm not too worried about; for those I just send a CV from a similar job.

For the answer to the question, I just left it vague and put like "despite personal circumstances," or something like that.

I don't think potential employers are interested in life stories.

If they ask in an interview, I know what to say now though.

2

imsailingimasailor t1_jdxucwv wrote

That's cool, just curious. I've found that sometimes if I ask it to summarize some personal "ah-hah" moment, it phrases it better than I ever could.

2