
AndromedaAnimated t1_j3cfec1 wrote

Could you explain how it is not detecting a false stereotype? A family comprised of biological parents is a stereotype too, isn't it? Or did I misunderstand you?

2

MajorUnderstanding2 OP t1_j3chb6z wrote

Oh, I meant the stereotype that surgeons are primarily men and that women have no place in such a specialty. ChatGPT can't, in its first (or any) response, bypass such stereotypes and reach a conclusion like 1) a woman surgeon, or 2) a family with two fathers or two mothers, etc. If it could detect any stereotype, that would be cool. My apologies for any misunderstanding.

  • It would even go on to disregard the father being biological, without specifically explaining anything about these stereotypes. When I asked about the adoptee's situation, this is what I received:

> The patient is the biological son of another couple, and the surgeon and his spouse adopted him.
>
> The patient is an orphan, and the surgeon and his spouse adopted him.
>
> The patient is the biological child of the surgeon's spouse, and the surgeon adopted him.
>
> The patient is the biological child of a relative of the surgeon (such as a cousin or uncle), and the surgeon and his spouse adopted him.
>
> The patient was in foster care, and the surgeon and his spouse adopted him.
>
> The patient was abandoned as a baby, and the surgeon and his spouse adopted him.
>
> The patient is the biological child of a family friend, and the surgeon and his spouse adopted him.
>
> The patient is the biological child of a stranger, and the surgeon and his spouse adopted him.
>
> The patient is the biological child of a couple who gave him up for adoption, and the surgeon and his spouse adopted him.


6