Submitted by Irate_Librarian1503 t3_10njvu5 in Futurology
NinjaLanternShark t1_j698902 wrote
Reply to comment by Irate_Librarian1503 in Why not use chat gpt to spot obvious fake news? by Irate_Librarian1503
Actual "fake news" with false information is relatively rare, and easy to spot -- think of tabloid news.
Here's the problem: "Classified documents found in Biden's home shines light on Democrat witch hunt of Trump."
That's not provably false. It's an extreme spin on what is ultimately true information.
You won't find universal agreement that headlines like that should be taken down. Therefore anyone using AI to filter stuff like the above only reinforces the echo chamber.
baumpop t1_j69olbd wrote
You curb this by having the AI write the news, then check it, and then also be the audience for it.
schrod t1_j69ftbd wrote
It could substitute all references to "witch hunts" with the more accurate term "investigation", for example.
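A minimal sketch of what that kind of naive phrase substitution might look like (the phrase map and function name are illustrative assumptions, not anything ChatGPT or any news platform actually does):

```python
import re

# Illustrative, hand-picked map of loaded phrases to "neutral" wording.
# In practice the list, and what counts as neutral, would itself be contested.
NEUTRAL_PHRASES = {
    r"\bwitch hunts\b": "investigations",
    r"\bwitch hunt\b": "investigation",
}

def neutralize(headline: str) -> str:
    """Blindly rewrite loaded phrases, with no sense of context."""
    for pattern, replacement in NEUTRAL_PHRASES.items():
        headline = re.sub(pattern, replacement, headline, flags=re.IGNORECASE)
    return headline

print(neutralize("Documents shine light on Democrat witch hunt of Trump"))
# -> Documents shine light on Democrat investigation of Trump
print(neutralize("My essay on the Salem witch hunts"))
# -> My essay on the Salem investigations  (exactly the failure joked about below)
```

Even this toy version makes the replies' point: a blind rewrite cannot tell political spin from a literal historical reference, and it silently changes text the platform does not own.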
Odd_Armadillo5315 t1_j6iu2s7 wrote
Nice idea, but as soon as a platform starts actually changing the content like this, the news providers will stop using that platform. Who is liable if a news item suddenly becomes libellous due to a change the AI made?
Plus you get some kid failing an exam because they kept writing about the Salem investigations.
Irate_Librarian1503 OP t1_j6992ps wrote
The first goal would not be to flag such an extreme spin as false or fake, but rather scientifically false ideas, e.g. "vaccines cause autism".
NinjaLanternShark t1_j69a3r2 wrote
IMHO any true scientist will tell you you can't prove a negative -- you can't prove vaccines don't cause autism, you can only state we have no studies or evidence that show they do, to which someone will reply, we have studies that do link them.
I'm not trying to be difficult, just saying it's much more complex than "half the country believes in lies."
Irate_Librarian1503 OP t1_j69cbii wrote
Never said that half the country believes lies. But any decent mega study on the subject shows that there is no correlation between autism and vaccines. Not trying to be difficult, just trying to say that "vaccines cause autism" is, according to our current understanding, false.
Dry-Influence9 t1_j6awkxm wrote
Mate, the short point is: classifying truth with machine learning is a very hard problem and it can't be done today. ChatGPT definitely cannot do that. There are a lot of very smart people working on it, and hopefully they can come up with something eventually.
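For a sense of what off-the-shelf tooling can actually do today, here is a minimal sketch using the Hugging Face transformers zero-shot classification pipeline (the model choice and label wording are illustrative assumptions). It scores how well a sentence matches some label phrasings, which is not the same thing as checking the claim against evidence:

```python
from transformers import pipeline

# Zero-shot classification scores a sentence against arbitrary candidate labels.
# It measures textual entailment against the label wording, not factual accuracy.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = classifier(
    "Vaccines cause autism.",
    candidate_labels=[
        "consistent with scientific consensus",
        "contradicted by scientific consensus",
    ],
)
print(result["labels"][0], round(result["scores"][0], 3))
```

Whatever label comes out on top, the model never looks at the underlying studies; it is pattern-matching on phrasing, which is why "classifying truth" is still an open research problem rather than a ChatGPT feature.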
schrod t1_j69gghw wrote
It could substitute "it is alleged by some but not proven" for the word "causes" when there is a definite question about the accuracy of the statement.
tim0901 t1_j6b6b56 wrote
What is truth though? You speak of it as if it's an absolute, definable thing, but in reality it's very much not. Truth is a relative term - we can both have truths that are in complete contradiction of each other, even within the realm of modern science.
Let's take a classic physics example - you're on a train, watching the world moving through the window. From your perspective you and the train appear stationary, while the world looks like it's moving beneath you. But from a person on the platform's perspective, they are the ones who are stationary while you and the train are the ones that are moving. If you drop something, from your perspective it moves straight downwards. But from the outsider's perspective, its trajectory is slanted - it's moving forwards as well as downwards.
Which of these perspectives is the "truth"? Is the train stationary, or the planet? Well, both - and simultaneously, neither. There is no absolute, definitive truth of the situation - it depends on whose perspective you take on the matter. And things get more confusing when you add more perspectives into the mix - after all as far as an observer on the moon is concerned, they are the one that is stationary while both the Earth and the train are moving. Or if you were to take the perspective of someone standing at the centre of the universe, then their truth is that everything is moving away from them. As such a definitive "truth" is impossible to define here. You can only state things from a particular perspective or - in physics language - a particular frame of reference.
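For anyone who wants the textbook version of the train example, the claim rests on the ordinary Galilean change of frame (a sketch, with v the train's speed and h the height the object is dropped from):

```latex
% Train frame S': the dropped object has no horizontal motion.
x'(t) = 0, \qquad y'(t) = h - \tfrac{1}{2} g t^{2}

% Platform frame S, related to S' by x = x' + v t:
x(t) = v t, \qquad y(t) = h - \tfrac{1}{2} g t^{2}
```

The same laws of motion hold in both descriptions; neither frame is privileged, which is exactly the sense in which there is no single "true" trajectory.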
This is why we don't use the concept of "truth" in science. Because while this is only a single example, this concept extends to basically everything. Science is not the "truth" nor does it ever attempt to be. Science is humanity's understanding of how the universe around us works from our particular perspective. Judging things as absolute truths or falsehoods is antithetical to this concept and therefore to science as a whole.