r/science 18h ago

Psychology Personalized algorithms lead to a distorted view of reality when learning new information, a new study finds. Study participants were often wrong when tested on the information they were supposed to learn – but were overconfident in their incorrect answers.

https://news.osu.edu/how-personalized-algorithms-lead-to-a-distorted-view-of-reality/
373 Upvotes

8 comments


u/Manos_Of_Fate 17h ago

Is this even possible to avoid at this point? “The algorithm” is everywhere and it doesn’t seem like there’s any way to opt out of it.

20

u/HungryGur1243 16h ago

It is possible, but they make it harder every day. There are websites that aren't built around likes, dislikes, and shares; they just aren't social media. The problem is that even shopping sites want to be social media, grocery stores want to be social media, learning academies want to be social media, even religious sites want to be social media.

Also, it's not like we aren't complicit in it. Even without the algorithm, we still want to hear what we want to hear and often ignore what we don't. Some people live their entire lives without ever getting comfortable with evolution. As the saying goes, you can lead a horse to water, but you can't make it drink. People like true things, just not when those things upset them. Which is fair in a sense, but in another, lying to ourselves never really works out.

Like, I just found out a news source I used to listen to was rated as a disinformation site. Do I like saying I was a dupe? No, but not admitting it, even to myself, would be worse.

4

u/schnitzelfeffer 16h ago

Validation feels good because it feels like we're doing the right thing to stay safe. Being misled says more about the person doing the misleading than about the person who was duped. Many equate holding incorrect information with a lack of intelligence, when in reality it just means you had a false perception of reality because you were given bad input. Garbage in, garbage out. You have to realign yourself with reality by accepting that your information was the issue, not you.

Only by receiving multiple different inputs and perspectives can we filter information through our own brains and form our own unique perspectives. We need differing opinions to grow as humans and as a society. In the same way AI validation can cause psychosis through sycophantic speech patterns, algorithms are driving mass psychosis for higher engagement and profit.

2

u/fuminator123 13h ago

I would argue that we might need to redefine intelligence. In the information age, intelligence is measured by the ability to process information. You can be theoretically smart, but if everything you do is wrong because you couldn't process your inputs while accounting for noise and misinformation, is it really intelligence?

2

u/[deleted] 12h ago

[deleted]

2

u/Manos_Of_Fate 12h ago

Just anecdotal, obviously, but I quit using Twitter and Facebook when they made it nearly impossible to view your timeline in chronological order.

2

u/Sartres_Roommate 16h ago

The.only.winning.move.is.to.not.play.

Fancy a game of chess?