Users, not the algorithm, to blame for less diverse News Feed: Facebook
If your Facebook News Feed is devoid of variety and diverse views, most of the blame lies with you and not with the Facebook algorithm, says a new study by data scientists at Facebook.
The researchers analysed the accounts of 10 million users over six months and concluded that the so-called "echo chamber" isn't as impermeable as it was thought to be.
The study found that liberals and conservatives are regularly exposed to at least some content that doesn't conform to their political or religious views, adding that almost 29 percent of the stories displayed by Facebook's News Feed present views that conflict with an individual's ideology.
"You would think that if there was an echo chamber, you would not be exposed to any conflicting information, but that's not the case here," Eytan Bakshy, a data scientist at Facebook who led the study, was quoted as saying by NYT.
The researchers found individuals' choices about which stories to click on had a larger effect than Facebook's filtering mechanism in determining whether people encountered news that conflicted with their professed ideology.
Facebook's algorithm serves users stories based in part on the content they have clicked in the past.
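The click-driven ranking described above can be sketched in a few lines. This is an illustrative toy, assuming a simple per-topic click count as the affinity signal; the function names and scoring rule are assumptions, not Facebook's actual system:

```python
# Toy sketch of click-history-based ranking (an assumption for
# illustration, not Facebook's real algorithm): stories whose topic
# the user has clicked most often in the past are ranked first.
from collections import Counter

def rank_stories(candidate_stories, click_history):
    """Order stories so topics the user clicked most come first.

    candidate_stories: list of (title, topic) tuples
    click_history: list of topic strings the user clicked before
    """
    topic_affinity = Counter(click_history)  # clicks per topic
    return sorted(
        candidate_stories,
        key=lambda story: topic_affinity[story[1]],
        reverse=True,
    )

# A user who mostly clicked liberal-leaning stories sees those
# ranked first - the self-reinforcing effect the study measured.
feed = rank_stories(
    [("Tax cut debate", "conservative"), ("Climate march", "liberal")],
    click_history=["liberal", "liberal", "conservative"],
)
```

Under this toy model, a user's own past clicks skew what surfaces next, which is exactly why the researchers attribute much of the filtering to individual choices rather than the algorithm alone.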
The researchers found that people's networks of friends and the stories they see are skewed toward their ideological preferences.
But that effect is more limited than the worst case that some theorists had predicted, in which people would see almost no information from the other side.
On average, about 23 percent of users' friends are of an opposing political affiliation, according to the study.
However, some observers argued that the Facebook study was flawed because of sampling problems and interpretation issues.
The study appeared in the journal Science.
Source: IANS