Users to blame for their less diverse news feed: Facebook
If your Facebook News Feed is devoid of variety and diverse views, most of the blame lies with you, not with Facebook's algorithm, according to a new study by data scientists at Facebook.
The researchers analysed the accounts of 10 million users over six months and concluded that the so-called "echo chamber" is not as impermeable as it was thought to be.
The study found that liberals and conservatives are regularly exposed to at least some content that does not conform to their political or religious views: almost 29 percent of the stories displayed by Facebook's News Feed present views that conflict with an individual's ideology.
"You would think that if there was an echo chamber, you would not be exposed to any conflicting information, but that's not the case here," Eytan Bakshy, a data scientist at Facebook who led the study, was quoted as saying by NYT.
The researchers found that individuals' choices about which stories to click on had a larger effect than Facebook's filtering mechanism in determining whether people encountered news that conflicted with their professed ideology.
Facebook's algorithm serves users stories based in part on the content they have clicked on in the past.
The researchers found that people's networks of friends and the stories they see are skewed toward their ideological preferences.
But that effect is more limited than the worst case that some theorists had predicted, in which people would see almost no information from the other side.
On average, about 23 percent of users' friends are of an opposing political affiliation, according to the study.
However, some observers argued that the Facebook study was flawed because of sampling problems and interpretation issues.
The study appeared in the journal Science.
Source: IANS