The researchers, the New York Times reports, find that the same dynamics that reward extremism also apply to sexual content on YouTube: a user who watches erotic videos might be recommended videos of ...
YouTube's algorithm recommends right-wing, extremist videos to users — even if they haven't interacted with that content before — ...
"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
An exclusive excerpt from Every Screen On The Planet reveals how the social media app’s powerful recommendation engine was shaped by a bunch of ordinary, twentysomething curators—including a guy named ...
YouTube’s recommendation algorithm no longer inadvertently sends people down a rabbit hole of extreme political content, researchers have found. Following changes to the algorithm in 2019, individual ...
YouTube tends to recommend videos that are similar to what people have already watched. New research has found that those recommendations can lead users down a rabbit hole of extremist political ...
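The "recommends videos similar to what people have already watched" behavior described above is, at its core, content-based filtering. The sketch below is purely illustrative and is not YouTube's actual system: the toy catalog, the tag sets, and the choice of Jaccard similarity are all assumptions made for the example. It does show the feedback loop the reporting describes: each recommendation resembles the watch history, so a history skewed toward one topic keeps pulling in more of that topic.

```python
# Toy catalog mapping video id -> topic tags (illustrative data, not real videos).
CATALOG = {
    "cat_compilation": {"cats", "pets", "funny"},
    "kitten_rescue":   {"cats", "pets", "funny", "rescue"},
    "dog_tricks":      {"dogs", "pets", "funny"},
    "news_roundup":    {"news", "politics"},
    "debate_clip":     {"news", "politics", "debate"},
}

def jaccard(a, b):
    """Tag-set similarity: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(history, k=2):
    """Rank unwatched videos by average similarity to the watch history."""
    scores = {}
    for vid, tags in CATALOG.items():
        if vid in history:
            continue  # never re-recommend what was already watched
        scores[vid] = sum(jaccard(tags, CATALOG[h]) for h in history) / len(history)
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

With this catalog, a history of one cat video surfaces more cat content (`recommend(["cat_compilation"])` ranks `"kitten_rescue"` first), while a news history surfaces more politics: the "rabbit hole" effect in miniature, before any engagement signals or editorial tweaks are layered on top.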
Artificial intelligence (AI) will soon curate Facebook users' video feeds, Facebook head Tom Alison has announced. According to reports, one of Meta's significant AI investments is creating an AI ...
YouTube has two billion monthly active users and sees 500 hours of content uploaded every minute. Twenty-five percent of U.S. adults get their news from YouTube, and 60% of regular users “use the platform ...
Warning: This episode contains references to guns and gun violence. YouTube’s recommendation algorithm has always been key to keeping users on the site. Watch a cute cat video, and the platform spews ...