Down The Rabbit Hole: The Dark Side Of YouTube’s Automated Recommendation System
Jun 5, 2019
The dark side of YouTube. (Dado Ruvic / Reuters)

Earlier this week, an article in The New York Times outlined the “rabbit hole effect” of YouTube’s algorithms, pointing out that the platform may recommend videos of young children to viewers who had previously watched sexually themed content.

Researchers attribute this to what some studies call a “rabbit hole effect”: an online viewing pattern in which platforms like YouTube lead people to watch incrementally more extreme topics or videos, which then hook viewers in. However, YouTube has said that removing its automated recommendation system altogether, which it says drives up to 70 percent of views by suggesting what users should watch next, would only hurt the content creators who rely on it.
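
The dynamic described above can be illustrated with a toy simulation. The sketch below is purely hypothetical: the single “intensity” score per video, the engagement model that peaks slightly above a viewer’s current comfort level, and the comfort-update rule are all illustrative assumptions, not YouTube’s actual system. It shows how a recommender that greedily maximizes predicted engagement, paired with a viewer whose tolerance adapts to what they watch, can ratchet steadily toward more extreme content.

```python
# Toy "rabbit hole" simulation. Illustrative sketch only; every
# modeling choice here (intensity scores, engagement model, comfort
# update) is a hypothetical assumption, not YouTube's real system.
import random

random.seed(42)

# Hypothetical catalog: 1,000 videos, each with an "intensity" in [0, 1].
catalog = [random.random() for _ in range(1000)]

def predicted_engagement(intensity: float, comfort: float) -> float:
    """Assumed engagement model: highest for content slightly more
    intense than what the viewer is used to, falling off on either side."""
    sweet_spot = min(comfort + 0.05, 1.0)
    return 1.0 - abs(intensity - sweet_spot)

comfort = 0.1  # viewer starts out preferring mild content
for step in range(1, 21):
    # Greedy recommender: pick the video with the highest predicted engagement.
    video = max(catalog, key=lambda v: predicted_engagement(v, comfort))
    # Watching nudges the viewer's comfort level toward what was shown.
    comfort = 0.8 * comfort + 0.2 * video
    if step % 5 == 0:
        print(f"step {step:2d}: recommended intensity {video:.2f}, "
              f"viewer comfort now {comfort:.2f}")
```

Running the sketch prints a comfort level that climbs steadily even though the viewer never asks for more extreme content; the drift comes entirely from the interaction between the greedy recommendation rule and the viewer’s adapting tolerance.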

Shortly after the New York Times article was published, YouTube released a statement on its official blog outlining recent policy and algorithm updates meant to protect minors and families. The new policies include restricting live-streaming features for younger minors unless they are clearly accompanied by an adult, as well as limiting recommendations of content that features minors in risky situations.

Earlier this year, YouTube also announced that it would turn off comments on nearly all videos featuring kids, following a similar controversy in February in which pedophiles were leaving inappropriate comments on children’s videos.

Today on AirTalk, we’ll discuss the role technology and online platforms like YouTube play in content moderation and public safety with the co-director of the Center for Scholars & Storytellers as well as one of the researchers who ran an experiment on how YouTube’s algorithms direct its users.

Should parents be responsible for the content they post online of their children, or does YouTube need to step in? What do you think? Give us a call at 866-893-5722.

Guests:

Jonas Kaiser, affiliate researcher at Harvard’s Berkman Klein Center for Internet & Society; he is one of three researchers who conducted an experiment on how YouTube’s algorithms direct its users

Sierra Filucci, editorial director at Common Sense Media, a media literacy nonprofit in San Francisco; one of her areas of expertise is educating parents on media and social media use

Suresh Venkatasubramanian, professor of computing at the University of Utah and a member of the board of directors for the ACLU Utah; he studies algorithmic fairness
