YouTube’s recent algorithm change explains why your feed is full of children’s videos

Photo illustration of the YouTube logo. Omar Marques/SOPA Images/LightRocket via Getty Images

YouTube quietly rolled out changes to its algorithm last month in an effort to surface more family-friendly content amid an investigation into the platform by the Federal Trade Commission, according to a new Bloomberg report.

The change essentially led to certain channels being surfaced and recommended to viewers while other channels were passed over. Although YouTube characterized it as one of the routine updates its engineering team rolls out regularly, a spokesperson told Bloomberg the change was an effort to improve the “ability for users to find quality family content.”

Whether the recommended videos are quality family content is up for debate within the YouTube user community. The main YouTube subreddit is full of people complaining about being recommended children’s content, including nursery-rhyme videos from non-English-speaking channels. That complaint lines up with Bloomberg’s reporting, which found that channels that “post nursery rhymes and animated sing-a-longs at an astounding rate” were racking up millions of views.

“I didn’t watch a single video that had to do anything with this, [I] just watching skate vids,” one person on Reddit wrote. “The video before this one had regular recommendations but this one [sic] not a single one isn’t a foreign kids channel.”

Other frustrated commenters shared similar stories, with one person noting that this wasn’t the first time YouTube’s recommendation algorithm seemed to shift toward pushing strange children’s content.

“It seems like they’re just pushing this shit hard on literally the whole platform to grab as many kids’ views as they can,” a user wrote on Reddit earlier today.

The company is in a precarious spot. It has battled numerous controversies and scandals surrounding children’s content over the last couple of years. Reports from The New York Times, Wired, and even investigations conducted by creators have discovered a multitude of issues, including a network of predators using comment sections of videos featuring kids to post disturbing messages. The FTC also launched an investigation into YouTube and the safety of children on the platform. YouTube has repeatedly said that its website is not intended for children under the age of 13.

Changing parts of the recommendation algorithm to surface more family-friendly content is a first step in addressing the problem, but the result is that part of the community is upset with what’s being surfaced. Multiple Reddit and Twitter threads from people who use the site daily cite the same issue: none of these videos is even tangentially related to what they were watching, and the recommendations make using YouTube a more uncomfortable experience.

“These exact videos just came in my recommended too, even though I was just watch video gameplay,” another Reddit user wrote. “So odd, I got a lil creeped out.”

