

YouTube’s use of machine learning to detect content made for kids has irked content creators



YouTube has become the new cable TV of this decade. A report by Sandvine revealed that as of March 2019, YouTube was responsible for 37% of total worldwide downstream web traffic on smartphones. The platform is leaps ahead of its closest competitor, Facebook, which accounts for only 8.4%. Its worldwide audience has made YouTube one of the most profitable content streaming services, if not the most profitable.

However, as YouTube has gained more and more web traffic over the years, content creators on the platform have complained that its policies have become hostile towards them. YouTube's use of machine learning and AI tools to decide whether videos are family friendly and should be monetized has been particularly controversial.

Furthermore, content creators on the platform have also been critical of YouTube's copyright claim system. They allege that the three-strikes-based system is heavily biased towards claimants and does not give creators an opportunity to defend themselves. However, YouTube has stood its ground on these issues, and creators have had no choice but to comply.

Now, YouTube and its content creators find themselves at a crossroads once again.

YouTube’s $170 Million Settlement for COPPA Guidelines Violations

Back in September, Google and YouTube agreed to pay a whopping $170 million fine to settle allegations put forth against them by the US Federal Trade Commission and the New York Attorney General. In those allegations, the FTC and New York AG claimed that YouTube had illegally collected personal information from children and violated the Children’s Online Privacy Protection Act (COPPA).

The fine is a record for a COPPA-related case. It may seem like a drop in the bucket to Alphabet Inc., YouTube's parent company, which made around $10 billion in net profit in the second quarter of 2019. However, the settlement agreement requires YouTube to implement a system that allows channel owners on the platform to identify content made for children. The new policy change is meant to help YouTube comply with COPPA rules in the future.

Furthermore, YouTube must provide annual COPPA compliance training to staff who deal with content creators. Google and YouTube will also be required to alert channel owners that their videos may be subject to COPPA's obligations.

New policies to protect kids' data

In a blog post, YouTube CEO Susan Wojcicki announced major changes in the way the platform will handle content made for kids going forward. She said that the platform will treat any data coming from viewers watching content made for children as coming from a child, regardless of the viewer's age. The platform will not serve personalized ads on such content and will disable certain other features, such as notifications, recommendations, and comments.

Wojcicki said that these policies will be rolled out over the course of four months to give family and kid creators time to adjust. She acknowledged that these creators will face some challenges in complying with the new policies, but said that the platform is "committed to working with them through this transition".

Using machine learning to find content made for kids

Wojcicki's statement also included a brief overview of what the new changes mean for creators on the platform. She stated that content creators will be required to tell YouTube when their content falls into this category. Wojcicki also announced that YouTube will start using machine learning to identify content that is clearly targeted at younger audiences, including content that heavily focuses on kids' characters, themes, toys, or games.

Creators in disarray

Soon after Susan Wojcicki's announcement, several videos from content creators started popping up on the platform expressing frustration and confusion over the matter. Creators are wary of once again being handed off to machine learning and AI tools that decide whether certain videos on their channels will be monetized.

In the past, YouTube's machine learning algorithms have been known to ruthlessly deny monetization to any video containing non-family-friendly content. Any swear words, or charged topics such as white supremacism, could get videos demonetized, and creators would lose all revenue as well as the production costs of those videos.

What’s ‘Made for kids’?

While there are channels on YouTube that clearly target their content towards kids, there are also channels that create family-friendly content without targeting kids specifically. That doesn't change the fact that this content is still viewed by a lot of kids.

Over the last couple of weeks, many content creators on YouTube have noticed that their videos were being tagged 'Made for kids'. The tag brings heavy monetization cuts and other liabilities for creators. However, many creators are complaining that content that was never targeted at children is being unfairly tagged by YouTube's algorithms.

Content creators who work on reshaping kids' toys, make content about dolls, or create animations are seeing their videos lose a lot of revenue because they have been deemed 'Made for kids' by YouTube. YouTube's biggest star, PewDiePie, also recently uploaded a video expressing his frustration and confusion over the new policies and the 'Made for kids' tag. Creators have also been voicing their concerns on social media platforms such as Twitter and Facebook.

What's most concerning for content creators is that whether they mark their videos as 'Made for kids' themselves or the YouTube algorithm tags them, they become liable to legal repercussions from the FTC. Many channels are afraid of getting washed away by this wave of new rules and policies.
