Access to specific videos is sometimes blocked under copyright and intellectual property laws (e.g. in Germany), because of hate speech violations, or to prevent access to videos judged inappropriate for youth, which YouTube itself also does through the YouTube Kids app and "restricted mode". Businesses, schools, government agencies, and other private institutions often block social media sites, including YouTube, because of bandwidth limitations and the sites' potential for distraction.
In September 2019, YouTube was fined $170 million by the FTC for collecting personal information from minors under the age of 13 (in particular, viewing history) without parental consent, in order to allow channel operators to serve targeted advertising on their videos. The FTC ruled that YouTube was partly liable under COPPA, as the service's rating and curation of content as suitable for children constituted targeting the website towards children. To comply with the settlement, YouTube was ordered to "develop, implement, and maintain a system for Channel Owners to designate whether their Content on the YouTube Service is directed to Children." YouTube also announced that it would invest $100 million over the next three years to support the creation of "thoughtful, original children's content".
Leading into 2017, there was a significant increase in the number of videos related to children, driven by the popularity of parents vlogging their family's activities, combined with previous content creators moving away from content that was often criticized or demonetized and toward family-friendly material. In 2017, YouTube reported that time spent watching family vloggers had increased by 90%. However, with the increase in videos featuring children, the site began to face several controversies related to child safety. During Q2 2017, the owners of the popular channel DaddyOFive, which featured them playing "pranks" on their children, were accused of child abuse. Their videos were eventually deleted, and two of their children were removed from their custody. A similar case occurred in 2019, when the owner of the channel Fantastic Adventures was accused of abusing her adopted children. Her videos were later deleted.
In June 2007, YouTube began trials of a system for automatic detection of uploaded videos that infringe copyright. Google CEO Eric Schmidt regarded this system as necessary for resolving lawsuits such as the one from Viacom, which alleged that YouTube profited from content that it did not have the right to distribute. The system, initially called "Video Identification" and later known as Content ID, creates an ID File for copyrighted audio and video material and stores it in a database. When a video is uploaded, the system checks it against the database and flags it as a copyright violation if a match is found. When this occurs, the content owner has the choice of blocking the video to make it unviewable, tracking the viewing statistics of the video, or adding advertisements to the video. By 2010, YouTube had "already invested tens of millions of dollars in this technology". In 2011, YouTube described Content ID as "very accurate in finding uploads that look similar to reference files that are of sufficient length and quality to generate an effective ID File". By 2012, Content ID accounted for over a third of the monetized views on YouTube.
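The matching workflow described above can be sketched roughly as follows. This is a minimal illustrative model, not YouTube's actual implementation: the class and policy names are assumptions, and a real system would use fuzzy perceptual fingerprint matching rather than the exact dictionary lookup used here.

```python
# Hypothetical sketch of a Content ID-style workflow: register reference
# fingerprints ("ID Files") in a database, check each upload against it,
# and apply the content owner's chosen policy on a match.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Policy(Enum):
    BLOCK = "block"        # make the video unviewable
    TRACK = "track"        # collect viewing statistics only
    MONETIZE = "monetize"  # add advertisements to the video


@dataclass
class ReferenceFile:
    owner: str
    fingerprint: str  # stands in for the ID File derived from audio/video
    policy: Policy


class ReferenceDatabase:
    def __init__(self) -> None:
        self._refs: dict = {}

    def register(self, ref: ReferenceFile) -> None:
        self._refs[ref.fingerprint] = ref

    def check_upload(self, upload_fingerprint: str) -> Optional[ReferenceFile]:
        # A real system matches perceptually similar fingerprints;
        # an exact lookup stands in for that here.
        return self._refs.get(upload_fingerprint)


db = ReferenceDatabase()
db.register(ReferenceFile(owner="StudioX", fingerprint="abc123",
                          policy=Policy.MONETIZE))

match = db.check_upload("abc123")
action = match.policy if match else None  # owner's policy applies on a match
```

The key design point is that enforcement is decided per reference file by the rights holder, not globally by the platform, which is why the same matched upload can be blocked in one case and monetized in another.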
In February 2019, YouTube vlogger Matt Watson identified a "wormhole" that would cause the YouTube recommendation algorithm to draw users into this type of video content, and make all of that user's recommended content feature only these types of videos. Many of these videos carried comments from sexual predators giving timestamps of when the children were shown in compromising positions, or making other indecent remarks. In some cases, other users had reuploaded the videos in unlisted form but with incoming links from other videos, and then monetized them, propagating this network. In the wake of the controversy, the service reported that it had deleted over 400 channels and tens of millions of comments, and had reported the offending users to law enforcement and the National Center for Missing and Exploited Children. A spokesperson explained that "any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. There's more to be done, and we continue to work to improve and catch abuse more quickly." Despite these measures, AT&T, Disney, Dr. Oetker, Epic Games, and Nestlé all pulled their advertising from YouTube.
YouTube entered into a marketing and advertising partnership with NBC in June 2006. In March 2007, it struck a deal with the BBC for three channels with BBC content, one for news and two for entertainment. In November 2008, YouTube reached an agreement with MGM, Lions Gate Entertainment, and CBS, allowing the companies to post full-length films and television episodes on the site, accompanied by advertisements, in a section for U.S. viewers called "Shows". The move was intended to create competition with websites such as Hulu, which features material from NBC, Fox, and Disney. In November 2009, YouTube launched a version of "Shows" available to UK viewers, offering around 4,000 full-length shows from more than 60 partners. In January 2010, YouTube introduced an online film rental service, available as of 2010 only to users in the United States, Canada, and the UK, offering over 6,000 films.