In August 2019, five leading content creators whose channels focused on LGBTQ+ material filed a federal lawsuit against YouTube, alleging that YouTube's algorithms divert discovery away from their channels, harming their revenue. The plaintiffs claimed that the algorithms suppress content containing words like "lesbian" or "gay", which are prevalent in their videos, and that YouTube, given its near-monopoly on online video services, was abusing that position.
In January 2018, YouTube creator Logan Paul faced criticism for a video he had uploaded from a trip to Japan showing the body of a suicide victim in the Aokigahara forest. The corpse was visible in the video, although its face was censored. The video proved controversial, with critics deeming its handling of the subject matter insensitive. On January 10, eleven days after the video was published, YouTube announced that it would cut Paul from the Google Preferred advertising program. Six days later, YouTube announced tighter thresholds for the partner program to "significantly improve our ability to identify creators who contribute positively to the community": channels must have at least 4,000 hours of watch time within the past 12 months and at least 1,000 subscribers. YouTube also announced that videos approved for the Google Preferred program would become subject to manual review and would be assigned suitability ratings, with advertisers able to choose which ratings their ads would run against.
In May 2019, YouTube joined an initiative led by France and New Zealand, together with other countries and tech companies, to develop tools to block online hate speech and to draft regulations, implemented at the national level, penalizing technology firms that fail to take steps to remove such speech; the United States declined to participate. Subsequently, on June 5, 2019, YouTube announced a major change to its terms of service, "specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status." YouTube cited as specific examples videos that "promote or glorify Nazi ideology, which is inherently discriminatory". YouTube further stated it would "remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place."
In August 2008, a US court ruled in Lenz v. Universal Music Corp. that copyright holders cannot order the removal of an online file without first determining whether the posting reflected fair use of the material. The case involved Stephanie Lenz from Gallitzin, Pennsylvania, who had made a home video of her 13-month-old son dancing to Prince's song "Let's Go Crazy", and posted the 29-second video on YouTube. In the case of Smith v. Summit Entertainment LLC, professional singer Matt Smith sued Summit Entertainment for the wrongful use of copyright takedown notices on YouTube. He asserted seven causes of action, and four were ruled in Smith's favor.
In June 2007, YouTube began trials of a system for automatic detection of uploaded videos that infringe copyright. Google CEO Eric Schmidt regarded this system as necessary for resolving lawsuits such as the one from Viacom, which alleged that YouTube profited from content that it did not have the right to distribute. The system, initially called "Video Identification" and later known as Content ID, creates an ID File for copyrighted audio and video material and stores it in a database. When a video is uploaded, it is checked against the database, and the system flags the video as a copyright violation if a match is found. When this occurs, the content owner can choose to block the video so it cannot be viewed, to track its viewing statistics, or to add advertisements to it. By 2010, YouTube had "already invested tens of millions of dollars in this technology". In 2011, YouTube described Content ID as "very accurate in finding uploads that look similar to reference files that are of sufficient length and quality to generate an effective ID File". By 2012, Content ID accounted for over a third of the monetized views on YouTube.
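The pipeline described above (fingerprint reference material, store the fingerprints in a database, compare each upload against them, and flag matches) can be sketched in miniature. The sketch below is purely illustrative and is not YouTube's implementation: Content ID uses proprietary perceptual fingerprints robust to re-encoding, while this toy uses exact hashing of short overlapping chunks; all names (`fingerprint`, `ContentDatabase`, the threshold value) are hypothetical.

```python
import hashlib

CHUNK = 4  # samples per chunk; tiny for the demo, real systems fingerprint seconds of media

def fingerprint(samples):
    """Hash overlapping chunks of a sample stream into a set of chunk IDs."""
    return {
        hashlib.sha256(bytes(samples[i:i + CHUNK])).hexdigest()
        for i in range(len(samples) - CHUNK + 1)
    }

class ContentDatabase:
    """Toy reference database: maps a rights holder to a fingerprint set."""

    def __init__(self):
        self.references = {}

    def register(self, owner, samples):
        self.references[owner] = fingerprint(samples)

    def check(self, samples, threshold=0.5):
        """Return (owner, overlap) for the best match above threshold, else None."""
        upload = fingerprint(samples)
        best = None
        for owner, ref in self.references.items():
            overlap = len(upload & ref) / max(len(ref), 1)
            if overlap >= threshold and (best is None or overlap > best[1]):
                best = (owner, overlap)
        return best

db = ContentDatabase()
db.register("label_a", [1, 2, 3, 4, 5, 6, 7, 8])
print(db.check([1, 2, 3, 4, 5, 6]))   # partial copy of the reference: flagged
print(db.check([9, 9, 9, 9, 9]))      # unrelated upload: no match
```

On a flagged match, a production system would then route the upload to the owner's chosen policy (block, track, or monetize), which here would simply be an extra field stored alongside each fingerprint set.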
In December 2012, two billion views were removed from the view counts of Universal and Sony music videos on YouTube, prompting a claim by The Daily Dot that the views had been deleted for violating the site's terms of service, which ban the use of automated processes to inflate view counts. Billboard disputed this, saying that the two billion views had been moved to Vevo, since the videos were no longer active on YouTube. On August 5, 2015, YouTube removed the notorious behavior, originally introduced to prevent view-count fraud, that froze a video's view count at "301" (later "301+") until the actual count was verified; view counts once again updated in real time.