YouTube

When did the New York Times make its report?

In June 2019, The New York Times cited researchers who found that users who watched erotic videos could be recommended seemingly innocuous videos of children. In response, Senator Josh Hawley announced plans to introduce federal legislation that would ban YouTube and other video-sharing sites from including videos that predominantly feature minors as "recommended" videos, excluding those that were "professionally produced", such as videos of televised talent shows. YouTube suggested it might remove all videos featuring children from the main YouTube site and transfer them to the YouTube Kids site, where it would have stronger controls over the recommendation system, alongside other major changes to the main site's recommendation and autoplay systems.


People Also Ask

  • In June 2007, YouTube began trials of a system for automatic detection of uploaded videos that infringe copyright. Google CEO Eric Schmidt regarded this system as necessary for resolving lawsuits such as the one from Viacom, which alleged that YouTube profited from content that it did not have the right to distribute. The system, initially called "Video Identification" and later known as Content ID, creates an ID File for copyrighted audio and video material and stores it in a database. When a video is uploaded, it is checked against the database and flagged as a copyright violation if a match is found (a conceptual sketch of this matching flow follows the list below). When this occurs, the content owner has the choice of blocking the video to make it unviewable, tracking the viewing statistics of the video, or adding advertisements to the video. By 2010, YouTube had "already invested tens of millions of dollars in this technology". In 2011, YouTube described Content ID as "very accurate in finding uploads that look similar to reference files that are of sufficient length and quality to generate an effective ID File". By 2012, Content ID accounted for over a third of the monetized views on YouTube.

  • During the same court battle, Viacom won a court ruling requiring YouTube to hand over 12 terabytes of data detailing the viewing habits of every user who had watched videos on the site. The decision was criticized by the Electronic Frontier Foundation, which called the ruling "a setback to privacy rights". In June 2010, Viacom's lawsuit against Google was rejected in a summary judgment, with U.S. District Judge Louis L. Stanton stating that Google was protected by provisions of the Digital Millennium Copyright Act. Viacom announced its intention to appeal the ruling. On April 5, 2012, the United States Court of Appeals for the Second Circuit reinstated the case, allowing Viacom's lawsuit against Google to be heard in court again. On March 18, 2014, the lawsuit was settled after seven years with an undisclosed agreement.

  • Leading into 2017, there was a significant increase in the number of videos related to children, driven both by the popularity of parents vlogging their families' activities and by content creators moving away from material that was often criticized or demonetized toward family-friendly content. In 2017, YouTube reported that time spent watching family vloggers had increased by 90%. However, with the increase in videos featuring children, the site began to face several controversies related to child safety. During the second quarter of 2017, the owners of the popular channel DaddyOFive, which featured them playing "pranks" on their children, were accused of child abuse. Their videos were eventually deleted, and two of their children were removed from their custody. A similar case occurred in 2019, when the owner of the channel Fantastic Adventures was accused of abusing her adopted children; the channel's videos were later deleted.

  • Even for content that appears to be aimed at children and to contain only child-friendly material, YouTube's system allows the uploaders of these videos to remain anonymous. These questions have been raised before, as YouTube has had to remove channels with children's content that, after becoming popular, suddenly included inappropriate content masked as children's content. Additionally, some of the most-watched children's programming on YouTube comes from channels with no identifiable owners, raising concerns about intent and purpose. One channel of concern was "Cocomelon", which provided numerous mass-produced animated videos aimed at children. Through 2019, it drew up to US$10 million a month in ad revenue and was one of the largest kid-friendly channels on YouTube prior to 2020. Ownership of Cocomelon was unclear outside of its ties to "Treasure Studio", itself an unknown entity, raising questions about the channel's purpose; however, Bloomberg News was able to confirm and interview the small team of American owners behind "Cocomelon" in February 2020, who stated that their goal was simply to entertain children and that they kept to themselves to avoid attention from outside investors. The anonymity of such channels raises concerns because of the lack of knowledge about what purpose they are trying to serve. The difficulty of identifying who operates these channels "adds to the lack of accountability", according to Josh Golin of the Campaign for a Commercial-Free Childhood, and educational consultant Renée Chernow-O'Leary found the videos were designed to entertain with no intent to educate, leading both critics and parents to worry about their children becoming too enraptured by the content from these channels. Content creators who earnestly make kid-friendly videos have found it difficult to compete with larger channels such as ChuChu TV, as they are unable to produce content at the same rate and lack the same promotion through YouTube's recommendation algorithms that the larger animated channel networks share.

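As a rough illustration of the Content ID matching flow described in the list above, the following Python sketch models the workflow: rights holders register reference "ID Files" along with an enforcement policy, and each upload is fingerprinted and checked against that database. All names here (fingerprint, ReferenceDatabase, Policy) are illustrative assumptions, and a plain cryptographic hash stands in for the perceptual fingerprinting a real system would use; this is not YouTube's actual implementation.

```python
# A minimal sketch of a Content ID-style matching pipeline, not
# YouTube's actual implementation. Names and structure are assumptions.
from __future__ import annotations

import hashlib
from dataclasses import dataclass
from enum import Enum


class Policy(Enum):
    """Actions a rights holder can attach to a reference ID File."""
    BLOCK = "block"        # make matching uploads unviewable
    TRACK = "track"        # only collect viewing statistics
    MONETIZE = "monetize"  # run ads on matching uploads


def fingerprint(media: bytes) -> str:
    """Stand-in for a perceptual audio/video fingerprint.

    A real system derives signatures robust to re-encoding and edits;
    the cryptographic hash used here only matches exact copies.
    """
    return hashlib.sha256(media).hexdigest()


@dataclass
class Reference:
    owner: str
    policy: Policy


class ReferenceDatabase:
    """Maps fingerprints of registered reference material to policies."""

    def __init__(self) -> None:
        self._refs: dict[str, Reference] = {}

    def register(self, media: bytes, owner: str, policy: Policy) -> None:
        """Store an ID File's fingerprint with the owner's chosen policy."""
        self._refs[fingerprint(media)] = Reference(owner, policy)

    def check_upload(self, media: bytes) -> Reference | None:
        """Return the matching claim, or None if the upload is clean."""
        return self._refs.get(fingerprint(media))


# Usage: a rights holder registers a clip, then an upload is checked.
db = ReferenceDatabase()
db.register(b"<episode footage>", owner="StudioCo", policy=Policy.MONETIZE)

claim = db.check_upload(b"<episode footage>")
if claim:
    print(f"Matched reference owned by {claim.owner}; applying '{claim.policy.value}'")
else:
    print("No match; upload proceeds unclaimed")
```

A production fingerprint must tolerate re-encoding, cropping, and pitch or speed changes, which is why YouTube describes Content ID as comparing uploads against reference files "of sufficient length and quality" rather than requiring exact byte-for-byte matches.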
