YouTube

In what year was the owner of the Fantastic Adventures channel accused of abusing her adopted children?

Leading into 2017, there was a significant increase in the number of videos related to children, driven both by the popularity of parents vlogging their families' activities and by existing content creators shifting away from material that was often criticized or demonetized toward family-friendly content. In 2017, YouTube reported that time spent watching family vloggers had increased by 90%. However, with the increase in videos featuring children, the site began to face several controversies related to child safety. During Q2 2017, the owners of the popular channel DaddyOFive, which featured them playing "pranks" on their children, were accused of child abuse. Their videos were eventually deleted, and two of their children were removed from their custody. A similar case arose in 2019, when the owner of the channel Fantastic Adventures was accused of abusing her adopted children; her videos were later deleted.


People Also Ask

  • In May 2019, YouTube joined an initiative led by France and New Zealand, alongside other countries and technology companies, to develop tools for blocking online hate speech and to develop regulations, implemented at the national level, to penalize technology firms that failed to take steps to remove such speech, though the United States declined to participate. Subsequently, on June 5, 2019, YouTube announced a major change to its terms of service, "specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status." YouTube identified specific examples of such videos as those that "promote or glorify Nazi ideology, which is inherently discriminatory". YouTube further stated it would "remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place."

  • In May 2014, before the Music Key service launched, the independent music trade organization Worldwide Independent Network alleged that YouTube was using non-negotiable contracts with independent labels that were "undervalued" in comparison to other streaming services, and that YouTube would block all music content from labels that did not reach a deal to be included on the paid service. In a statement to the Financial Times in June 2014, Robert Kyncl confirmed that YouTube would block the content of labels that did not negotiate deals to be included in the paid service "to ensure that all content on the platform is governed by its new contractual terms." Stating that 90% of labels had reached deals, he went on to say that "while we wish that we had [a] 100% success rate, we understand that is not likely an achievable goal and therefore it is our responsibility to our users and the industry to launch the enhanced music experience." The Financial Times later reported that YouTube had reached an aggregate deal with Merlin Network, a trade group representing over 20,000 independent labels, for their inclusion in the service; however, YouTube itself has not confirmed the deal.

  • Controversial content has included material relating to Holocaust denial and the Hillsborough disaster, in which 96 football fans from Liverpool were crushed to death in 1989. In July 2008, the Culture and Media Committee of the House of Commons of the United Kingdom stated that it was "unimpressed" with YouTube's system for policing its videos, and argued that "proactive review of content should be standard practice for sites hosting user-generated content". YouTube responded by defending its reliance on users flagging inappropriate videos for review.

  • In the wake of the March 2019 Christchurch mosque attacks, YouTube and other sites that allowed user-submitted content, such as Facebook and Twitter, drew criticism for doing little to moderate and control the spread of hate speech, which was considered a factor in the rationale for the attacks. These platforms were pressured to remove such content, but in an interview with The New York Times, YouTube's chief product officer Neal Mohan said that unlike content such as ISIS videos, which takes a particular format and is thus easy to detect through computer-aided algorithms, general hate speech was more difficult to recognize and handle, and the company therefore could not readily remove it without human involvement.

  • In January 2018, YouTube creator Logan Paul faced criticism for a video he had uploaded from a trip to Japan, in which he encountered the body of a suicide victim in the Aokigahara forest. The corpse was visible in the video, although its face was censored. The video proved controversial due to its content, with its handling of the subject matter deemed insensitive by critics. On January 10, eleven days after the video was published, YouTube announced that it would cut Paul from the Google Preferred advertising program. Six days later, YouTube announced tighter thresholds for the partner program to "significantly improve our ability to identify creators who contribute positively to the community", under which channels must have at least 4,000 hours of watch time within the past 12 months and at least 1,000 subscribers. YouTube also announced that videos approved for the Google Preferred program would become subject to manual review, and that videos would be rated based on suitability (with advertisers allowed to choose).
