YouTube

When did youtube implement the Google+ interface?

On November 6, 2013, Google implemented a comment system based on Google+ that required all YouTube users to use a Google+ account in order to comment on videos. The stated motivation for the change was to give creators more power to moderate and block comments, thereby addressing frequent criticisms of their quality and tone. The new system restored the ability to include URLs in comments, which had previously been removed due to problems with abuse. In response, YouTube co-founder Jawed Karim posted the question "why the fuck do I need a google+ account to comment on a video?" on his YouTube channel to express his negative opinion of the change. The official YouTube announcement received 20,097 "thumbs down" votes and generated more than 32,000 comments in two days. Writing in the Newsday blog Silicon Island, Chase Melvin noted that "Google+ is nowhere near as popular a social media network as Facebook, but it's essentially being forced upon millions of YouTube users who don't want to lose their ability to comment on videos" and "Discussion forums across the Internet are already bursting with outcry against the new comment system".


People Also Ask

  • In January 2018, YouTube creator Logan Paul faced criticism for a video he had uploaded from a trip to Japan, in which he encountered the body of a suicide victim in the Aokigahara forest. The corpse was visible in the video, although its face was censored. The video proved controversial due to its content, with its handling of the subject matter being deemed insensitive by critics. On January 10, eleven days after the video was published, YouTube announced that it would cut Paul from the Google Preferred advertising program. Six days later, YouTube announced tighter thresholds for the partner program to "significantly improve our ability to identify creators who contribute positively to the community", under which channels must have at least 4,000 hours of watch time within the past 12 months and at least 1,000 subscribers. YouTube also announced that videos approved for the Google Preferred program would become subject to manual review, and that videos would be rated based on suitability, with advertisers allowed to choose accordingly.

  • In May 2014, before the Music Key service was launched, the independent music trade organization Worldwide Independent Network alleged that YouTube was using non-negotiable contracts with independent labels that were "undervalued" in comparison to other streaming services, and that YouTube would block all music content from labels that did not reach a deal to be included on the paid service. In a statement to the Financial Times in June 2014, Robert Kyncl confirmed that YouTube would block the content of labels that did not negotiate deals to be included in the paid service "to ensure that all content on the platform is governed by its new contractual terms." Stating that 90% of labels had reached deals, he went on to say that "while we wish that we had [a] 100% success rate, we understand that is not likely an achievable goal and therefore it is our responsibility to our users and the industry to launch the enhanced music experience." The Financial Times later reported that YouTube had reached an aggregate deal with Merlin Network, a trade group representing over 20,000 independent labels, for their inclusion in the service. However, YouTube itself has not confirmed the deal.

  • In August 2008, a US court ruled in Lenz v. Universal Music Corp. that copyright holders cannot order the removal of an online file without first determining whether the posting reflected fair use of the material. The case involved Stephanie Lenz from Gallitzin, Pennsylvania, who had made a home video of her 13-month-old son dancing to Prince's song "Let's Go Crazy", and posted the 29-second video on YouTube. In the case of Smith v. Summit Entertainment LLC, professional singer Matt Smith sued Summit Entertainment for the wrongful use of copyright takedown notices on YouTube. He asserted seven causes of action, and four were ruled in Smith's favor.

  • Even for content that appears to be aimed at children and to contain only child-friendly material, YouTube's system allows uploaders to remain anonymous. Such concerns have been raised before, as YouTube has had to remove channels whose children's content, after becoming popular, suddenly began including inappropriate material masked as children's content. Additionally, some of the most-watched children's programming on YouTube comes from channels with no identifiable owners, raising concerns about their intent and purpose. One channel of concern was "Cocomelon", which provided numerous mass-produced animated videos aimed at children. Through 2019 it had drawn up to US$10 million a month in ad revenue and was one of the largest kid-friendly channels on YouTube prior to 2020. Ownership of Cocomelon was unclear beyond its ties to "Treasure Studio", itself an unknown entity, raising questions about the channel's purpose; however, in February 2020 Bloomberg News was able to confirm and interview the small team of American owners behind Cocomelon, who stated that their goal was simply to entertain children and that they kept to themselves to avoid attention from outside investors. The anonymity of such channels raises concerns because little is known about what purpose they are trying to serve. The difficulty of identifying who operates these channels "adds to the lack of accountability", according to Josh Golin of the Campaign for a Commercial-Free Childhood, and educational consultant Renée Chernow-O'Leary found the videos were designed to entertain with no intent to educate, leading both critics and parents to worry about their children becoming too enraptured by the content from these channels. Content creators who earnestly make kid-friendly videos have found it difficult to compete with larger channels such as ChuChu TV, as they are unable to produce content at the same rate and lack the promotion through YouTube's recommendation algorithms that the larger animated channel networks share.

  • In the wake of the March 2019 Christchurch mosque attacks, YouTube and other sites that allow user-submitted content, such as Facebook and Twitter, drew criticism for doing little to moderate and control the spread of hate speech, which was considered a factor in the rationale for the attacks. These platforms were pressured to remove such content, but in an interview with The New York Times, YouTube's chief product officer Neal Mohan said that, unlike content such as ISIS videos, which follow a particular format and are thus easy to detect through computer-aided algorithms, general hate speech was more difficult to recognize and handle, and therefore could not readily be removed without human intervention.

