After receiving criticism globally for questionable content uploaded to YouTube, Google has announced that it will hire thousands of new moderators to review content. Reports have recently emerged that videos depicting child abuse had been circulating on the platform.
Google LLC is an American multinational technology company that was founded in 1998 by Larry Page and Sergey Brin while they were Ph.D. students at Stanford University, in California. It specializes in Internet-related services and products. These include online advertising technologies, search, cloud computing, software, and hardware. In August 2015, Google announced plans to reorganize its various interests as a conglomerate called Alphabet Inc.
Alphabet declared that it made $3.5 billion in net income and saw sales of $26 billion in the second quarter of 2017. In June 2017, European Union antitrust regulators slapped Alphabet with a record $2.7 billion fine. The verdict was for the first of three investigations currently being conducted into the search giant. The European regulator accused the company of using its dominance in the industry to push its own advertising business. Google, the EC has noted, uses its considerable clout to promote its own services at the expense of other businesses.
However, the company grew significantly in the second quarter, with revenue up 21% compared with the same period in 2016. Much of Google’s revenue comes from two advertising programs, AdWords and AdSense (Google’s annual revenue stands at $90 billion). Both programs have been under the EC’s scrutiny since 2010.
YouTube is an American video-sharing website headquartered in San Bruno, California. The service was created by three former PayPal employees—Chad Hurley, Steve Chen, and Jawed Karim—in February 2005. Google bought the site in November 2006 for US$1.65 billion; YouTube now operates as one of Google's subsidiaries.
Google and YouTube have been criticized for questionable content that has been allowed to remain on the platform in recent months. The site has been accused of spreading fake news and propaganda about a Texas mass-shooting suspect just weeks after the attack took place.
YouTube has been accused of “infrastructural violence” against children due to its role in the creation of vast quantities of low-quality, disturbing content aimed at pre-schoolers. James Bridle, a campaigning technology-focused artist and writer, has argued that the video platform’s algorithmic curation drives enormous numbers of viewers to content made purely to satisfy those algorithms as closely as possible.
He writes: “These videos, wherever they are made, however they come to be made, and whatever their conscious intention (ie to accumulate ad revenue) are feeding upon a system which was consciously intended to show videos to children for profit. The unconsciously-generated, emergent outcomes of that are all over the place.”
Some families of people killed in high-profile shootings have spent many hours reporting abusive videos about their deceased children, but with little success. It is this global criticism that prompted Google’s decision to hire thousands of new moderators.
In a blog post, YouTube CEO Susan Wojcicki noted, “As the CEO of YouTube, I’ve seen how our open platform has been a force for creativity, learning and access to information… But I’ve also seen up-close that there can be another, more troubling, side of YouTube’s openness. I’ve seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm.”
She added, “Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content. We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.”
Since June 2017, YouTube’s trust and safety teams have reportedly reviewed 2 million videos for violent extremist content and helped train machine-learning technology to identify similar videos in the future.
Our assessment is that, unlike machines operating on algorithms, human beings can still exercise judgement in deciphering the intent of content. On a platform as vast as YouTube, however, it will be difficult to police and monitor every one of the millions of videos uploaded regularly. Even so, more manpower should produce better results.