NEW YORK (AP) — YouTube is hiring more people to help curb videos that violate its policies.
YouTube CEO Susan Wojcicki said “some bad actors are exploiting” the Google-owned service to “mislead, manipulate, harass or even harm.”
Google will have more than 10,000 workers addressing the problem by next year, though Wojcicki's blog post Monday did not say how many the company already has. Spokeswoman Michelle Slavich said Tuesday that some have already been hired and that the team will be a mix of employees and contractors.
Wojcicki said the company will apply lessons learned from combating violent and extremist videos to other "problematic" videos. YouTube will expand its use of machine-learning technology, a form of artificial intelligence, to flag videos or comments that contain hate speech or depict harm to children. It has already been using the technology to help remove violent extremist videos.
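The company has not described how its flagging models work, but the general approach alluded to here, training a text classifier on examples of policy-violating content so that likely violations can be surfaced for human review, can be sketched in a few lines. The snippet below is an illustrative toy using scikit-learn with invented example comments and labels; it is not YouTube's system.

```python
# Illustrative sketch only: a toy comment classifier of the kind broadly
# described above, NOT YouTube's actual system. All example comments and
# labels are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, made-up training set: 1 = should be flagged for human review, 0 = benign.
comments = [
    "I will hurt you and everyone like you",
    "people of your kind don't deserve to exist",
    "great video, thanks for sharing",
    "loved the editing on this one",
]
labels = [1, 1, 0, 0]

# TF-IDF text features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

# Score new comments; anything above a threshold would be routed to reviewers.
new_comments = ["you people make me sick", "what camera do you use?"]
scores = model.predict_proba(new_comments)[:, 1]
for text, score in zip(new_comments, scores):
    print(f"{score:.2f}  {'FLAG FOR REVIEW' if score > 0.5 else 'ok'}  {text}")
```

In practice such a system acts as a triage step: the model ranks content by risk, and human reviewers, the workers YouTube says it is hiring, make the final call.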
Several advertisers have reportedly pulled ads from YouTube in recent weeks following reports of videos showing harm to children, hate speech and other content they don't want their ads appearing next to. Earlier this year, some 250 advertisers said they would boycott YouTube because of extremist videos that promoted hate and violence. YouTube said Monday that it is also taking steps to reassure advertisers that their ads won't run alongside objectionable videos.
In recent weeks there have been reports of disturbing videos aimed at children, as well as of pedophiles posting comments on children's videos. A conspiracy-theory video also surfaced in YouTube search results a day after the October Las Vegas shooting that killed dozens.
Google isn't the only tech company stepping up human content review to police its platform after criticism. Facebook said in May that it would hire 3,000 more people to review videos and posts, and later said it would add another 1,000 to review ads after discovering Russian-bought ads meant to influence the U.S. presidential election.