
Juniper Downs testifies at a Senate hearing on monitoring extremist content online, on Capitol Hill in Washington on Jan. 17, 2018. Tasos Katopodis

Sometimes it’s a hassle being number one. Sure, each day on YouTube, the Google-owned Goliath of video platforms, people upload more than 65 years’ worth of content. But so-called bad actors – terrorists, child abusers, pornographers, conspiracy theorists, hate mongers, and others – are infecting the site in ever more frenzied ways, dragging it into some ugly headlines. On Monday afternoon, a few hours before YouTube was set to release its first-ever quarterly transparency report on the steps it is taking to combat controversial and toxic content, Juniper Downs, the site’s director of public policy and government relations, spoke with The Globe.

Last year, Google announced it would hire 10,000 people to help fight toxic content, from engineers engaged in machine learning to a raft of new content moderators for YouTube. Why has it taken so long to do this?

This is an area we’ve always been invested in. A couple of things have happened at once. The use of these kinds of services by bad actors has continued to increase. And the motivation to exploit these services toward ill ends has increased. Technology has also gotten better. What we’re doing now with machine learning, we couldn’t have done five years ago. So as the technology improves, it can do more of the heavy lifting, to help us detect this content.

Whether it’s machines or people sniffing out the toxic content, though, inevitably there will be overreach and you’ll remove something that hasn’t violated your Community Guidelines. Recently, YouTube was accused of a “purge” of right-wing channels, and of being anti-free speech, because you accidentally removed some content. How much of a concern is that?

That is a major concern. If you think about YouTube’s DNA, we often talk about there being four freedoms that represent the core values of the company: Freedom of expression, freedom of information, freedom of opportunity, and the freedom to belong. Any system is going to make mistakes, which is why we have an appeals process.

Our policies are not crafted to target particular political points of view, they’re crafted to try to make sure we’re keeping the most egregious content off the site. We have a hate speech policy, we don’t allow incitement of violence toward people on the basis of gender, race, religion, sexual orientation, etc. It may be that people on one side of the political spectrum tend to violate that more than the other.

Can you actually solve the problem of controversial or toxic content, when your algorithms are designed to keep people clicking and engaged, and they’re drawn to that kind of content?

We think about growth or engagement with our service as “responsible growth”: How can we engage users in a way where we’re continuing to deliver a reliable, trustworthy product, even when we have to make decisions that create a dip in watch time in the short run? When it comes to [content categories] like news, where obviously veracity is more important than virality: if you’re looking for news about the world, you want to make sure what you’re reading is trustworthy. We have invested a lot there.

Still, there’s been an explosion on your site of those who are – to put it politely – not dealing in facts?

I think a lot of people might want to watch vloggers who opine about current events, and they find that entertaining. But when they’re actually seeking out what happened in Las Vegas, for example, we want to serve authoritative sources for that. We don’t want to serve people who are opining without doing fact-based journalism.

Then how do you explain that Alex Jones, one of the most notorious peddlers of fake news and conspiracy theories, who has suggested that the Sandy Hook shooting was a “false flag” incident – that is, a government-staged hoax – remains on YouTube?

Yeah. He is on the platform, but he’s not part of our news corpus. So if someone is indicating an interest in a news event, Alex Jones is not going to rank highly in those results, because we do try to understand when a user is expressing interest in news and give them authoritative sources for that. This stuff is all obviously really tricky and we could maybe set up more time to talk about how we handle misinformation…

I see I’m running out of time. So let me just ask, are you taking these steps, such as the release of today’s report, because of pressure from your users, or a concern that you’ll face more regulation if you don’t get it right?

We operate in 80 countries around the world, in very complex regulatory landscapes in many of them. I think the primary motivation here is the more intrinsic motivation in staying true to what we want YouTube to be – to our end users, to our creators, to our advertisers, to our viewers, to everyone who’s participating in the ecosystem. We created these policies, because, despite our commitment to access to information and freedom of expression, it’s not “anything goes” on YouTube. We did not create the service to be another porn site, right? So, we’ve never allowed porn. It’s not a value judgment on porn, it’s just that’s not the service we want to create.

A lot of this investment is not wanting our service to fall into the hands of bad actors, and be exploited or manipulated in ways that are not what we intend.

This interview has been condensed and edited.
