MARY LOUISE KELLY, HOST:
Now it's time for All Tech Considered.
(SOUNDBITE OF ULRICH SCHNAUSS' "NOTHING HAPPENS IN JUNE")
KELLY: Hate speech, terrorism, foreign interference in American democracy, fake news - we have spent many hours talking about what went wrong online during the 2016 presidential election. Well, tech companies are working hard to avoid a repeat as the 2018 midterm elections approach. They're fast upon us. And here to tell us what Facebook, Google and Twitter are doing is NPR tech correspondent Aarti Shahani. Hey, Aarti.
AARTI SHAHANI, BYLINE: Hi.
KELLY: Let's talk first about Facebook. It took a long time for CEO Mark Zuckerberg to acknowledge that Facebook was a platform for fake news, fake ads during the 2016 election. What are they doing to get ready for this coming November?
SHAHANI: Well, none of the tech giants claims to be ready, OK? They each have a version of, we're getting there - but only for ads explicitly about candidates like Donald Trump or Hillary Clinton, not ads about issues like DACA or global warming. Facebook says it's creating a new documentation process where advertisers may be required to verify who they are and where they are.
KELLY: And then I suppose there's the process of verifying that they're telling the truth, which is another layer of this.
SHAHANI: That's right. You're getting it, exactly.
KELLY: Yeah, well, let me move you to Google and YouTube - which, we should mention, is part of Google. What changes are they making?
SHAHANI: So Google does not yet have a working definition of what counts as an election ad. They're working on that. And what each company is desperately trying to stave off is any kind of regulation. They don't want politicians putting rules on them like legacy media has.
KELLY: What about, Aarti, the favored platform of our tweeter in chief? I'm talking of course about Twitter. Their latest move, I was interested to read, is that Twitter is reaching out to close to 700,000 people here in the U.S., letting them know they may have been shown political ads on their Twitter feeds that had some kind of link to Russia.
SHAHANI: Yeah. And you know, Twitter is much smaller than Google and Facebook. And a spokesperson at Twitter tells me their investigation into the 2016 election mishap is setting back their efforts to fix the broken advertising tools in time for the midterms. They're concerned about that. And now, you know, each company has a fundamental tension.
By way of analogy, let's take a nightclub. The official policy would be no one under 21 allowed. But you don't want to check ID because you're afraid that'll slow down the line and you'll lose customers, so you try to figure out some workaround, some other way to do it. Now, when I shared that (laughter) specific analogy with Twitter, I've got to say, the spokesperson laughed very hard and told me, these are exactly the questions we're asking.
KELLY: One related thing - I wanted to follow up on my question about Facebook. Facebook said last week that it's going to start picking and choosing what news to show us, what news we're going to see. This is part of the quest to show us less fake news.
SHAHANI: Yeah. And you know, CEO Mark Zuckerberg - he posted on his Facebook page that the company is going to start showing more news from sources that are high-quality, in his words, and push down stuff that purports to be news but may be lower-quality. This is the single biggest announcement the company has made directly in response to the fake news controversy.
And you know, it is important to point out that many experts say that this is how a powerful tech giant censors speech on the Internet. Now, granted, Facebook is not zapping away articles. But in an attention economy - right? - it's a competition for people's eyeballs here - the company can censor by pushing things so far down a bottomless feed that eyeballs never get to see them.
KELLY: Which of course prompts the question of how Facebook is going to decide what counts as high-quality news and what's low-quality news.
SHAHANI: They're going to let the crowd decide. They're asking Facebook users to vote on it.
KELLY: What's been the reaction from mainstream news media to that?
SHAHANI: Well, veteran news leaders are, to put it mildly, horrified that Facebook would take an important decision like that and throw it to a popular vote. There's another kind of criticism among conservatives who are concerned that liberal Facebook will push down right-wing content. And meanwhile, Facebook has decided they're not offering interviews to talk about the rationale for their move.
KELLY: NPR's Aarti Shahani - thank you, Aarti.
SHAHANI: Thank you.
(SOUNDBITE OF MINOTAUR SHOCK'S "MY BURR")