Bully Pulpit
It Wasn't Me. It Was My Dog.

Facebook Shuts Down Independent Researchers and Falsely Blames the Government

Bob talks with Princeton scholar Orestis Papakyriakopoulos about the social media titan’s latest assault on transparency, and the all-too-familiar blame-shifting that followed it. That has become standard operating procedure from a company Bob describes as “amoral, except when it’s immoral.”

TEDDY ROOSEVELT: Surely there never was a fight better worth making than the one which we are in.

BOB GARFIELD: Welcome to Bully Pulpit. That was Teddy Roosevelt, I'm Bob Garfield. Episode 4: It Wasn't Me, It Was My Dog. Last week, Facebook abruptly shut down a research program by scholars at New York University's Ad Observatory who had been monitoring the company's political advertising inventory.

NEWSCASTER: Now, this whole battle started on Tuesday when Facebook disabled the accounts of researchers at the NYU Ad Observatory, Facebook explaining, quote, “NYU’s Ad Observatory project studied political ads using unauthorized means to access and collect data from Facebook in violation of our terms of service. We took these actions to stop unauthorized scraping and protect people's privacy in line with our privacy program under the FTC order.”

BG: Yes, Facebook's product management director, Mike Clark, claimed in a blog post that the company's hands were tied by the government. You know, just like Son of Sam claimed it was his dog who ordered him to kill.

Within 24 hours, Wired magazine and others revealed that the FTC consent order required no such thing. Even the agency's Bureau of Consumer Protection weighed in, with acting director Samuel Levine writing to Facebook founder Mark Zuckerberg saying, quote, “I am disappointed by how your company has conducted itself in this matter.”

Please note that Levine didn't say surprised, just disappointed, because the history of Facebook is the history of Facebook conducting itself in disappointing ways, voicing shame and regret from the bottom of its heart, and then returning to deceptive and greedy business as usual.

MARK ZUCKERBERG (MONTAGE): We didn't take a broad enough view of our responsibility, and that was a big mistake and it was my mistake. This was a major breach of trust and, and I'm really sorry that this happened. We have a basic responsibility to protect people's data. And if we can't do that, then we don't deserve to have the opportunity to serve people.
NEWSCASTER: In 2003, Zuckerberg apologized in the Harvard Crimson for any harm done after his website FaceMash asked users to rate people's hotness. Three years later, Zuckerberg said Facebook, quote, “really messed this one up,” following user complaints that the newly launched news feed invaded their privacy.
NEWSCASTER: Zuckerberg apologized once again in 2007 for an uproar over the company's Beacon advertising system, saying, “I know we can do better.”

BG: That last part courtesy of CBS News. So the FTC wasn't surprised about the latest phony excuse for systematic opacity, and neither was Orestis Papakyriakopoulos, a postdoctoral research associate at Princeton University's Center for Information Technology Policy. He's speaking to me from Athens, Greece. Orestis, welcome to Bully Pulpit.

ORESTIS PAPAKYRIAKOPOULOS: Glad to be here, Bob.

BG: All right, we'll get to your work shortly. But I want to begin with the NYU project. What were they studying?

OP: So, the NYU researchers had an Ad Observatory project. They were trying to monitor what ads are placed on Facebook and who sees them, like which demographics are targeted and so on — in order to provide additional transparency on how online advertising takes place.

BG: And what was the method? Were they, in fact, scraping content or metadata from the site in some clandestine fashion, as Facebook alleged?

OP: No, actually, they developed a plugin that you put on your browser, the Ad Observer, and they asked users all over the world to use the plugin, and practically the plugin was recording what the users saw. So in this way, they could see which ads a user was targeted with.

BG: Wait, so when Facebook invoked protecting user privacy, all of the users had proactively downloaded the browser extension and were giving explicit permission to the NYU people to see what ads they were being served?

OP: Exactly, but when Facebook uses the term users, they mean the advertisers who placed the ads. The advertisers did not give their permission to NYU to collect the information about the targeted ads.

BG: [chuckling]

OP: Yeah, exactly.

BG: I see, so the advertisers who pay money to have their ads seen were skittish about having their ads seen.

OP: Exactly.

BG: Now, the whole point of the Facebook algorithm is that consumers get more and more content they have demonstrated interest in by clicking on it or commenting or sharing. That very same algorithm, though, takes the same user behavior data and allows advertisers to micro-target exactly the consumer profile they're most interested in, whether to buy a car or toothpaste or a political worldview.

OP: Yeah, so Facebook's business model to this day is to use the data they collect to place personalized advertisements, and they sell the space and they sell the tools they've developed so advertisers can place their ads.

BG: Selling the tools they've developed. This gets to the next sensitive area of privacy, because the FTC order that the company invoked last week came with a five billion dollar fine for violating an earlier 2012 consent decree, after Facebook was caught being not only careless but mercenary with users' personal data. Can you remind me of the specifics of the original complaint?

OP: Sure. So back in 2012, the FTC claimed that Facebook was violating numerous privacy rules. More specifically, for example, users believed that they had set their accounts to private, or that some information on their profiles was not public, but advertisers still had the opportunity to collect this data. Another example of what was violated back then: although users were deleting their profiles or taking their information down, third-party entities were still able to collect this data, even though the users had withdrawn their consent on the platform.

BG: So then came the new order in 2019, in which the FTC said Facebook was found to be, quote, “deceiving users about their ability to control the privacy of their personal information.” Can you summarize the 2019 case?

OP: Sure. So going back to 2012: because Facebook violated specific rules, the FTC said that Facebook needed to change how it functions, to make clearer representations of what holds in privacy terms and what does not, to inform users, and to switch off all these back doors that gave data about users to third parties. And although Facebook started doing that, what happened, for example, is that although new apps were not able to get this data, if you had an older app, you were still able to collect information. And this is the window that was exploited by Cambridge Analytica: the company used an app that had been created in the past for a different purpose and started collecting data about users, data that the users had not consented to give to the company.

BG: And this wasn't like, oops, careless of me. This had to have been done with malice aforethought.

OP: Yeah. So definitely Cambridge Analytica did it because they found an opportunity there to collect all this data. I don't know if Facebook knew about the backdoor or not, but definitely they did not do their job right.

BG: And then sat on the information for two years before the story finally blew up in the media.

OP: And going back now to 2019, the FTC said, hey, Facebook did not conform to our order. There are still issues with data privacy and Facebook needs to conform to the older rules. Plus, there were some new issues that appeared. For example, Facebook needed to provide more transparency in how it uses face recognition technology on its platform. And the FTC implemented stronger accountability mechanisms for cases in which Facebook violates the rules, and so on.

BG: So once again, disappointing but unsurprising. And just as was the case with Cambridge Analytica, simply astonishing indifference to the abuse of its targeting algorithm. And this is whether permitting Trump-friendly or Boris Johnson-friendly foreign agents to spread toxic lies in a political campaign, or the Myanmar Buddhist military to incite pogroms with false accusations against the Muslim Rohingya minority. I've often described the company as amoral, except when it is immoral. Would you care to argue against that proposition?

OP: So definitely Facebook, like every company, looks after its self-interest. This is what they were doing in the past and what they keep doing now. Their model is to collect as much data as they can and find ways to sell it, to get the most profit out of it. That also means not disclosing a lot of things that are going on on the platform, because those could make them accountable and also impose restrictions on their business model.

BG: And in fact, in the Cambridge Analytica affair, there were a number of universities and the United States Senate trying to look into how it could have all taken place. Facebook vowed transparency, but instead actually tried to stymie some researchers by failing to make its API fully available and so on. How cooperative were they even when they were most in the crucible following Cambridge Analytica?

OP: Generally, I think that Facebook's transparency efforts belong more to the marketing part of the company than to an actual effort to be more open with scientists and policymakers and so on. So they always try to give minimal data, under rules that protect them 100 percent. And also the quality of the data and information they provide usually is not able to answer key questions about the nature of the platform and how it affects society, democracy and so on.

BG: All right. Let's talk about your work at the Center for Information Technology Policy at Princeton. According to your center's website, your research, quote “provides ideas, frameworks and practical solutions towards just, inclusive and participatory socioalgorithmic ecosystems through the application of data intensive algorithms and social theories.” So what, what do you do?

OP: So, for example, in the case of Facebook and online platforms in general, we try to understand how the tools and the algorithms they deploy are used politically, to place political ads and to influence public opinion. As part of it, we look at Facebook, Google and YouTube, which belongs to Google, for example — or other platforms like TikTok, which are used a lot for political communication — and we ask who has access to the platforms' tools, how those tools function and what effects they might have on society. Like, who sees an ad and why, who doesn't see an ad, is there a potential for discrimination, are there other issues that may come as a side effect of seeing specific ads, and other further research questions.

BG: Now, I want to go back very briefly to the NYU people. Facebook claimed they had offered those researchers an alternative method with its very own FORT researcher platform, which in the name of science and transparency and societal good, it beneficently makes available to scholars. In fact, FORT stands for Facebook Open Research and Transparency. But you read that Mike Clark blog post about NYU and you were like, yeah, right, because you and your team tried to take the FORT and found it heavily defended.

OP: Exactly, and they said first that they have a political ads library that is open to the public, and that they also provide the FORT data set, where researchers can get access. And to start with the minor thing, the political ads library is too general and does not actually provide information about who placed an ad and to whom. You can more or less see only some general statistics about an ad, like the general demographics and location of who saw it, as well as the content of the ad.

BG: It seems to me as if someone was being investigated for murder and the person of interest says to the cops, here is the evidence you may choose from. I will provide this. You can use this and only this for making your case.

OP: Exactly, that's the one thing. And they also claim that they have the FORT data set. And it's interesting, because back in February, the group I am in tried to get access to that data set, and they provided us with a contract which we had to sign in order to get the data set, without telling us what the data set includes. And this agreement that Facebook gave us said that Facebook actually possesses the power to decide whether our research can get published or not. So we could do some research, they could review it, and then they would say, OK, this is publishable or this is not, otherwise you need to remove this or that, and so on. Which we found really problematic. Research needs to be free, otherwise it becomes censored. And we asked them, first, OK, can you tell us more? We cannot sign a contract without knowing what data we are getting, of course. And second, are we going to have the freedom to answer our research questions? And Facebook's first answer was, we are not able to negotiate the terms we are proposing because this is mandated by the FTC and the Cambridge Analytica scandal. Which of course did not hold up. The FTC decrees don't say anything about how researchers can access Facebook data.

BG: When Facebook played the FTC card last week, you were like, oh, I've seen this movie before. They're invoking government regulation that, in fact, doesn't regulate the thing that they're trying to hide.

OP: Exactly. And because we saw how they treated the NYU researchers, and we were frustrated that they used again the FTC argument, we said, OK, we need to speak up and talk about our own experience because this cannot go on.

BG: So just to reiterate, it's a mystery package that you don't get to unwrap until you've signed an onerous contract, which specifies, among other things, that if Facebook doesn't like what you want to publish based on your access to FORT, then it just censors you. I want to return to the letter that the FTC official wrote to Mark Zuckerberg after the NYU controversy erupted last week. He addressed the subject of Facebook's trustworthiness to keep its word, not only over the long haul, but in any time period whatsoever. He observes, quote, “Only last week, Facebook's general counsel, Jennifer Newstead, committed the company to ‘timely, transparent communication to Bureau of Consumer Protection staff about significant developments.’ Yet the FTC received no notice that Facebook would be publicly invoking our consent decree to justify terminating academic research earlier this week. Had you honored your commitment to contact us in advance, we would have pointed out to you that the consent decree does not bar Facebook from creating exceptions for good-faith research in the public interest.” They broke their promise and they did it absolutely immediately. How is anybody, in academia or elsewhere, supposed to deal with a company that appears to be out of control?

OP: I think the answer is not company-specific but more general: there need to be regulations that define what data online platforms and tech companies should provide to researchers, as well as how, because it's not only about the data that Facebook holds, it's also the data that Google holds and all the other platforms. And although the focus is usually on Facebook, the other platforms also have a very high degree of opacity. So I do believe that policymakers and politicians need to step up and say, we need to bring regulation that forces Facebook and the other platforms to change how they function, to change what they disclose and what they do not.

BG: All right, so there was a 2012 consent decree in which Facebook promised to make corrections to how it does business. It violated that consent decree, leading to the 2019 update, which expanded the government regulation and also fined them five billion dollars. Now, I know you're a data scientist and an engineer, but I'm going to ask you now to be a lawyer, too, because in the 2019 decree, the FTC said, quote, “It is ordered that respondent (that's Facebook) in connection with any product or service, shall not misrepresent in any manner, expressly or by implication, the extent to which the respondent maintains the privacy or security of covered information” — including, and this skips a few lines, “the extent to which the respondent makes or has made covered information accessible to third parties.” Now, I'm not a lawyer either, but it seems to me that what happened last week with NYU is explicitly a violation of that clause. They misrepresented the way they treat covered information, data that is, under the pretext of privacy or security. Is there going to be a 2021 update to the 2019 update to the 2012 order?

OP: I'm not a lawyer, but Facebook tries to exploit ambiguity in ways that conform to its interests. For example, that means saying we are protecting users' privacy in order not to allow the NYU researchers to understand how its tools are used.

BG: All right. You say ambiguity. This looks pretty expressly stated to me, but I guess this isn't your table. I will ask you what this all means. What are the implications of this dust-up involving a, you know, relatively small research project? What are the implications for the rest of us?

OP: It is an issue for the academic community, because we as academics struggle to understand technological tools and how they affect society with very little help in general. And really, this tool has been invaluable for a lot of researchers and was a useful resource for understanding Facebook ads. But generally it also shows how much power we have as academics, and we need to make calls to policymakers to change things, because the research and the knowledge we can extract will be useful for them and the rest of society.

BG: And concerning your work at Princeton, I know you haven't published yet, but I wonder if there's a sneak preview that you can offer of, if not your absolute findings, some interesting tidbits along the way.

OP: First, we find limitations, strong limitations, in what the data they provide can actually say. We find unexplainable moderation practices, like why ads were removed or not removed, even though they define specific guidelines about what ads should look like. We also find that a lot of ads are related to protected groups, and there are open questions about how these protected groups were targeted with political statements. But it's not only about our research. We are able to access only the data that Facebook gives through its political ads library, so there are thousands or even millions of ads that are placed and researchers cannot get access to them at all. And that's why NYU's project was such a great resource, because there was no other way to get information about these advertisements. I find it personally troubling that there is so much opacity about online ads, while for other ads, like on TV or on radio, you get so much information. And I know there are legal and historical reasons why they are treated differently, but they should not be.

BG: I want to ask you one final thing, Orestis. Like the wildfires that right now are ravaging Greece and California and elsewhere around the world, authoritarianism is raging. Disinformation has become not just an art, but a science. Millions and millions of people are foolishly swallowing lies and disinformation fed them by cynical politicians. The world is literally in flames. Why do companies like Facebook not rush to provide whatever data they can in support of better academic understanding of what is happening on our screens and in our psyches?

OP: I think they follow the idea that the less we provide, the safer we are. I do believe that if we had access to data, we could find positive effects of social media as well. So I don't believe that everything is bad, it's not black and white. But I think they believe that the less they give, the more protected they are, because they are afraid that if a very strong regulation is passed, they will lose the ability to keep having the same business model they have until today, with the same profits.

BG: Orestis, thank you so much for joining me.

OP: Thanks for having me, Bob.

BG: Orestis Papakyriakopoulos is a postdoctoral research associate at Princeton University's Center for Information Technology Policy. Papakyriakopoulos was perhaps admirably circumspect in casting doubt more on capitalist self-interest than on Facebook per se. But whenever these blowups occur, I think back to the first scene of the 2010 movie The Social Network, in which Zuckerberg, played by actor Jesse Eisenberg, is getting dumped by his girlfriend.

GIRLFRIEND: You are probably going to be a very successful computer person. You're gonna go through life thinking that girls don't like you because you're a nerd. And I want you to know from the bottom of my heart that that won’t be true. It’ll be because you’re an asshole.

BG: OK, we're done here. Before I sign off, though, I must remind you, I must implore you to comment, rate, share what you've heard here today. And not just Bully Pulpit, but the other Booksmart Studios shows like John McWhorter's Lexicon Valley and Amna Khalid’s Banished, both of which programs are like, whoa — tell friends, neighbors, family members, stop strangers on the street. The success of Booksmart, the impact of our work depends as much on you as on us. So please spread the word.

Also, if you become a paying subscriber to Booksmart Studios, you will get extended interviews, additional content of other kinds, access to the hosts and in my case, continued access to my weekly column, which is, for the moment, free to sample. Now then, Bully Pulpit is produced by Mike Vuolo and Matthew Schwartz. Our theme was composed by Julie Miller and the team at Harvest Creative Services in Lansing, Michigan. Chris Mandra and N’Dinga Gaba are our audio engineers. Bully Pulpit is a production of Booksmart Studios. I'm Bob Garfield.
