There are times when it’s difficult to pick a topic to write about, because data privacy and data partnerships are such broad subjects that it’s like selecting from an unlimited menu of issues. We could discuss GDPR or CCPA or a data breach or technological trends or creating a mutually beneficial data partnership — the possibilities are endless, really. Sometimes, though, it would be nice if there were just an obvious topic to discuss, something so clearly the topic of the day that everything else fades into the background and I immediately know what to discuss. If only…
Yes, Mr. Zuckerberg’s Washington Post op-ed has generated more than a little buzz over this weekend, astonishing some and angering others. He outlines Facebook’s commitment to doing things more transparently, to controlling improper content, and to establishing a safe space for speech — especially political speech — that fits in with established legal frameworks around the world.
Oh, sorry, that wasn’t this week’s Washington Post op-ed from Zuckerberg — that’s the one he wrote six months ago, promising that Facebook would do better and be better. Or wait, hold on, was that the one he wrote in the Wall Street Journal back in January saying the same thing? No, sorry, wasn’t that the blog post from early March when he announced those same things and said that Facebook was now in the privacy business? I’m not sure how I mixed these up, because this week’s op-ed is where he promises to do things more transparently, to control improper content, and to . . . establish . . . a safe space for speech . . . . Hmm.
So maybe Zuckerberg is the boy who cried privacy — four major announcements on privacy in six months with no discernible change in how Facebook operates. But this week’s op-ed was different, wasn’t it? He came out and said that he welcomes more regulation, that global privacy rules are essential, and that data portability is going to be key. Finally, you say, Facebook has seen that privacy is important and we are all going to be less surveilled!
No one writes an op-ed announcing the end of their business model; no one even writes an op-ed to announce a fundamental change to their business model, because a CEO who wants to do that is too busy fending off derivative lawsuits or activist shareholders or a potential successor who wants the lead role. Nothing of the sort is going on here; nothing of the sort has changed since last year’s hearings on the fallout from Cambridge Analytica. Zuckerberg is writing these op-eds because nothing has changed — if it had, he wouldn’t have to keep repeating himself.
He sets out four main areas where he wants the Internet to change — notice that it’s the Internet that has to change, and not Facebook — and lists his proposals for how to do it. On close examination, you’ll see that his argument is really a carefully constructed effort to guide any regulation in the direction that best suits his company. To be clear, he has every right, and possibly even a fiduciary duty, to do this. We just want to make clear what he’s actually conveying.
Issue One — Harmful Content. Zuckerberg identifies “harmful content” online as a source of distress, and one that requires direct attention. He notes that Facebook tries to keep it off the web, and that the company is working with regulators to improve controls. His primary suggestion is:
One idea is for third-party bodies to set standards governing the distribution of harmful content and to measure companies against those standards. Regulation could set baselines for what’s prohibited and require companies to build systems for keeping harmful content to a bare minimum.
To clarify, he’s suggesting that a regulatory or quasi-regulatory body set up rules for what kind of speech can and cannot be allowed on the Internet, and not just on Facebook. That means an entity will have authority to set decency standards online, which means it will be regulating speech, which means its activities will almost certainly be subject to the First Amendment, which means Congress has to establish a Speech Regulatory Authority, which means that this. is. never. going. to happen. Never.
To the extent governments authorize a body to set standards for how harmful content is removed from the Internet, we’re coming awfully close to the regime imagined under the EU Copyright Directive, which all but guarantees that filtering systems must be deployed to scrub offensive or harmful content. So Zuckerberg is saying “if you globalize Article 13 from the Copyright Directive, it will establish systems for minimizing harmful content.” He explicitly states in the op-ed that Facebook has created systems and teams that analyze content and remove offensive material — now he’s asking regulators to force everyone else to establish something similar or, more likely, hire Facebook’s teams or license their software. And just like that, Facebook has created a new revenue stream.
Issue Two — Campaigning and Political Rules. He goes on to note that the risks to the political process are such that rules about political speech need to be updated.
Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we’ve seen more attempted interference. Some laws only apply during elections, although information campaigns are nonstop. And there are also important questions about how political campaigns use data and targeting.
In other words, those filtering systems and teams that Facebook is now licensing or hiring out will be busy, because we’re not just looking to police political speech during campaigns or content about specific candidates. Instead, we’re going to be overseeing content related to “divisive political issues,” which includes every conceivable subject. How will we know if the subject matter is harmful content or protected speech? Why, “we’ll know it when we see it!” Or, more likely, we won’t. Issue Two, then, is simply a universal expansion of the theme in Issue One.
Issue Three — Global Privacy Regulation. Here, Zuckerberg makes the case for a harmonized, global approach to regulation, saying that he would welcome a new law like GDPR on a global scale. Never mind that during the debates leading up to the passage of CCPA — which is substantially less burdensome than GDPR — Facebook was a vociferous, and consistent, critic. And let’s toss aside the troubling allegations of how Facebook attempted to leverage leading government figures to make GDPR more friendly.
Instead, let’s focus on two salient points. First, Zuckerberg bluntly calls for “a common global framework — rather than regulation that varies significantly by country and state — [to] ensure that the Internet does not get fractured, entrepreneurs can build products that serve everyone, and everyone gets the same protections.” In other words: preemption. Large companies with sizeable compliance departments are far better suited to handle a single, overarching federal regulation than multiple regimes, and vice versa for small businesses. It’s the same reason that smaller businesses (the ones that were not responsible for the accounting scandals of 2001–2002) paid the heaviest price under Sarbanes-Oxley. And because a federal law is less likely to contain aggressive enforcement mechanisms (like private causes of action), whatever Congress makes is less likely to be scary than what Austin, Tallahassee, and Sacramento make.
The second point is that, even as he asks for a GDPR-style law, Zuckerberg sets parameters that are so broad that they all but preclude any law in the near future:
As lawmakers adopt new privacy regulations, I hope they can help answer some of the questions GDPR leaves open. We need clear rules on when information can be used to serve the public interest and how it should apply to new technologies such as artificial intelligence.
To reiterate, he’s asking Congress to identify the nature and scope of public-interest uses for personal data, taking into consideration the effect of emerging technologies like AI and machine learning. He’s asking this of the same group that had a difficult time forming a coherent question when he testified there last year.
Issue Four — Data portability. The GDPR mandates data portability as a right for data subjects, which means that an individual has the right to have her data transferred from one controller to another. What that means as a practical matter is far harder to grasp, because 1) very few people are trying it, and 2) the technical capability for doing so is limited. Zuckerberg notes that a shared language or data sharing framework would make this process much simpler, and we certainly agree.
But then, in the middle of a paragraph, there’s a shift:
True data portability should look more like the way people use our platform to sign into an app than the existing ways you can download an archive of your information.
Wait, what? Data portability, as presently understood, means the right of a user to take their data and move it to another platform if they choose. Zuckerberg, here, says that portability should be about the fluidity of moving data from one platform to another — that is, a seamless transfer of data across systems. Notice who is missing from this new framework? You no longer take your data and bring it to another platform: Facebook sends it over for you. Facebook becomes the medium by which your data is ported — hence, Facebook DataPorter©. It’ll be free, of course; you just have to let them see the data they’re porting (and maybe copy it, but just for archiving purposes, you guys!)
The entire purpose of data portability is to give data subjects control over the movement of their data — transparency. What happens when your active role disappears from this process? Opacity. Facebook is offering you the same deal it always offers, in a new package: your data is transparent to Facebook, Facebook’s process is a black box to you. It’s portable and convenient, but you are removed from the active role in controlling its secondary or tertiary uses.
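For readers who think in code, the structural difference between the two portability models can be sketched in a few lines. This is a toy illustration only: every class and function name below is invented, and nothing here corresponds to any real Facebook or GDPR-mandated API — the point is simply who holds the data in transit.

```python
# Hypothetical sketch contrasting user-mediated portability (GDPR-style)
# with platform-mediated portability (the op-ed's sign-in-style model).
# All names are invented for illustration.

class Platform:
    def __init__(self, name):
        self.name = name
        self.store = {}  # user -> their data on this platform

    def export_archive(self, user):
        # User-initiated download: the user receives their own data.
        return dict(self.store.get(user, {}))

    def import_archive(self, user, archive):
        # User-initiated upload to the destination of their choice.
        self.store.setdefault(user, {}).update(archive)

    def transfer(self, destination, user):
        # Platform-to-platform transfer: the data moves directly,
        # never passing through the user's hands.
        destination.store.setdefault(user, {}).update(
            self.store.get(user, {}))


def user_mediated_port(source, destination, user):
    """GDPR-style portability: the user sees and holds every byte."""
    archive = source.export_archive(user)
    destination.import_archive(user, archive)
    return archive  # the user keeps a copy and knows what moved


def platform_mediated_port(source, destination, user):
    """Sign-in-style portability: the user merely authorizes."""
    source.transfer(destination, user)
    return None  # the user holds nothing and sees nothing
```

The difference shows up in the return values: in the first function, the archive passes through the user, who can inspect, retain, or refuse it; in the second, the user’s only artifact is the act of consent, and the contents of the transfer are opaque to them — which is exactly the shift in control described above.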
It isn’t cynical to take a hard look at what the op-ed means, because there is a non-zero chance that it will be the party line Facebook adopts when it lobbies Congress for privacy legislation. In fact, understanding the ramifications of the kind of law Facebook wants (and the resulting market position it would give Facebook) is essential for businesses large and small. It’s fine to roll your eyes or laugh at a snarky meme, but this isn’t just an April 1st issue — if you aren’t paying attention to how the biggest movers in the market are trying to shape the laws that affect you and your business, then regardless of when those laws come to be, the joke will be on you.
Originally published at wardpllc.com on April 1, 2019.