Politician’s Incentives Regarding Facebook

God I hope not, but it sounds plausible.

The Peltzman Model of Regulation and the Facebook Hearings – Marginal REVOLUTION

If you want to understand the Facebook hearings it's useful to think not about privacy or technology but about what politicians want. In the Peltzman model of regulation, politicians use regulation to trade off profits (wanted by firms) and lower prices (wanted by constituents) to maximize what politicians want: reelection.

Privacy Regulation Is Likely Unworkably Hard

Don't Count On The Government Regulating Facebook

Tyler Cowen provides a great analysis of one of the generic calls for regulating big data (and Facebook in particular). Put it together with his previous post pointing out that it would cost each of us ~$80/year to use Facebook on a paid basis1, and the two make a compelling case that there is no appetite in the US for serious laws protecting data privacy and that whatever laws we do get will probably do more harm than good.

To expand on Cowen’s point a little, let’s seriously consider for a moment a world where the law granted individuals broad rights to control how their information is kept and used. That would be a world where it would suddenly be very hard to conduct a little poll on your blog. Scott Alexander came up with some interesting hypotheses regarding brain functioning and transgender individuals by asking his readers to fill out a survey. But doing that survey meant collecting personal and medical information about his readers (their gender identification, age, other mental health diagnoses) and storing it for analysis. He certainly wouldn’t have bothered to do any such thing if he were required to document regulatory compliance, include a mechanism for individuals to request that their data be removed, or navigate complex consent and disclosure rules (now you have to store emails and passwords, making things worse, and risk liability if you become unable to delete info). And what about the concerned parent who fears children in her town are getting sick too frequently? Will it now be so difficult for her to post a survey that we won’t discover the presence of environmental carcinogens?

One is tempted to respond that these cases are obviously different. These aren’t people using big data to track individuals but people choosing to share non-personally identifiable data on a survey. But how can we put that into a law and make it so obvious that bloggers don’t feel any need to consult attorneys before running a survey?

One might try to hang one’s hat on the fact that the surveys I described don’t record your email address or name2. However, if you don’t want repeated voting to be totally trivial, that means recording an IP address. Ask enough questions and you’ll end up deanonymizing everyone, and there is always a risk (oops, turns out there is only one 45-year-old Broglida). On the other hand, if it’s OK as long as you don’t deliberately request real-world identifying information, the regulation is toothless: Google doesn’t really care what your name is; they just want your age, politics, click history, etc.
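To make the deanonymization worry concrete, here is a minimal sketch (all field names, records, and the quasi-identifier choice are hypothetical) of how a poll that stores only a hashed IP for duplicate-vote detection plus a handful of demographic answers can still single a respondent out:

```python
# Hypothetical sketch: a poll stores a hashed IP to block repeat votes plus a
# few "harmless" demographic answers; unique answer combinations still
# re-identify people even though no name or email is collected.
import hashlib
from collections import Counter

def dedup_key(ip: str) -> str:
    """Hash the voter's IP so repeat submissions can be rejected."""
    return hashlib.sha256(ip.encode()).hexdigest()

responses = [
    {"ip_hash": dedup_key("203.0.113.7"), "age": 45, "state": "FL", "diagnosis": "none"},
    {"ip_hash": dedup_key("198.51.100.2"), "age": 29, "state": "FL", "diagnosis": "none"},
    {"ip_hash": dedup_key("192.0.2.55"), "age": 29, "state": "FL", "diagnosis": "asthma"},
]

# Count how many respondents share each combination of quasi-identifiers.
combos = Counter((r["age"], r["state"]) for r in responses)
for r in responses:
    if combos[(r["age"], r["state"])] == 1:
        # A unique combination is effectively an identity, name or no name.
        print("re-identifiable respondent:", r)
```

The point is only that "we didn’t ask for names" is not a bright legal line; any survey with enough questions crosses it by accident.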

Well, maybe it should only be about passively collected data. That’s already damn hard to define (why is a click on an AJAX link in a form different from a click on a link to a story?) and risks making normal HTTP server logs illegal. Besides, it’s a huge benefit to consumers that startups are able to see which design or UI visitors prefer. So checking whether users find a new theme or new video controls preferable (say, by serving it to 50% of them and seeing if they spend more time on the site, as in the sketch below) shouldn’t require looping in corporate counsel, or we make innovation and improvement hugely expensive. Moreover, users with special needs and other niche interests are likely to suffer in particular if there is no low-cost, hassle-free way of trying out alternate page versions and evaluating user response.
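For a sense of how ordinary this kind of test is, here is a minimal sketch of the A/B split described above (the visitor IDs, log entries, and function names are hypothetical): serve the new theme to roughly half of visitors and compare time on site.

```python
# Hypothetical sketch of a simple A/B test: assign each visitor to the old or
# new theme deterministically, then compare average time on site per variant.
import hashlib
from statistics import mean

def variant_for(visitor_id: str) -> str:
    """Deterministically bucket a visitor into 'old' or 'new' (~50/50)."""
    bucket = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16) % 2
    return "new" if bucket else "old"

# Hypothetical log of (visitor_id, seconds_on_site) observations.
log = [("u1", 310), ("u2", 95), ("u3", 240), ("u4", 180), ("u5", 400), ("u6", 60)]

by_variant = {"old": [], "new": []}
for visitor_id, seconds in log:
    by_variant[variant_for(visitor_id)].append(seconds)

for name, times in by_variant.items():
    if times:
        print(f"{name}: n={len(times)}, mean time on site = {mean(times):.0f}s")
```

Even this toy version quietly relies on per-visitor identifiers and behavioral logs, which is exactly the "passively collected data" a broad rule would sweep in.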

Ultimately, we don’t really want the world that we could get by regulating data ownership. It’s not the world in which Facebook doesn’t have scary power. It’s the world where companies like Facebook have more scary power, because they have the resources to hire legal counsel and lobby for regulatory changes to ensure their practices stay technically legal, while startups and potential competitors don’t have those advantages. Not only do we not want the world we would get by passing data-ownership regulations, I don’t think most people even have a clear idea of why it would be a good thing. People just have a vague feeling of discomfort with companies like Facebook, not a clear conception of a particular harm to avoid, and that’s a disastrous situation for regulation.

Having said this, I do fear the power of companies like Facebook (and even governmental entities) to blackmail individuals based on the information they are able to uncover with big data. However, I believe the best response to this is more openness and, ideally, an open, standards-based social network that doesn’t leave everything in the hands of one company. Ultimately, that will mean less privacy and less protection for our data, but that’s why specifying the harm you fear really matters. If the problem is, as I fear, the unique leverage that being the sole possessor of this kind of data gives Facebook and/or governments, then the answer is to make sure they aren’t the sole possessor of anything.

Zeynep Tufekci’s Facebook solution – can it work? – Marginal REVOLUTION

Here is her NYT piece, I’ll go through her four main solutions, breaking up, paragraph by paragraph, what is one unified discussion: What would a genuine legislative remedy look like? First, personalized data collection would be allowed only through opt-in mechanisms that were clear, concise and transparent.


  1. Now, while a subscription-funded Facebook would surely be much, much cheaper, I think Cowen is completely correct when he points out that any fee-based system would hugely reduce the user base and therefore the value of using Facebook. Indeed, almost all of the benefit Facebook provides over any random blogging platform is simply that everyone is on it. Personally, I favor an open social graph, but this is even less protective of personal information. 
  2. Even that is pretty limiting. For instance, it prevents running any survey that wants to be able to do a follow-up or simply email people their individual results.