20 December 2011

Anonymity

The 43-page 'Lessons Learned Too Well', a paper by Michael Froomkin for the Oxford Internet Institute’s September 2011 conference A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society, "examines, contextualizes, and critiques an international trend towards the regulation of anonymity".

The paper -
describes private incentives and initiatives during the past decade that resulted in the deployment of a variety of technologies and services each of which is unfriendly to anonymous communication. It then looks at three types of government regulation relevant to anonymity: the general phenomenon of chokepoint regulation, and the more specific phenomena of online identification requirements and data retention (which can be understood as a special form of identification).

The concluding section takes a pessimistic view of the likelihood that, given the rapid pace of technical and regulatory changes, the fate of online anonymity in the next decade will be determined by human rights law rather than by the deployment of new technologies or, most likely, pragmatic political choices. It therefore offers normative and pragmatic arguments why anonymity is worth preserving and concludes with questions that proponents of further limits on anonymous online speech should be expected to answer.

The consequences of an anonymity ban are likely to be negative. This paper attempts to explain how we came to this pass, and what should be done to avoid making the problem worse.
Froomkin comments that -
There are those who say that in order to be safe we will have to create an infrastructure of mandatory identification. Some, including many of those charged with making decisions for the public’s safety, clearly say it in the best of faith. Others argue, sometimes despite the evidence, that we in the US must do so to protect the profits of an industry important to our trade balance. It is all very well for academics, often living in genteel surroundings, to ask that we not give in to fear, and to reply that before we create a regime that may be persistent and eventually ineradicable we should first ensure that there are no less restrictive means, and that we should consider all the externalities. But that is our job.

Here, then, are a few suggestions for avoiding what could otherwise be an outcome we likely will regret, also based on lessons learned from the past twenty years or so. Several of these concepts are already present in European data protection law, but none of them are legal requirements in the US today.
• Demand evidence of the need for mandatory identification and data retention rules, and insist the rules be proportional to the need.
• Avoid rules that lock technology into law.
• Always consider what an identification rule proposed for one purpose can do in the hands of despots.
• Empower user self-regulation whenever possible rather than chokepoint regulation.
• Design filters and annotators before designing walls and takedown mechanisms.
• Require transparency. Make it an offense for devices to make records without clear, knowing, and meaningful consent on the part of the speaker, reader, listener, or viewer.
• Build alternatives in technology and law that allow people to control how much their counterparts know about them, and which by making selective release of information easier reduce the need for a binary choice between anonymity or data nudity.
• Require that privacy-enhancement be built in at the design level.
Those who disagree with these suggestions worry, with some reason, about new technology undermining the powers of states and sovereigns. Why is allowing people to speak freely to each other, without fear of eavesdroppers or retaliation, such a terrible thing? After all, most core government powers, like the power to tax, will not in fact be undermined in any substantial way by unfettered communication so long as we still need to eat and we want physical things such as houses. The issues are the same ‘four horsemen’ they have been for many years: fear of terrorism, money-laundering, child pornographers and drug-dealers, to which one might add, in some countries, revolutionaries.

The flip side of these fears is the recognition that even if the power to speak freely and privately is sometimes misused, it is also empowering. Communicative freedom allows people to share ideas, to form groups, and to engage not just in self-realization, but in small scale and even mass political organization. Here then is the most important lesson to be learned, but one that needs to be learned over and over again:
Protections for anonymous speech are vital to democratic discourse. Allowing dissenters to shield their identities frees them to express critical, minority views . . . Anonymity is a shield from the tyranny of the majority.
The Internet and related communications technologies have shown a great potential to empower end-users, but also to empower firms and especially governments at their expense. Governments (and firms) around the world have learned this lesson all too well, and are taking careful, thorough, and often coordinated steps to ensure that they will be among the winners when the bits settle.

The thing to watch out for, therefore, is whether we, and especially those individuals already burdened with repressive regimes, will be among the winners also.