
Full-Disclosure Future

From the document THE BLACK BOX SOCIETY (pages 65-68)

Even if absolute secrecy could somehow be democratized with a universally available cheap encryption tool, would we really want it?

I don’t think I want the NSA blinded to real terrorist plots. If someone developed a fleet of poison-dart drones, I’d want the authorities to know. I wouldn’t want so-called “cryptocurrencies” hiding ever more money from the tax authorities and further undermining public finances.214 Biosurveillance helps public health authorities spot emerging epidemics. Monitoring helps us understand the flow of traffic, energy, food, and medicines.215

So while hiding—the temptingly symmetrical solution to surveillance—may be alluring on the surface, it’s not a good bet. The ability to hide—and to detect the hiders—is so comprehensively commodified that only the rich and connected can win that game.

The help and the harm of information collection lies not in the information itself but in how it is used. The decisions we make about that have plenty to tell us about our priorities.

The digital economy of the moment prioritizes marketing over productivity. It’s less likely to reward the builder of a better mousetrap than to fund start-ups that identify people likely to buy one.

The critical point is no longer the trap or even the rodents, but the data: the constant streams of numbers that feed algorithmic systems of prediction and control. Profiling is big business in an economy like that. Cyberlibertarians used to brag that the Internet “reads censorship as damage and routes around it”; replace “censorship” with “privacy” and the statement would be just about as true.216

Much of the writing about the scored world focuses on how to outwit the evaluators—how to get an 800 credit score, how to “ace” job personality tests. But this vast and growing literature ignores the possibility of criticism, much less resistance. Economic models of the data can be even worse, complacently characterizing personalization as a mere matching problem (of, say, the riskiest borrowers to the highest interest rate loans). From a legal perspective, things can look very different: myriad penalties are imposed without even a semblance of due process.

If we’re not going to be able to stop the flow of data, therefore, we need to become more knowledgeable about the entities behind it and learn to control their use of it. We need to hold business and government to the same standard of openness that they impose upon us—and complement their scrutiny with new forms of accountability. We need to enforce the laws that define fair and unfair uses of information. We need to equalize the surveillance that is now being aimed disproportionately at the vulnerable and ensure as best we can that critical decisions are made in fair and nondiscriminatory ways. We need to interrupt the relentless cascades of judgment that can turn one or two mistakes into a self-fulfilling prophecy of recurrent failure. And we need to plan for the inevitability that as soon as we open one black box, new modes of opacity will arise.

Thomas Jefferson once said that “he who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me.”217 To many of us this is an inspiring vision. But the total information dominance to which America’s defense, police, and corporate institutions now aspire reflects a diametrically opposed mind-set. The black box society is animated by the belief that information is useful only to the extent that it is exclusive—that is, secret. Terrorists have to be kept in the dark because they’re dangerous. Sick people have to be kept in the dark because they’re expensive. To faceless algorithms, we might be terrorists, or sick.

So we are kept in the dark, too.

It is time to reclaim our right to the presumption of innocence, and to the security of the light. It may be that we cannot stop the collection of information, but we can regulate how it is used. This is easier said than done; data collection has run so wild that it will take time and effort to purify reputation systems of inaccurate or unfair data points. But the alternative is worse. One of the best-known privacy blogs is entitled “Pogo Was Right,” in honor of the old comic book tag “We have met the enemy, and he is us.” The rebuke is obvious: we’d better stop being so careless about how technology creates reputations, and start to rein in arbitrary, discriminatory, and unfair algorithms. Chapter 5 suggests some initiatives for achieving that end. But to fully understand how they might work, and how needed they are, we need to turn from technologies of reputation (which increasingly mediate how we are perceived), to technologies of search (which mediate how we perceive). Search is the topic of the next chapter.

3

THE HIDDEN LOGICS
