[This post is the third in a series (1, 2).]
Like Marshall Kirkpatrick, I want it all.
I want my data to be free, I want to be in control of it and I want to have control over my privacy as well. Is that too much to ask? The watchdog group Privacy International released their annual report today about privacy around the world and put the US in the lowest category - "endemic surveillance societies." Can we figure out how we can minimize surveillance while balancing privacy and the incredible opportunities that come from making at least some of our data open?
In the background of Marshall's overview of contemporary privacy issues are discussions of our "post-privacy era." Chris Messina, who has been involved in developing standards and technologies for handling personal data on the internet, writes:
My somewhat pessimistic view is that privacy is an illusion, and that more and more historic vestiges of so-called privacy are slipping through our fingers with the advent of increasingly ubiquitous and promiscuous technologies, the results of which are not all necessarily bad (take a look at just how captivating the Facebook Newsfeed is!)
Still ... there needs to be a robust dialogue about what it means to live in a post-privacy era, and what demands we must place on those companies, governments and institutions that store data about us, about the habits to which we’re prone and about the friends we keep...
I think there needs to be a broader, eyes-wide-open look at who has what data about whom and what they’re doing about [it] — and perhaps more importantly — how the people about whom the data is being collected can get in on the game and get access to this data in the same way you’re guaranteed access and the ability to dispute your credit report. The same thing should be true for web services, the government and anyone else who’s been monitoring you, even if you’ve been sharing that information with them willingly.
The history of the US government's surveillance of its own citizens says to me that privacy has actually always been an illusion. Old FBI files show the government maintaining decades worth of minutia on people's affiliations and associations. For example, in close to 1000 pages of FBI documents that I have on the Greater NY Council for a Sane Nuclear Policy in the early 1960s (when my father was the Executive Director), for practically every person mentioned there are lists of political meetings they were known to have attended and organizations they had been members of, often dating back to the 1940s.
There's been an exponential increase in the scale of personal data: the quantities available, the number of people from whom it can be collected, the breadth of access (i.e., public) and the capacity to process and sort what is collected. But the problem, 50 years ago as today, is that, as Loren Feldman has aptly put it,
It's about people that you give the data to being honest and trustworthy with it ... It's not about you, it's about them.
The recent breakthrough (about which Loren is skeptical) is that
The DataPortability Workgroup announced ... that representatives from both Google and Facebook are joining its ranks. The group is working on a variety of projects to foster an era of Data Portability - where users can take their data from the websites they use to reuse elsewhere and where vendors can leverage safe cross-site data exchange for a whole new level of innovation. Good bye customer lock-in, hello to new privacy challenges. If things go right, today could be a very important day in the history of the internet.
The non-participation of Google and Facebook, two companies that hold more user data and do more with it than almost any other consumer service on the market, was the biggest stumbling block to the viability of the project. These are two of the most important companies in recent history - what's being decided now is whether they will be walled-garden data-hoarders or truly open platforms tied into a larger ecosystem of innovation with respect for user rights and sensible policies about data.
Yeah I'm worried about Facebook and Google, but I'm also worried about the US government's data silos at DHS fusion centers and the free pass Congress gave to companies handing data over to the NSA.
Having full control over one's own data will be a great improvement to online life and is a step toward our being able to hold companies accountable for how they use it. Sadly, the Data Portability Workgroup will have little overall impact as long as current domestic surveillance practices continue unchecked and other non-participating companies like Unisys hold private contracts with the government to handle vast quantities of our personal data. The hundreds of companies with Department of Homeland Security and Department of Defense contracts to handle our personal data have no regard for the consumer market, only for their government client.
And then there's the Total Information Awareness (TIA) program.
From Wired's Threat Level blog:
Those who follow the government's post-9/11 attempts to ferret out terrorism plots from massive amounts of data using algorithms have long known that the Total Information Awareness program never really died, despite Congress's funding cut.
Instead, parts of the program moved around and disappeared in the Pentagon and Intelligence communities' black budgets, which are classified.
But, Shane Harris of the National Journal brings news that the Director of National Intelligence is looking to bring TIA back to life in whole, this time with the name Tangram.
The system, which is run by the Office of the Director of National Intelligence, is in the early research phases and is being tested, in part, with government intelligence that may contain information on U.S. citizens and other people inside the country. It encompasses existing profiling and detection systems, including those that create "suspicion scores" for suspected terrorists by analyzing very large databases of government intelligence, as well as records of individuals' private communications, financial transactions, and other everyday activities.
The details of the program, called Tangram, are contained in an unclassified document that National Journal obtained from a government contracting Web site. The document, called a "proposer's information packet," is a technical description of Tangram written for potential contractors who would help design and test the system. [...]
In addition to descriptions of Tangram, the document offers a rare and surprisingly candid analysis of intelligence agencies' fits and starts -- and failures -- in other efforts to profile terrorists through data mining: Researchers, for example, haven't moved beyond "guilt-by-association models" that link suspected terrorists to other, potentially innocent people, and then rank the suspects by level of suspicion.
When this story broke back in 2006, the National Journal noted that the document it obtained from a government contracting site acknowledged some of the limitations of guilt by association:
"In the cases where we have knowledge of a seed entity [a known person] in an unknown group, we have been very successful at detecting the entire group. However, in the absence of a known seed entity, how do we score a person if nothing is known about their associates? In such an instance, guilt-by-association fails."
Updates made to the proposer's information packet in 2007 show that Tangram, rather than working to overcome the limitations of guilt-by-association models, has decided merely to adapt to them.
Several fundamental challenges remain before the technology can be deployed broadly within the Intelligence Community. The four key challenges that define the essence of the Tangram program are:

- Reduce system and data configuration time of all automated entity and threat discovery processes by two orders of magnitude (100x).
- Reduce threat entity and event discovery time by two orders of magnitude (100x).
- Increase overall efficiency by three orders of magnitude and overall productivity by two orders of magnitude over current processes while delivering a consistently high intelligence value as determined by experienced analysts.
- Improve the detection of low observable threats and events where guilt by association assumptions may not apply.
In other words, guilt by association assumptions are now a given, but the system needs to be improved to handle situations where they can't be established.
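To make concrete what a "guilt-by-association" model with "suspicion scores" might look like, here is a deliberately simplified sketch: suspicion propagates outward from known "seed" entities through an association graph, decaying with each hop. Everything here — the function name, the decay scheme, the hop limit — is my own hypothetical illustration of the general technique described in the proposer's packet, not a description of any actual government system.

```python
from collections import defaultdict

def suspicion_scores(edges, seeds, decay=0.5, hops=2):
    """Propagate suspicion outward from known 'seed' entities.

    edges: list of (a, b) association pairs (undirected)
    seeds: entities assigned a known score of 1.0
    Each hop away from a seed multiplies the score by `decay`.
    """
    # Build an undirected association graph.
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)

    scores = {s: 1.0 for s in seeds}
    frontier = set(seeds)
    for hop in range(1, hops + 1):
        nxt = set()
        for person in frontier:
            for neighbor in graph[person]:
                candidate = decay ** hop
                # Keep the highest score each entity has earned so far.
                if scores.get(neighbor, 0.0) < candidate:
                    scores[neighbor] = candidate
                    nxt.add(neighbor)
        frontier = nxt
    return scores
```

Note that with an empty seed list the function returns no scores at all, which is exactly the failure mode the packet concedes: "in the absence of a known seed entity ... guilt-by-association fails." And any neighbor of a seed — however innocent — picks up a nonzero score purely by association.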
The McCarthy era of the 1940s and 1950s, in which thousands of Americans were tarred with guilt by association, simply extended to citizens a campaign that had used the same techniques against alien radicals in the first Red Scare thirty years earlier. The earlier Red Scare, which culminated in the arrests of thousands of aliens for their political associations during the Palmer Raids, was coordinated by a young J. Edgar Hoover, then in the Alien Radical division of the Justice Department. Hoover applied what he had learned in the first Red Scare to U.S. citizens during the second, which targeted thousands of them.
The same pattern underlies the internment of U.S. citizens of Japanese descent during World War II. Since 1798, the Enemy Aliens Act has authorized the president during wartime to arrest, detain, deport, or otherwise restrict the freedom of anyone over fourteen years old who is a citizen of the country with which we are at war, without regard to any actual evidence of disloyalty, sabotage, or danger. The justification for that law, which the Supreme Court has upheld as constitutional, is that during wartime one can presume that citizens of the enemy country are loyal to their own country, not ours, and that there is insufficient time to identify those who are actually disloyal.
In World War II we simply extended that argument to U.S. citizens through the prism of race. The Army argued that persons of Japanese descent, even if they were technically American citizens because they were born here, remained for all practical purposes “enemy aliens,” presumptively likely to be loyal to Japan.... And so we locked up 110,000 persons solely because of their Japanese ancestry, 70,000 of them U.S. citizens. (David Cole)
TIA/Tangram "suspicion scores" sound eerily like a fully automated version of the Security Index of old. What was the Security Index? That was the 1950s and 60s version of the Custodial Detention Program (CDP), begun by J. Edgar Hoover in the 1940s
to enable the government to make individual decisions as to the dangerousness of enemy aliens and citizens who might be arrested in the event of war.
( Book III of the Final Report of the US Senate Select Committee to Study Governmental Operations With Respect To Intelligence Activities, 1976)
If one qualified for the Security Index, one’s name was placed on a special Security Index card. If the FBI found that a subject did not qualify for the Security Index and his or her card should be canceled,
[t]he cancelled Security Index cards on individuals taken off the Index after 1955 were retained in the field offices. This was done because they remained “potential threats and in case of an all-out emergency, their identities should be readily accessible to permit restudy of their cases.” These cards would be destroyed only if the subject agreed to become an FBI source or informant or “otherwise indicates complete defection from subversive groups.” (Ibid.)
I'd like to know what recourse one has to appeal, revoke or change a "suspicion score" once one has one. It almost seems preferable to return to the Congressional autos-da-fé of the 50s and 60s, where one at least had live accusers, rather than computers, to refute. Ironically,
The TIA program devoted more than $4 million to research aimed at ways to protect privacy while it was sifting databases, and former officials have said that although it was admittedly controversial, TIA was being designed all along with privacy protection and auditable logs to track those who used it. The privacy research, however, was abandoned when the program moved into the classified budget in the NSA.
It seems to me that the world that Marshall imagines, where my data is free and I am in control of it and my privacy, requires steps beyond the adoption of data portability standards by some corners of industry. There must be government regulation across industries, much stronger Congressional oversight, and a rollback of domestic intelligence activity. Data portability advocates, EFF, ACLU, other civil and human rights groups, immigrant rights groups and others will need to collaborate and develop coordinated plans for action.
~
Some of my comments and sources about the Security Index were previously published here.