
Big Data: A Lesson From History


Big data is a pertinent issue in today's world. Large amounts of data can be analysed to reveal correlations, which allow experts to spot business trends, prevent diseases, and combat crime, amongst other things. As of 2012, some 2.5 exabytes (2.5×10¹⁸ bytes) of data were created every day. What to do with this vast amount of information raises several key questions, particularly for governments. Big data analysis can make governmental processes more efficient in terms of cost, productivity, and innovation. In using it, however, governments can infringe on individuals' right to privacy and fuel fears of a 'Big Brother'-style state; allegations of spying have recently been made against the National Security Agency in the United States and the Government Communications Headquarters in the United Kingdom.

We would be wrong to assume, however, that the use of big data by governments is solely a twenty-first century phenomenon. Margo Anderson, Professor of History and Urban Studies at the University of Wisconsin, has been examining how the United States government used data during the Second World War. Her research, published in the most recent edition of Federal History, raises some intriguing points about how governments have used, and misused, information over the decades.

The US was taken completely by surprise by the Japanese attack on Pearl Harbor on December 7, 1941. In the wake of the damage inflicted by the attack, the capacity of the state to fight a war and deal with threats was unclear. “Americans generally supported strong measures to keep the nation safe,” Anderson explains. Anderson argues that the evacuation and incarceration of the West Coast population of Japanese ancestry, misleadingly termed the 'Japanese internment', reveals that the US government used available data to harm its own citizens. “[This] example merits historical investigation to understand both what happened at the time and what lessons we can draw for big data issues today,” Anderson writes. History, in this instance, can provide an important message for today's governments.

During the Second World War, the US had numerous instruments of surveillance and record-keeping. Foremost amongst these records was the census. Collected every ten years since 1790, the census was, by act of Congress, forbidden from being used to 'harm' citizens. The 1940 census data, however, was used to round up the population of Japanese ancestry living on the West Coast in 1942. Over 100,000 people were held in concentration camps for most of the war. Interestingly, this use of the 1940 census data was well known at the time. Census officials and Congress clearly recognised the risks of individual-level data and had put federal statutes in place to prevent disclosure. Between 1939 and 1941, however, these concerns and protections were not strong enough to withstand the rising tide of 'national security' threats.

Other wartime uses of big data were unsuccessful. The census was not a perfect record of every person in the United States, and a proposal for a more thorough national population register failed. “When national proposals reemerged in the 1960s for a 'federal data center,' they were quickly rejected by Congress and the public as Orwellian violations of privacy,” Anderson notes. In the modern age, however, with such huge reservoirs of data being created by our online activity, this debate is more pertinent than ever.
Anderson hopes that this period of US history “can usefully inform debates about data stewardship within professional organizations, provide information for training modules for staff, and facilitate discussion among policy makers about how to build 21st-century data infrastructure that serves the public good.”

For more information: www.shfg.org
