The US national security state is here to make AI even less transparent and accountable

By Matthew Guariglia / Electronic Frontier Foundation (EFF)

The Biden White House has released a memorandum on “Promoting United States Leadership in Artificial Intelligence,” which includes, among other things, a directive to the national security apparatus to become a global leader in the use of AI. Led by the White House, the national security state is expected to assume this leadership position by poaching great minds from academia and the private sector and, most worryingly, leveraging already-working private AI models for national security goals.

Private AI systems like those run by tech companies are incredibly opaque. People are uncomfortable – and rightly so – with companies that use AI to decide all sorts of things in their lives, from whether they are likely to commit a crime, to their suitability for a job, to questions about immigration, insurance, and housing. As you read this, for-profit companies are renting out their automated decision-making services to all sorts of companies and employers, and most of those affected will never know that a computer made a decision about them, and will never be able to appeal that decision or understand how it was made.

But things can get worse: combining private AI with national security secrecy threatens to make an already secret system even more unaccountable and opaque. The constellation of organizations and agencies that make up the national security apparatus is notoriously secretive. EFF has had to fight in court multiple times to make public even the most basic framework of global dragnet surveillance and the rules that govern it. The combination of these two creates a Frankenstein's monster of secrecy and unaccountability.

As the Executive Branch pushes agencies to leverage private AI expertise, we fear that more and more information about how these AI models work will be shrouded in the near-impenetrable veil of government secrecy. Because AI works by collecting and processing massive amounts of data, understanding what information it stores and how it reaches conclusions will be critical to how the national security state thinks about problems. This means the state will likely argue not only that AI training data may need to be kept secret, but potentially also that companies must keep the underlying algorithms secret, under penalty of law.

The memo states: “AI has emerged as an era-defining technology and has demonstrated significant and growing relevance to national security. The United States must lead the world in the responsible application of AI to appropriate national security functions.” As the U.S. national security state seeks to leverage powerful commercial AI to gain an advantage, a number of questions remain unanswered about how much this ever-closer relationship will impact much-needed transparency and accountability for private AI and for-profit automated decision-making systems.

Please share this story and help us expand our network!

Matthew Guariglia

Matthew Guariglia is a policy analyst who covers surveillance and policing issues at the local, state, and federal levels. He earned his doctorate in history from the University of Connecticut, where his research focused on the intersection of race, immigration, U.S. imperialism, and policing in New York City. He is co-editor of The Essential Kerner Commission Report (Liveright, 2021), and his book Police in the Empire City is forthcoming from Duke University Press. His bylines have appeared in NBC News, the Washington Post, Slate, Motherboard, and the Freedom of Information-centric outlet Muckrock. Matthew is an adjunct scholar at the University of California, San Francisco School of Law and editor of “Disciplining the City,” a series on the history of urban policing and incarceration on the Urban History Association’s blog The Metropole.

You can also make a donation to our PayPal account or subscribe to our Patreon.
