Why Doesn’t the Intelligence Community Care Whether Its Security Programs Work?

A recent cybersecurity bill working its way through Congress is the latest surveillance program that does little to keep Americans safe.

March 26, 2015

Crossposted on Defense One.

The House and Senate Intelligence Committees just passed a cybersecurity bill that critics argue isn’t likely to improve cybersecurity. In fact, because it undermines the privacy of electronic communications by encouraging companies to share private data broadly with the government and with each other, it may actually damage cybersecurity.

For anyone who follows intelligence policy, this shouldn’t be a surprise. The intelligence community all too often launches grand new programs without conducting the appropriate research and evaluations to determine whether they will work, or simply create new harms.

The examples are numerous, and the problem was identified long ago by Clark Kent Ervin, the Department of Homeland Security’s first inspector general.


As Ervin suggests, when intelligence agencies fail to evaluate their programs, a network of inspectors general, congressional auditors and outside watchdogs often fill the gap. But even when these oversight mechanisms identify an ineffective and wasteful security program, it’s all but impossible to end.

The FBI and National Security Agency had long told Congress and the Foreign Intelligence Surveillance Court that the bulk collection of all domestic telephony metadata was “vital” to its counterterrorism efforts. But once Edward Snowden leaked the program to journalists, these claims crumbled under public scrutiny. The government now admits it didn’t help interdict any terrorist attacks, a conclusion backed by a group of experts the President charged with reviewing it. Yet a bill that would not even have ended the program, but merely narrowed the government’s use of the data, failed last year.

Likewise, the Government Accountability Office, or GAO, has since 2010 issued a series of reports criticizing a Transportation Security Administration behavioral detection program that purportedly trained airport personnel to identify subtle behavioral cues revealing a passenger’s intent to harm an aircraft. Over four years the program sent more than 150,000 passengers to secondary screening but didn’t identify a single threat to aviation. Meanwhile, GAO found that 16 people later convicted of terrorism-related crimes traveled 23 times through eight airports that deployed behavioral detection officers, without being identified. Last year, a follow-up GAO report confirmed the program’s continuing failure. Despite its $200 million annual price tag, bills to defund it regularly fail.

GAO has similarly criticized broader “suspicious activity reporting” programs run by the FBI and Director of National Intelligence, or DNI. These take state and local police reports, almost always reflecting innocuous activity rather than behavior that suggests criminal preparations, and feed them into federal databases. The FBI and DNI have so far refused a 2010 GAO request to develop performance metrics to measure the effectiveness of these programs.

There is a strong argument for ending these programs on the basis of their high cost and lack of effectiveness alone. But they actually do damage to our society. TSA agents participating in the behavioral detection program have claimed the program promotes racial profiling, and at least one inspector general report confirmed it. Victims unfairly caught up in the broader suspicious activity reporting programs have sued over the violations of their privacy. The Privacy and Civil Liberties Oversight Board concluded the telephone metadata program violated the Electronic Communications Privacy Act and raised serious constitutional concerns.

The Cybersecurity Information Sharing Act passed by the Senate Intelligence Committee last week is yet another example of this phenomenon. Experts agree that the bill would do little, if anything, to reduce the large data breaches we’ve seen in recent years, which have been caused by poor cybersecurity practices rather than a lack of information about threats. If passed by the full Congress, it would further weaken electronic privacy laws and ultimately put our data at greater risk. The bill would add another layer of government surveillance on a U.S. tech industry that is already facing financial losses estimated at $180 billion as a result of the exposure of the NSA’s aggressive collection programs.

I talked with Babak Pasdar, CEO of Bat Blue Networks and a network security expert, about the impact of the NSA’s previous efforts to undermine encryption standards and install backdoors into U.S. tech products and software.


Pasdar explains that from a security standpoint, if the U.S. government can gain access to data, chances are that someone else can too. Just as weakening encryption standards weakens the integrity of the entire system, our government’s weakening of the laws governing the sharing of private data will invite other governments to do the same.

Pasdar warns that the expansion of government surveillance in cyberspace has had a chilling effect on U.S. technology companies, particularly as data moves to the cloud. “The U.S. has always been the central hub of technology, and we’re starting to see a lot of organizations talking about moving their cloud infrastructure, or moving their data into Europe or other countries that don’t have such a troubling history with privacy and integrity.”

But Pasdar’s greatest concern is the damage to our constitutional system.


Intelligence agencies should be in the habit of evaluating all the possible consequences of a program undertaken in the name of security before implementing it. As Sen. Ron Wyden, D-Ore., the Intelligence Committee’s lone dissenting vote against the bill, argued, “If information-sharing legislation does not include adequate privacy protections then that’s not a cybersecurity bill – it’s a surveillance bill by another name.”

We don’t need another surveillance program that doesn’t improve our security.