By David Inserra
Senate majority leader Harry Reid (D–NV) has vowed to bring the Cybersecurity Act of 2012 (CSA) up for a vote in the lame-duck session, and it looks as though the vote could take place this week.
The CSA uses a standards and regulatory approach to cybersecurity, but many troubling questions that The Heritage Foundation asked in July still remain. Until Congress can adequately answer these important questions, this bill remains highly flawed due to problems with information sharing, regulation, cost, and security and effectiveness.
Is information sharing limited? Information sharing should not be unduly limited, since erecting barriers to sharing creates stovepipes of information that might be helpful or even necessary to other agencies. Instead of returning to a pre-9/11 security mindset, we should encourage sharing with appropriate oversight.
Does the legislation actually encourage litigation? The CSA’s liability protections are weak and force the private sector into a “damned if you do, damned if you don’t” situation by allowing private-sector actors to be sued both if they act and if they don’t act. This uncertainty also dooms information sharing, since no one will share information without adequate legal protection.
How much will it cost? This is a simple question, but it has not been answered. The cost to the private sector is simply unknown, and this should raise grave concerns.
What “critical infrastructure” is covered? Because the cost to the private sector is unknown, how widely critical infrastructure is defined becomes an important question. If it is defined broadly, many private-sector actors will have yet another costly regulatory unknown hanging over their heads.
Are the standards really voluntary? The bill claims to create “voluntary” standards but then requires that sector-specific agencies explain to Congress why they haven’t made these standards mandatory. Giving regulatory agencies express permission to make standards mandatory and then pressuring them to do so doesn’t sound like a voluntary program.
What will investors and innovators do? While waiting for the new standards, investors and innovators will be inclined to cease new work until they see what the standards require. Once the standards are issued, they will push innovators to create products that meet these new standards, even if a better (but not yet approved) cybersecurity approach could be developed.
Will the standards be outdated before they take effect? Simply put, government regulations usually take 24–36 months to complete, but computing power doubles roughly every 18–24 months. This means that any standards developed will be written for threats that are two or three computer generations old.
Security and Effectiveness
Can the federal government develop good standards? The government has been hacked or experienced serious cybersecurity failures at least 75 times in the past eight years, and this trend continues. With the government experiencing so many failures of its own, why should it be in charge of setting standards?
Can the Department of Homeland Security (DHS) develop good regulatory standards? The CSA puts DHS in charge of this regulatory effort, but its only other major regulation—the Chemical Facility Anti-Terrorism Standards—is foundering seven years after its start. Considering this program was supposed to be pretty easy to implement, it is hard to imagine that DHS will handle cybersecurity any better.
These plus other critical questions remain unanswered. The cyber realm is a source of incredible growth, so Congress should make sure it gets these answers right. The wrong answers would do more harm than good.