Some people are grateful to former National Security Agency contractor Edward Snowden for revealing massive detail about our government’s intelligence activities. Others think he’s a traitor who’s harmed our national security.
One thing you have to admit: He’s gotten us all thinking about the proper balance between privacy and national security. He’s also changed behavior and attitudes, judging by recent Congressional moves to curtail (for the first time since the 2001 terror attacks) the government’s ability to monitor citizens’ phone records.
Snowden changed the game by fearlessly (or recklessly, depending on your viewpoint) tearing away the veils of secrecy to reveal something he felt was endangering his fellow citizens. Do we also need an Edward Snowden to expose the number, and severity, of security breaches to finally force CEOs and CIOs to make security a top priority?
Denial on Denial
One argument in favor is corporate management's tendency to focus on security in the wake of a highly publicized attack, only to lapse quickly back into complacency.
It’s easy to blame this on clueless C-level executives. But software developers and network administrators who should know better share the blame. A chief technology officer at a global IT services firm recently told me that programmers still routinely fail to build in protections against common attacks such as buffer overflows.
Security experts routinely say that 40 percent or more of successful attacks exploit security vulnerabilities that have been known for years and could have been prevented by straightforward practices such as patching software and turning off unused services. Even when security officers or vendors quantify the risk versus the cost of mitigation, management will often vote to “accept the risk” rather than pay more for security.
Needed: Harsh Light of Disclosure?
In an era when software controls critical infrastructure such as power plants and dams, medical devices such as pumps, and aircraft (one of which was recently brought down by incorrectly installed software), this lax attitude toward security could cost lives.
While an event such as the battery fires in Boeing’s 787 prompted the FAA to ground the planes until the problem was solved, one security expert I spoke with recently complained that there is no “Federal Power Security Authority” to force action if the national power grid were hacked. And in the absence of an outside authority, any company or government agency will always have more to lose than to gain by fessing up to a dumb programming or network management mistake.
Such a government agency would, like the National Transportation Safety Board for aviation and rail accidents, be responsible for an impartial review and disclosure of all the facts, telling the public about the risks they face and what is being done to resolve them. After all, a dam that floods a river or two trains that collide due to a software failure kill people just as effectively as an airplane crash caused by mechanical failure. The hidden dangers will only increase as billions of devices, from self-driving cars to autonomous valves in oil pipelines, join the Internet of Things.
Given our tendency to act only after a disaster, creating such an outside “security review” agency (whether governmental or run by private industry) will probably require a horrific event. Could we get there more quickly if one or more Edward Snowdens spill the beans, hurting companies and agencies in the short run but helping us all in the long run by showing us how vulnerable we are and forcing corrective action?