I wrote a few months ago about United offering rewards to people who discovered security issues in United Airlines software. Not the plane software, thankfully, but their customer-facing IT systems. Apparently a few people discovered flaws and were recently awarded frequent flyer miles, a couple of whom received 1 million miles each. That’s a nice bonus for some people, though I hope the end result is that United builds better security into its software and improves its coding practices. Certainly similar programs have helped in the tech world: Microsoft, Google, and others pay bounties for flaws that are discovered and reported.
However, I wonder if this isn’t something that should become an accepted practice. I don’t necessarily want regulation here, though I would prefer that our regulation not prevent the research and exploration of security issues. Imagine, however, if we had an accepted, well-known process for finding flaws in all software, perhaps something as standard as reporting spam to a site’s webmaster@ address. If someone finds an issue, they let the company know. The company offers a bounty of some sort, perhaps a token reward, and has a limited time to fix the issue. Once that time passes, the individual is free to disclose the issue to the public, and the problem can be publicly discussed and analyzed. More importantly, the company is then liable for any data loss or productivity issues.
In theory this should already exist. In practice it doesn’t seem to work regularly, and we have lots of flawed software, along with no incentive to fix the issues, even when data is exposed. There are some laws covering personally identifiable information (PII), but what about random data that might merely be annoying to users? What about the various software packages we use to monitor our systems, provide anti-virus protection, or manage other aspects of the business? I’d like to see more class action arbitration (not lawsuits; we need fewer lawyers involved in the world) that perhaps doesn’t award damages, but refunds purchase costs. If that CMS doesn’t work, your money is refunded. I’d think that would incentivize some secure coding practices. If vendors had to make insurance claims, I bet we’d start to see more requirements for code reviews and penetration testing of software in general.
This is certainly a hard issue to discuss. For every solution out there, plenty of edge cases and exceptions will be a problem. The openness and ease with which people can create software is a double-edged sword: it encourages innovation and experimentation, resulting in some amazing new concepts, but that same freedom often also produces lots of poorly written software, full of vulnerabilities and bugs. I hope we find a way to mature this industry and start to build the standard, well-engineered practices and habits that encourage secure, robust, well-written code.