One of Google's new objectives appears to be acting as the software industry's security flaw police, via their Project Zero security team. Project Zero's brief is to examine other firms' software for vulnerabilities and, if it finds one, to let the respective company know. Google then give a deadline of ninety days, after which they will disclose the flaw to the public. The name "Project Zero" stems from "zero-day" software bugs, a geeky term for a previously unknown software bug with no ready fix and potentially serious implications for users. Given the number of high-profile cyberattacks during 2014 (Snapchat, Sony, Apple iCloud), this kind of service is important.
However, some businesses don't see Google's world software police service as useful. They consider that tipping off hackers about a given software flaw is a very bad idea. The Chief Technology Officer at Portland, Oregon-based Tripwire, a software security company, had this to say on the matter: "What is the value of disclosing it beyond the vendor to the world? You are helping the bad guys." And yes, I do understand this perspective: by telling the world at large about discovered security flaws, Google may be walking into a minefield.
But I also know from personal experience that independent security researchers often struggle to get companies to pay attention to software bugs. Some have even faced legal action. Telling a company about a discovered issue and giving it ninety days to produce a fix gives businesses an incentive to act. Let's not forget that Google are not the only people looking for vulnerabilities. We've also seen bug bounty programmes set up across the Internet, where a business will pay hackers to discover and report security flaws in its software.
What would happen if Google's Project Zero discovered a security flaw and, a week later, an independent hacker discovered the same issue? Google notify the company immediately, while the hacker starts working on a way to exploit the weakness. The clock is always ticking, and to assume otherwise is foolish. And then there is Microsoft, where Project Zero has exposed at least four unfixed Microsoft security flaws. One of these flaws was disclosed the day before Microsoft released the necessary security patch as part of a scheduled software update. I am in two minds about this: Microsoft asked Google for more time, but Google disclosed the error after the full ninety days of notice. Microsoft took ninety-one days. If Google had given Microsoft that extra day, where would it stop? What if some users didn't download the patch immediately? Should Google have given Microsoft an extra week?
Deadlines are deadlines... but perhaps there was scope for the big G to have given Microsoft a couple of days' leeway? What do our readers think? If Google announce at the start of a ninety-day process what the deadline is, should they then renege on this if the software company discloses that it has a fix but needs "just a little more time" to bring it to users? Let us know in the comments below.