Open source applications are used by millions of people every day. Of course, there are those of us who use open source applications as part of our job, building the systems that keep the internet and email running smoothly. But it would be hard to find a person whose life isn’t touched at some point by open source software. Open source applications are installed on everything from TVs and phones to fridges, desktop computers, wearable devices, and servers. It’s not an exaggeration to say that many of these applications are essential to modern life.
cURL is one such application. cURL is a tool that does something seemingly simple and does it very well. It takes a URL and fetches whatever that URL points to, whether that’s a file, the result of an API call, or anything else. It’s a utility application relied on by thousands of other applications.
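To make that concrete, here is a minimal sketch of how cURL is typically used from the command line (the URLs are placeholders, not real endpoints):

```shell
# Fetch a URL and print whatever it points to on stdout.
curl -s https://example.com/

# Fetch an API response and save it to a file instead.
curl -s -o response.json https://example.com/api/data

# In scripts, -f makes curl fail on HTTP errors and -L follows redirects.
curl -sfL https://example.com/file.txt -o file.txt
```

This simplicity is exactly why so many other applications shell out to curl or link against its library, libcurl.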
cURL was recently put through a voluntary security audit, which discovered several critical vulnerabilities. The fact that cURL had vulnerabilities is no surprise. Every piece of software complex enough to be useful is likely to have a few bugs, and sometimes those bugs cause security vulnerabilities.
It’s remarkable that cURL had so few serious vulnerabilities, given its functionality and age. But we shouldn’t take vulnerabilities in tools like cURL lightly. cURL is everywhere, and it’s not the only one — there are lots of small, useful applications like cURL deployed ubiquitously on millions of internet-connected devices across the world.
cURL’s maintainer voluntarily submitted his project’s code for audit via the Mozilla Secure Open Source project. In theory, he shouldn’t have needed to. Because the application is open source, the code is available for anyone to scrutinize at any time. The vulnerabilities could have been found earlier, if anyone with the right skills had taken the time to look.
But the same was true of the Heartbleed vulnerability in OpenSSL, or the recent vulnerability in the Linux kernel.
This article isn’t intended to be a negative comment on the open source community. If this software were proprietary, there would almost certainly be many more vulnerabilities and no one would know anything about them. The code wouldn’t be open to scrutiny. And it’s quite unlikely that proprietary software would be voluntarily submitted for a public security audit.
But the reality is that careful, painstaking code audits are not fun. There’s not much incentive for developers to spend their time combing through possibly ancient, certainly complex code, hunting for minor coding errors with an outsized effect. That’s why projects like the Core Infrastructure Initiative and the Mozilla Secure Open Source project are doing vital work that makes the web safer for all of us.