An interview with OSTIF, the team behind the OpenVPN audit

OSTIF is auditing OpenVPN for the benefit of all.

You’ve just completed your security audit of OpenVPN. Two people worked for almost two months on this project. How does such an audit work?

It actually wound up being three researchers working a total of 50 days (around 1000 hours) on the security review.

When we plan to audit a piece of software, there’s a substantial amount of work that goes into planning the timing of the audit, who will do the work, and which areas of the software we’ll cover.

For OpenVPN, we waited until the release of OpenVPN 2.4, which featured some major code changes. We could then evaluate the new features, as well as a lot of under-the-hood changes.

Updates that contain significant code changes are good times to evaluate software, because coding errors could make it through testing, or regressions in edge features might slip through the cracks.

OpenVPN is a unique piece of software in that it’s a monolithic codebase with lots of features that must remain compatible with older versions. Ensuring legacy compatibility slows down the security review: we have to navigate a complex web of functions, rather than a modular design in which the application can be evaluated in chunks. OpenVPN also relies on two different libraries (OpenSSL and PolarSSL, now known as mbed TLS) for cryptography, meaning that there are two completely different crypto environments powering its security.
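
To give a sense of what supporting two crypto backends looks like in practice, here is a minimal sketch. It is not OpenVPN’s actual source; the macro names and wrapper function are hypothetical, and the exact library calls vary with the OpenSSL and mbed TLS versions in use.

```c
/*
 * Illustrative sketch only -- not OpenVPN's actual source. The macro names
 * and the wrapper function are hypothetical; exact signatures vary by
 * OpenSSL/mbed TLS version.
 */
#include <stddef.h>
#include <stdint.h>

#if defined(USE_OPENSSL_BACKEND)

#include <openssl/evp.h>

/* SHA-256 via OpenSSL's one-shot EVP digest call. */
int wrapper_sha256(const uint8_t *in, size_t len, uint8_t out[32])
{
    unsigned int outlen = 0;
    return EVP_Digest(in, len, out, &outlen, EVP_sha256(), NULL) == 1 ? 0 : -1;
}

#elif defined(USE_MBEDTLS_BACKEND)

#include <mbedtls/sha256.h>

/* SHA-256 via mbed TLS; the final 0 selects SHA-256 rather than SHA-224. */
int wrapper_sha256(const uint8_t *in, size_t len, uint8_t out[32])
{
    return mbedtls_sha256_ret(in, len, out, 0) == 0 ? 0 : -1;
}

#endif
```

Every code path that crosses a wrapper like this has to be reviewed twice, once per backend, which is part of why the dual-library design adds to the audit workload.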

Further, there’s OpenVPN 3.0, a distinct version that’s not entirely open source. OpenVPN 3.0 was created because of licensing issues with the Apple App Store that prevent free software from being distributed there, and its code is used in OpenVPN Connect for Android and iOS. If we were to evaluate this entire ecosystem, it would take many researchers many months to comb through all of these variations of OpenVPN, and then they’d still have to consider all the different network and hardware configurations these various apps can face. The complexity and cost would be tremendous.

We consulted experts and worked with the OpenVPN team and QuarksLab to figure out what to focus on. It was decided that OpenVPN 2.4 for Windows and Linux covered the most users and would do the most good. Most commercial VPN providers use OpenVPN 2.4 code for their custom VPN clients because of the license structure around it.

We also decided to focus on any cryptography created by OpenVPN itself, and on the application’s security. That means looking for logic errors, memory allocation errors, improper buffer handling, and vulnerabilities caused by improper error-state handling.
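
As a generic illustration of what “improper buffer handling” means in this kind of review, here is a made-up example, not anything taken from OpenVPN: a parser that trusts a length field supplied by the peer, alongside a safer version.

```c
/* Made-up example of the improper-buffer-handling bugs an audit hunts for;
 * this is not OpenVPN code, and the packet layout is invented. */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define PAYLOAD_MAX 128

/* Vulnerable: trusts a 16-bit length field taken from the packet itself,
 * so a malicious peer can make memcpy() write past 'payload' and read
 * past the data actually received. */
void parse_packet_bad(const uint8_t *pkt, size_t pkt_len)
{
    uint8_t payload[PAYLOAD_MAX];
    size_t claimed_len = ((size_t)pkt[0] << 8) | pkt[1];  /* attacker-controlled */
    memcpy(payload, pkt + 2, claimed_len);                /* no bounds check */
    (void)payload;
    (void)pkt_len;
}

/* Safer: validate the claimed length against both the bytes actually
 * received and the destination buffer before copying. */
int parse_packet_good(const uint8_t *pkt, size_t pkt_len,
                      uint8_t *out, size_t out_len)
{
    if (pkt_len < 2)
        return -1;
    size_t claimed_len = ((size_t)pkt[0] << 8) | pkt[1];
    if (claimed_len > pkt_len - 2 || claimed_len > out_len)
        return -1;
    memcpy(out, pkt + 2, claimed_len);
    return (int)claimed_len;
}
```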

A separate audit of OpenSSL would allow us to closely evaluate the OpenVPN cryptography itself to ensure that both the cryptography and the application are sound. It’s important to create a safe and difficult-to-exploit application for users to enjoy.

As for the actual auditing process, QuarksLab does an excellent job of documenting the processes and tools used when evaluating software. Our work focuses on planning the audit scope and setting attainable goals. We then rally the open source, security, and privacy communities around the cause to raise the money to get it done.

Are there any surprising/noteworthy findings from your audit that you’d be able to share with us now?

We’re in the blackout stage of the audit process for OpenVPN, so I won’t be able to discuss any specifics that might clue people in to the results, but they will be publicly available very soon. We are waiting for the release of OpenVPN 2.4.2.

What is the rationale behind such an audit? Are you being tipped off to potential security holes, or do you simply want to take a closer look at software that you regularly rely on?

Our strategy as an organization is to cover different areas of security and privacy and select widely used applications.

VeraCrypt was a much-needed successor to TrueCrypt, which the community greatly relied upon, but the people running the project were relatively unknown and were taking on a massive project with complex code. It made logical sense to approach it as our first audit because we could evaluate the changes by comparing the TrueCrypt 7.1a code with the current version of VeraCrypt. This narrow scope allowed us to dramatically reduce costs and show people that the organization is effective at getting results.

OpenVPN is our first “wide” audit of an application. It required a much larger budget, but it also has a large community of VPN providers (who are themselves privacy activists) around it. These providers are both interested in the privacy of their users and directly concerned with the safety of OpenVPN, which allowed us to fundraise from commercial OpenVPN interests and private users simultaneously.

OpenSSL is larger still, but it has industry support all around it, as OpenSSL code (and other libraries derived from it) powers around 70% of the top 1,000,000 websites. That gives us a lot of business interests we can approach for funding to help evaluate OpenSSL 1.1.1, which will be the first OpenSSL version with the new TLS 1.3 code.

As we go further down the list of applications we plan to audit, it gets harder to raise funds, either because the communities surrounding them are smaller or because there’s no vested business interest in the success of the application.

We hope that after repeated successes we’ll be able to secure larger corporate sponsors, which will enable us to direct funds toward these projects more efficiently without relying entirely on small public donations. That would also greatly help us establish our other programs, which involve working with projects to make their applications easier to use, improving testing methods and tools, and creating easy-to-follow guides for the privacy and security software we support.

In short, right now it’s all part of a larger strategy to support one application from each major area of privacy and security, then expand from there. Our criteria are the perceived strength of the software combined with widespread use.

For your OpenVPN project, you have received support largely from the VPN industry. Did you expect support beyond that? How satisfied are you with this support?

We also received a large amount of support from the community, both in word of mouth and in direct donations.

Our goal was surpassed surprisingly quickly. We originally believed that the one-month window we’d allocated for fundraising would be insufficient, but we passed our goal and raised substantially more than planned within 20 days. That money was set aside for the bug-bounty program planned to start in the summer/fall.

I was surprised by the positive community response and the outpouring of support for the project; it truly was remarkable! I’m very happy with the community support for the project, but I was also surprised at the number of larger organizations that didn’t respond to our inquiries or had no point of contact for their management at all.

However, overall the good far outweighed the bad, and we look forward to working with all of our supporters on the OpenVPN initiatives and beyond!

You’ve moved from a fundraising model with pooled resources to a direct fundraising model, in which you raise funds for each project separately. This seems to have worked well for the OpenVPN project, where the VPN industry was happy to donate. Do you expect future projects to be funded similarly, and how will this work for software projects that don’t have a commercial industry surrounding them, such as OTR?

The change in the funding model was due to feedback from the community regarding sticker shock. During our first round of fundraising, we planned a year of activities and then tried to fund the whole effort through Kickstarter. That led to financial hurdles, like offering rewards for donations, Kickstarter fees, and payment services skimming money from donations. The eight planned projects combined also pushed the goal well into the millions of dollars. For a newcomer to the industry with no track record, asking for that much money in the wake of a few prominent Kickstarter failures, the campaign was doomed from the start.

Our shift in strategy brought the overhead and the numbers down to earth and set more attainable goals, but it also requires substantially more work for each fundraising effort. We’re hoping that after building a reputation for responsibility and effectiveness, we’ll be able to secure larger donors, which will allow us to focus more on getting things done and less on directly soliciting donations. Larger donations will also have the added benefit of allowing us to fund less commercially interesting projects like OTR, Nginx, Tunnelblick, and more.

How do you see privacy and security-enhancing technology evolving? Especially in regards to mobile phones and proprietary systems?

We’ve repeatedly seen through various government agency leaks that if the cryptography around the information is good, they can’t break it en masse.

That fact at least rules out the “listening in on everyone” form of mass surveillance that has become pervasive in the last few years. As these privacy tools continue to improve and crypto becomes harder to break and easier to use, we’ll see substantially increased efforts to attack and compromise devices.

There is evidence of this in the massive theft of SIM card keys from Gemalto, the huge lists of pilfered RSA keys in the NSA leaks, the backdoors inserted into Cisco and Juniper systems, and so on.

The security community has long been calling for a “full stack” of open source code surrounding the devices that hold our most private information. The biggest hurdle right now is funding and organizing the support to actually do it.

Some companies appear to be doing great work on the proprietary side, but we have repeatedly learned that we cannot trust a black box of code. See this month’s heap overflow in iOS: https://googleprojectzero.blogspot.com/2017/04/exception-oriented-exploitation-on-ios.html

Android has a lot of ecosystem issues related to updates lagging behind, creating millions of vulnerable devices, or to companies negligently stopping updates for their phones once sales stop. Then there are even deeper issues, such as vulnerable Broadcom radio firmware that will never be fixed, as recently demonstrated by Project Zero.

A truly open-source phone is a big ask, but we can certainly try to push the open-source community in the right direction by developing pieces of the puzzle independently. I truly hope that we can get there, as the current situation is a mess. I’m shocked that there isn’t already a smartphone-based Mirai knocking out cell towers around the world with data floods.

Apple has been making a lot of positive news with its proprietary systems regarding security and privacy. What role do you think open-source projects will play in bringing usable technology to the masses while respecting user rights?

Apple has put tremendous resources into building a phone ecosystem that focuses on security. The problem is that Apple doesn’t open source this technology, so we’re dealing with the same problem that hits commercial software, like Windows.

We have a black box with millions of lines of code of unknown quality, all interacting with one another in unknown ways. Apple is relying on the inability of malware makers and security researchers to reverse-engineer its code and find flaws. Part of the motivation is to lock the software to the phones, so that iOS can only be installed on genuine Apple hardware. Another motivation is locking the phones to the software, so you can’t buy an iPhone and put an alternative operating system on it, preserving Apple’s ability to draw money through the App Store from a captive audience.

To be clear, as of right now they are doing an objectively better job than Google when it comes to general security. The problem is that this black box can’t be trusted. It has bugs just like all software does: thousands of them. Because the software is proprietary and the source isn’t available, those bugs lie in wait to be discovered by an Apple security team, or by anyone else in the world who finds them first.

Open-source software can be reviewed. It removes the “just trust me” ask that no privacy-conscious person can objectively accept.

I’m hoping that Google moves in the direction of Apple in that updates will be forced across all devices regardless of vendor, and hardware requirements will have to be tightened to make that happen. I also hope that we can open source the currently closed parts of Google firmware and its related drivers so that we can trust the full stack the phone relies on for security. That would put an open solution in a position to lead the market with good security and privacy practices.

You look a lot at other people’s code. What common mistakes do you observe? What kind of bugs are the most common?

I don’t actually do the security reviews myself; that’s left to the contracted auditors. But the most common issues are problems with memory management, and with the proper deletion of security-related data once it’s no longer in use.
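
To illustrate the second point, here is a minimal sketch, in generic C and not taken from any audited project, of why “deleting” secrets is easy to get wrong: a plain memset() on memory that is never read again may be optimized away entirely, leaving key material behind.

```c
/* Generic sketch of the secure-deletion pitfall; not from any audited code. */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Looks fine, but the compiler may drop this memset() as a dead store if
 * 'key' is never read afterwards, so the secret stays in memory. */
void insecure_wipe(uint8_t *key, size_t len)
{
    memset(key, 0, len);
}

/* One portable workaround: call memset() through a volatile function
 * pointer so the compiler can't prove the write is unnecessary. Where
 * available, dedicated helpers such as explicit_bzero() (glibc/BSD) or
 * SecureZeroMemory() (Windows) serve the same purpose. */
static void *(*const volatile memset_v)(void *, int, size_t) = memset;

void secure_wipe(uint8_t *key, size_t len)
{
    memset_v(key, 0, len);
}
```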

The other big mistake is trying to write your own cryptography. It’s wildly complicated, and many, many ways of defeating cryptography have been invented over the last few decades. You have to carefully consider them all and adhere to many standards to create strong cryptography. Using established, compliant libraries avoids this security minefield entirely.
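
As a concrete example of leaning on a vetted library instead of inventing your own scheme, here is a short sketch of authenticated encryption using libsodium’s secretbox API (this assumes libsodium 1.0.12 or newer is installed; error handling is trimmed for brevity).

```c
/* Minimal sketch: authenticated encryption with libsodium's secretbox API
 * instead of a hand-rolled cipher construction. */
#include <sodium.h>
#include <stdio.h>

int main(void)
{
    if (sodium_init() < 0)
        return 1;                          /* library failed to initialize */

    const unsigned char msg[] = "attack at dawn";
    unsigned char key[crypto_secretbox_KEYBYTES];
    unsigned char nonce[crypto_secretbox_NONCEBYTES];
    unsigned char ct[sizeof msg + crypto_secretbox_MACBYTES];
    unsigned char pt[sizeof msg];

    crypto_secretbox_keygen(key);          /* random key */
    randombytes_buf(nonce, sizeof nonce);  /* nonce must be unique per key */

    /* Encrypt and authenticate in one call; no hand-rolled cipher modes. */
    crypto_secretbox_easy(ct, msg, sizeof msg, nonce, key);

    /* Decryption fails (returns -1) if the ciphertext was tampered with. */
    if (crypto_secretbox_open_easy(pt, ct, sizeof ct, nonce, key) != 0) {
        fprintf(stderr, "forged or corrupted message\n");
        return 1;
    }
    printf("decrypted: %s\n", pt);
    return 0;
}
```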

Do you have any advice to share with the many coders reading this?

Support an open source security or privacy initiative. Volunteering your time and knowledge as a coder is extremely valuable, even if you only do a single commit per month to a worthy project.

The sum of the community’s abilities and time adds up to applications that can change the internet and the world for the better. If you don’t have a security background, make a small recurring donation to an organization that helps build and improve these tools and libraries. I’m not just talking about OSTIF; I’m talking about the Free Software Foundation, or any of those donate buttons you see when you download a piece of open-source software.

You’d be shocked how much a few dollars helps small projects function and improve. Small contributions add up to a better digital world for all of us.

I'm a Network Operations Analyst at ExpressVPN, and I love all things tech, privacy, and martial arts!