The Freedom House report Leaping over the Firewall aims to help users understand, evaluate, and select a tool or set of tools for security, privacy, anonymity, and, most importantly, for circumventing Internet censorship. As a long-time developer with The Tor Project and a member of the circumvention community, I feel it is important to set the record straight on a number of issues. My motivation for writing this response is to inform readers of the serious concerns that many people, myself included, have about the report. I am always pleased to see more analysis of censorship circumvention and Internet security tools, but I have concerns about this report’s methodology and the conclusions it draws. In its current form, the report could be dangerous to the very users it aims to help.
The reporting methodology is sloppy at best, and the information in the report is often inaccurate or poorly written. The report also demonstrates a general disconnect from the language used by the projects and by the circumvention community as a whole.
The report’s ratings conflate two distinct issues, support and security, into a single item: a survey of users’ perceptions of a tool’s support is thus labelled a ‘Security’ rating. The problem should be obvious, since the level of technical support available for a tool has little relation to the security that tool provides. (Indeed, a tool provider who profits from selling users’ browsing habits has a much stronger incentive to provide support than one that protects its users’ privacy.) Worse, the authors emphasize the support rating from their survey over the results of their technical evaluation by placing the survey result closer to the ‘Security:’ label; this will lead readers to mistake the support rating for a security rating.
The Freedom House security evaluation was a superficial examination of the tools’ externally visible behaviour rather than the thorough review of both design and source code needed to determine whether any program is in fact secure. As one example of the inability of a black-box test to determine a program’s security properties, the authors state that they consider a tool to provide ‘anonymity’ if it appears to encrypt data sent to and from the circumvention tool’s servers. This is not sufficient to determine whether the tool uses cryptography properly, or whether it includes client-side software to prevent web sites from tracking and de-anonymizing users through long-term identifiers, as the Tor Browser Bundle does. Nor is it possible for their tests to determine anything about traffic after it leaves the user’s computer; they seem to have simply accepted the tool authors’ claims at face value.
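To illustrate why “the traffic looks encrypted” proves nothing, here is a minimal, hypothetical sketch in Python: a deliberately broken stream cipher whose output would pass a naive black-box check, yet which leaks plaintext structure to any eavesdropper who captures two messages. Nothing here corresponds to any real tool’s code; it only demonstrates the category of flaw a black-box test cannot catch.

```python
import hashlib

# A deliberately broken "cipher": a fixed keystream reused for every
# message. Keystream reuse is a classic, fatal flaw that inspecting a
# single ciphertext will never reveal.
KEYSTREAM = hashlib.sha512(b"hypothetical session key").digest()

def bad_encrypt(plaintext: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, KEYSTREAM))

def looks_encrypted(ciphertext: bytes, plaintext: bytes) -> bool:
    # The kind of crude check a black-box evaluation can do:
    # the bytes on the wire differ from the plaintext.
    return ciphertext != plaintext

c1 = bad_encrypt(b"GET /dissident-site/ HTTP/1.1")
c2 = bad_encrypt(b"GET /innocuous-site/ HTTP/1.1")

# The black-box test passes for both messages...
assert looks_encrypted(c1, b"GET /dissident-site/ HTTP/1.1")
assert looks_encrypted(c2, b"GET /innocuous-site/ HTTP/1.1")

# ...yet XORing two ciphertexts cancels the keystream completely,
# exposing exactly where the plaintexts agree and differ.
leak = bytes(a ^ b for a, b in zip(c1, c2))
assert leak[:5] == b"\x00" * 5  # both requests begin with "GET /"
```

A real security review reads the design and the source to rule out exactly this kind of flaw; no amount of watching the wire can.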
The report should rate tools on qualities that meaningfully distinguish them: availability; whether the tool requires administrative privileges; validated security claims; anonymity; whether design and implementation details are available for peer review; and centralization versus decentralization. By understanding these qualities, users will be able to understand how a tool may or may not function in the event of a major Internet outage, and will be better informed about the security claims made by, and the actual risks mitigated by, the tools they choose.
The report recommends that users use less secure tools if they believe that their activities would not endanger them. This advice is dangerous — if followed, it reveals to anyone monitoring a user how subversive she considers her current activities to be.
The report often suggests the use of a VPN while never specifying which VPN software or provider is supposedly secure. The report states that “VPN tunnels are invariably encrypted and thus not susceptible to snooping (interception of traffic).” It is not a given that VPN software provides strong encryption, or any encryption at all, and many popular implementations, such as PPTP, have known practical weaknesses. Additionally, VPNs are subject to a variety of privacy and security issues: the general purpose of a VPN is not to provide traffic analysis resistance, nor will a user be protected against legal or technical compromise of the central VPN service itself. The report misstates the security and privacy properties of VPN software. This is a major mistake, as not all VPN software is equal and not all VPN providers are trustworthy.
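To make the point concrete: even within a single VPN package, security depends on explicit configuration choices the user or provider must make. The fragment below is a hypothetical OpenVPN client configuration (the server name is a placeholder) that explicitly selects a strong cipher and verifies the server certificate rather than trusting defaults; a PPTP tunnel offers no comparable controls.

```
# Hypothetical OpenVPN client config -- the server name is a placeholder.
client
dev tun
proto udp
remote vpn.example.com 1194
# Explicitly choose a strong cipher and HMAC instead of trusting defaults.
cipher AES-256-CBC
auth SHA256
# Verify the server's certificate to resist man-in-the-middle attacks.
remote-cert-tls server
```

Two users can run “a VPN” and end up with wildly different protections; a report that treats all VPNs as interchangeable obscures exactly this.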
The survey neglects to mention some important issues with the selected software; two well-known examples stand out. JAP is a product that previously had a secret law-enforcement backdoor forced into its program. This well-known backdoor scandal is not disclosed in the report, nor is any resolution offered to demonstrate that it is no longer an issue for users. UltraSurf is well known to perform filtering and to log extensive user data, as its operators have disclosed to the press; this is neither discussed nor disclosed in the report. These are just two of many examples where informing users would change the general perception of the tools named in the report. The general issue of data retention, law-enforcement access, and censorship by the tool authors or the tool infrastructure cannot be overlooked.
The evaluation of Tor is especially inaccurate: the report suggests that Tor’s security is no better than that of tools whose authors admit that they are best used in situations where circumvention without security will suffice.
The report suggests that Tor was designed for the sole purpose of protecting government communications. This is not true; it confuses Tor with the concept of Onion Routing, which was originally designed by researchers at the Naval Research Laboratory for their own purposes. The development of Tor by MIT graduates is at its core about enabling anonymous communication for a diverse user base and creating a foundation for other systems that require an anonymous communications channel. Tor shares no software with the original Onion Routing project.
By design, a user running Tor with a private bridge is likely to be the least censored and least monitored user of any circumvention system. While Tor was not originally designed with the specific goal of resisting Internet censorship, it was designed from the beginning to protect against traffic analysis, and censorship is one of the most basic outcomes of even the simplest traffic analysis attacks. So while we did not originally emphasize censorship circumvention in Tor, we have been working on circumvention as a core feature for many years now. It is worth noting that Tor originally did not attempt to hide that Tor was in use, but early blocking by simple corporate proxy filters encouraged us to blend in with normal web browsing traffic. For quite some time we have attempted to look like Firefox sending HTTPS traffic to an Apache web server. While we do not claim to offer steganographic protection, we believe we have improved on previous designs that were easy to filter. Combined with bridges, these improvements put Tor leaps and bounds ahead of the competition.
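For the curious, pointing a Tor client at a private bridge is a two-line change to the torrc configuration file. The address below is a placeholder; a real user would substitute the address of a bridge shared with them privately.

```
# torrc: reach the Tor network only through a private, unlisted bridge
UseBridges 1
Bridge 192.0.2.10:443
```

Because the bridge is unlisted, a censor has no public address list to block, and on port 443 the connection blends in with ordinary HTTPS traffic.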
Torbutton is a security add-on that enhances the anonymity of Firefox users when used with any proxy. No other circumvention tool even attempts to provide this level of security for its users, and that includes the other proxy systems built specifically around web browsers. We have gone to great lengths to protect web browsing with Firefox. That the report does not understand the threat model or the security requirements that motivated the creation of Torbutton is another failing. Torbutton should be used by every user who configures a proxy in their web browser, even if that proxy is not Tor. This should count either as a security problem with the other tools or as a bonus for Tor’s serious, proactive security stance; the report considers neither possibility in any meaningful sense.
The Freedom House report excluded actual performance results from its technical evaluation, yet the authors still chose to publish various comments about performance. The realities of Tor network speed are objectively measured by our metrics project. From an objective, data-driven standpoint, the speed of the Tor network has improved in both throughput and latency over the last few years: the average time to download a 50 KB test file has fallen from eight seconds to four. There is a general perception that Tor is slow, and while that is worth mentioning, it is also worth mentioning that we are the only project addressing these speed issues in a measurable and open manner.
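The improvement quoted above is easy to check: halving the download time of the 50 KB test file doubles the effective throughput. A quick back-of-the-envelope calculation:

```python
# Average throughput implied by the 50 KB test downloads quoted
# from the Tor metrics project: ~8 s before, ~4 s now.
FILE_SIZE_KB = 50

def throughput_kb_per_s(seconds: float) -> float:
    """Effective download rate for the 50 KB test file."""
    return FILE_SIZE_KB / seconds

before = throughput_kb_per_s(8.0)  # 6.25 KB/s
after = throughput_kb_per_s(4.0)   # 12.5 KB/s
assert after / before == 2.0       # throughput doubled
```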
The report misinterprets our design roadmap and blog entries as an admission of failure rather than as a series of planned performance optimizations. We are working on a design that will reduce Tor’s initial start-up time (and its bandwidth usage in general). In many cases a user is unlikely to notice these improvements individually, but in aggregate the network will become more efficient. It is important for all tools to improve, and Tor is on a constant plan of improvement. Tor performs adequately on congested networks and dynamically adjusts itself to compensate for network performance issues. This is why Tor functions well on every type of Internet connection, from a high-speed cable modem to a high-latency, low-bandwidth satellite link, without any user tuning at all. We are the only circumvention tool that has published a design for this dynamic tuning and, to our knowledge, the only tool that does such tuning at all. When the report suggests that “non-technical users will struggle to fine-tune the software”, it indicates that the authors were unaware that network tuning happens automatically, as it has for quite some time.
The report does not appear to have evaluated the Tor Browser Bundle, which is available for all popular platforms and requires no installation on the host computer. It is designed to leave no data behind on the computer where it runs and to work without any configuration at all. Other tools are given credit for ‘portability’, and the Tor evaluation seems to miss that we offer both portable and locally installed versions of Tor; at heart, they are the same software with the same protections. The report appears to have evaluated only the non-portable packages intended for more advanced users. We explicitly state that users who forgo the portable packages should take the time to learn about all of the different moving pieces, something the evaluators themselves entirely failed to do, as the report makes evident.
Most of these issues would have been easy to clear up simply by contacting us for comment on the Tor section before finalizing the review. We are all working towards the same goals, and working together allows us to properly address community-wide concerns. For this report, however, the authors and reviewers made no attempt to fact-check Tor-related issues before the document was finished.
The actual basis for the Freedom House review is a scientifically flawed survey that does not measure the properties it claims to measure. From performance to security, the report presents mistake after mistake. The core of the review is non-technical, yet the entire circumvention landscape is deeply technical. To eschew technology when reviewing technical merit is a grave mistake, and it discredits whatever merit the report might otherwise have had. In the future, we hope that an open and technically sound review, performed with community input, will take the place of this flawed and superficial qualitative report.