
Heartbleed Disproves that Open Source is Safer

19:33 Wednesday Apr 16, 2014

            

An open community is supposed to make software development safer – or at least open source adherents have always claimed so. The Heartbleed OpenSSL vulnerability has severely damaged this idea.

The discovery of the Heartbleed bug sent service providers scrambling to patch their versions of OpenSSL and customers scrambling to change their compromised passwords. The effect was so widespread that Heartbleed is widely considered the worst security bug ever to hit the Internet.

As security expert Bruce Schneier wrote, "'Catastrophic' is the right word. On the scale of 1 to 10, this is an 11."

Almost as devastating, however, is the blow Heartbleed has dealt to the image of free and open source software (FOSS). In the self-mythology of FOSS, bugs like Heartbleed aren't supposed to happen when the source code is freely available and being worked with daily.

Or, as Eric Raymond famously said, "given enough eyeballs, all bugs are shallow."

Yet, somehow, Heartbleed appears to have existed for over two years before being discovered. It may even have been used by American security agencies in their surveillance of the public.

Tired of FOSS's continual claims of superior security, some Windows and OS X users welcome the idea that Heartbleed has punctured FOSS pretensions. But is that what has happened? To what extent does Heartbleed challenge or reaffirm FOSS's belief that it represents a superior method of software development?

The Original Statement

Raymond made his famous statement in his 1999 book The Cathedral and the Bazaar. A comparison of proprietary and FOSS methods of software development, the book summarizes the beliefs of many FOSS developers – then and now – about why their work habits are supposed to produce higher quality software with fewer bugs.

Implicit in the description is not only the idea that peer review can substitute for software testing, but also that no special effort is needed to detect bugs. Simply by going about their business as developers, FOSS project members are likely to notice bugs so that they can be repaired.

This claim has not gone unchallenged. It is a statement of belief rather than the conclusion of a scientific study, and arguably a rationalization of the fact that peer review in FOSS has always been easier to arrange than software testing. Moreover, in Facts and Fallacies of Software Engineering, Robert L. Glass claims that no correlation exists between the number of bugs reported and the number of reviewers.

Yet despite the claim's weaknesses, it remains one of FOSS's major assertions of superiority. Heartbleed seems an exception that at least challenges the widely believed rule, or maybe even overturns it completely.

The Problems with Eyeballs

At first glance, Raymond's statement seems to survive any challenge from Heartbleed. Unproved or not, the statement is conditional; it is only true if enough eyes are constantly on the code. However, as the idea is examined, the flaws and unstated assumptions start to reveal themselves.

Robin Seggelmann, the OpenSSL developer who claims responsibility for Heartbleed, says that both he and a reviewer missed the bug. He concludes that more reviewers are needed to avoid a repetition of the incident -- that there were not enough eyes in this case.
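For readers unfamiliar with the class of error involved, the following hypothetical sketch in C (a heavily simplified illustration, not the actual OpenSSL code) shows how trusting a sender-supplied length field can let a request read back far more memory than was ever sent, and why the fix amounts to a single comparison that both an author and a reviewer can easily overlook.

    /*
     * Hypothetical, simplified illustration of a "trusted length field" bug.
     * This is NOT the OpenSSL code; it only sketches the general class of
     * error reported for Heartbleed: echoing back as many bytes as the
     * sender *claims* to have supplied, rather than as many as it actually did.
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    struct heartbeat {
        unsigned short claimed_len;      /* length the peer claims its payload has */
        const unsigned char *payload;
        size_t actual_len;               /* how many bytes were really received */
    };

    /* Buggy version: trusts claimed_len without checking actual_len. */
    static unsigned char *echo_buggy(const struct heartbeat *hb, size_t *out_len) {
        unsigned char *reply = malloc(hb->claimed_len);
        if (!reply)
            return NULL;
        /* If claimed_len > actual_len, this copies past the real payload and
         * leaks whatever happens to sit in adjacent memory. */
        memcpy(reply, hb->payload, hb->claimed_len);
        *out_len = hb->claimed_len;
        return reply;
    }

    /* Fixed version: one extra comparison rejects over-long requests. */
    static unsigned char *echo_fixed(const struct heartbeat *hb, size_t *out_len) {
        if (hb->claimed_len > hb->actual_len)
            return NULL;                 /* silently drop the malformed request */
        unsigned char *reply = malloc(hb->claimed_len);
        if (!reply)
            return NULL;
        memcpy(reply, hb->payload, hb->claimed_len);
        *out_len = hb->claimed_len;
        return reply;
    }

    int main(void) {
        unsigned char received[4] = { 'p', 'i', 'n', 'g' };
        struct heartbeat hb = { .claimed_len = 64, .payload = received,
                                .actual_len = sizeof received };

        size_t n = 0;
        unsigned char *ok = echo_fixed(&hb, &n);
        printf("fixed handler returned %s\n", ok ? "a reply" : "NULL (request rejected)");
        free(ok);
        /* echo_buggy(&hb, &n) would "work", but its reply would contain
         * 60 bytes of unrelated process memory after the 4 real ones. */
        return 0;
    }

In a diff of real protocol code, of course, the missing check is buried among many legitimate length calculations, which is what makes "enough eyeballs" harder to achieve in practice than in theory.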

Another conclusion that might be drawn from Seggelmann's account is that depending on developers to review their own work is not a good idea. Unless considerable time passes between the writing of the code and the review, the developers are probably too close to the code to be likely to observe the flaws in it.

However, the weakness of Seggelmann's perspective is that the argument is circular: if Heartbleed was undiscovered, then there must not have been enough eyes on the code. The proof is in the discovery or the failure to discover, which is not exactly a useful argument.

A more useful analysis has been offered by Theo de Raadt, the founder of OpenBSD and OpenSSH. De Raadt notes that malloc, the system's memory allocator, was long ago hardened to prevent Heartbleed-type exploits. However, OpenSSL added "a wrapper around malloc & free so that the library will cache memory on its own, and not free it to the protective malloc" -- all in the name of improving performance on some systems.

In other words, the potential for a bug was detected and patched, but was bypassed by an engineering decision that favored efficiency over security. Perhaps, too, the wrapper was never examined closely because it was assumed to be trivial and to add nothing new. It had become an established part of the code that nobody was likely to modify. But, whatever the case, de Raadt concludes scathingly, "OpenSSL is not developed by a responsible team."
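De Raadt's complaint is about a design decision rather than a single line, so a small hypothetical sketch may help. The C fragment below is an illustration of the pattern he describes, not OpenSSL's actual wrapper: a library-level cache that hands "freed" buffers straight back to the next caller, old contents intact, so that a hardened system malloc never gets the chance to unmap or scrub them.

    /*
     * Hypothetical sketch of a caching allocator wrapper, for illustration only.
     * Instead of returning memory to the (possibly protective) system malloc,
     * the library parks freed buffers in its own cache and recycles them,
     * previous contents and all.
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define CACHE_SLOTS 16

    static void *cache[CACHE_SLOTS];
    static size_t cache_size[CACHE_SLOTS];
    static int cached = 0;

    /* "free": park the block in a private cache, contents untouched. */
    static void cached_free(void *p, size_t size) {
        if (cached < CACHE_SLOTS) {
            cache[cached] = p;
            cache_size[cached] = size;
            cached++;
        } else {
            free(p);                     /* cache full: fall back to the real allocator */
        }
    }

    /* "malloc": prefer recycling a cached block, old data still present. */
    static void *cached_malloc(size_t size) {
        for (int i = 0; i < cached; i++) {
            if (cache_size[i] >= size) {
                void *p = cache[i];
                cached--;
                cache[i] = cache[cached];          /* swap-remove from the cache */
                cache_size[i] = cache_size[cached];
                return p;                          /* note: not zeroed */
            }
        }
        return malloc(size);
    }

    int main(void) {
        char *secret = cached_malloc(32);
        strcpy(secret, "private key material");
        cached_free(secret, 32);                   /* cached, never really freed */

        char *reused = cached_malloc(32);          /* the same block comes back */
        printf("recycled buffer still holds: %s\n", reused);
        free(reused);
        return 0;
    }

Run as-is, the recycled buffer still contains the "private key material" string, which is the behavior that turns an over-read bug from a likely crash into a quiet information leak.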

Assuming that de Raadt is right, then one take-away for FOSS is that all the eyes in the world cannot be counted on to catch basic design problems.

Taken together, Seggelmann's and de Raadt's comments also suggest that assuming no special effort is needed to discover bugs is a mistake. Perhaps more attention needs to be paid to formal reviews and software testing than FOSS has traditionally managed. The fact that FOSS development often involves remote cooperation does not mean that online or in-person testing sessions could not be added to many projects' development cycles.

What Heartbleed proves is that FOSS needs to examine the unexamined assumptions it has held for years. Greg DeKoenigsberg, a vice president at Eucalyptus Systems, summed up the situation neatly on Facebook: "we don't put enough eyes in the right places, because we assume [bug-detection] will just happen because of open source pixie dust -- and now we're paying the price for it."

Redemption by Response

None of these comments are meant to suggest that the entire FOSS development model requires revision. If Heartbleed challenges Raymond's statement about enough eyes, the response to Heartbleed more than justifies it.

Knowledge of Heartbleed was apparently concealed for several weeks, but once it was announced, FOSS-based projects and sites quickly publicized it. A few hours more, and it was being patched. Individual users, of course, still need to change their passwords after sites apply their patches, but while some of the effects could linger for months, the FOSS response could not have been quicker or more responsible once the discovery was general knowledge.

By contrast, imagining a similar response from proprietary software is almost impossible. Based on past revelations of bugs and malware, a more likely reaction from proprietary software would have been to keep the problem secret while a patch was written and tested so that no one could exploit it. Meanwhile, millions of users would have remained exposed for weeks or months without realizing the danger.

Heartbleed is forcing another look at one of FOSS's basic beliefs, but the reaction to it is proving FOSS's ability to respond in a crisis. In the short run, FOSS will face ridicule because of its failure to detect Heartbleed earlier. Yet, already, the challenge to FOSS's basic beliefs is proving the ability of its developers to learn from their mistakes and improve.

By: Bruce Byfield
Datamation

P.S. from Websitenews.co editor:

Having debated this in the office, we all felt that the most dangerous aspect of Open Source software is that the people implementing it most often have little to no experience in developing software. Even implementing security patch updates is beyond the 'if-it-is-free-to-download and free-to-use I-can-make-money-selling-it-even-though-I-don't-understand-it' crowd. This is particularly evident when it comes to open source WCMS (Website Content Management Systems).

In itself, open source software is a great idea, and the truth is that most proprietary software development companies use it to some degree in their products. The caveat, however, is that they are software companies - not freeloaders trying to make money out of something they could neither replicate themselves nor understand well enough to fix when issues arise. That is dangerous for their clients, who assume they are paying less because the software is free - which in most cases is not true, because the set-up and service costs are so high.

The golden rule should be: "If a deal looks too good to be true, it probably is."

 
