Superclean code puts LibreOffice at the head of the trend

If you’ve been around open source for a while, perhaps your impression of open source code quality is based on older, infamously unstable codebases. Times have changed, however, and evidence of this can be found in the work of code improvement vendor Coverity. It recently announced that LibreOffice (four years old this weekend and based on the old OpenOffice code) has a defect density of just 0.08, compared with an average of 0.65 for similar-sized open source projects and 0.71 for proprietary code.
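To make those figures concrete: defect density as Coverity reports it is simply defects found per 1,000 lines of code (KLOC). Here is a minimal sketch of the arithmetic; the nine-million-line codebase size and the defect counts below are illustrative assumptions, not Coverity’s actual measurements.

```python
def defect_density(defects: int, lines_of_code: int) -> float:
    """Return defects per 1,000 lines of code (KLOC)."""
    return defects / (lines_of_code / 1000)

# Hypothetical example: a 9-million-line codebase at LibreOffice's
# reported 0.08 density would contain roughly 720 outstanding defects.
loc = 9_000_000
print(round(defect_density(720, loc), 2))   # 0.08

# The same codebase at the 0.71 proprietary-code average would carry
# around 6,390 defects -- nearly nine times as many.
print(round(0.71 * loc / 1000))             # 6390
```

The point of the comparison is that the gap is multiplicative: at a fixed codebase size, the defect count scales linearly with the density figure.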

LibreOffice is an outlier, an extreme example of clean, defect-free code, but it also fits into a larger trend. Since the publication of its 2013 Coverity Scan Open Source Report, Coverity has asserted that open source code quality now outpaces that of proprietary code. While an open source license doesn’t guarantee quality, it does allow quality to be evaluated and encourages collaborative efforts toward improvement. That’s why you can expect the trend Coverity has noticed to continue over the coming years.

Read Simon’s full article on InfoWorld.

CloudStack success not tied to Citrix

Since the news of Citrix’s recent shake-up around its CloudStack business, some have been inclined to predict a host of negative consequences for the Apache project. At the other end of the spectrum, Giles Sirett, a PMC member in the Apache CloudStack project, claims “they [Citrix] have no ‘role.’ CloudStack is driven mainly by its users.”

To see a change in focus at Citrix as the end of CloudStack is to disregard the fact that CloudStack is a user-driven community which brilliantly models what Apache does best: providing a neutral space where many users of a code base can come together to quietly and effectively collaborate. Simply donating the original code to open source doesn’t mean that Citrix has control over the project as it is today. With its impressive user list, CloudStack will continue to thrive whatever Citrix’s involvement.

For more on this topic, check out Simon’s article on InfoWorld.

Bringing LibreOffice to Android

At the LibreOffice Conference, The Document Foundation issued a tender document seeking bids to develop an Android implementation of LibreOffice. Could this influx of money affect the ethics and work ethic of today’s open source community? It’s not actually the first time the community has experimented with bringing a LibreOffice editor to Android. The scale of the task, however, is enough to dampen volunteer enthusiasm, and given the lack of commercial motivation for engagement, the disappointingly low volunteer turnout is not a great surprise.

When it comes to money, The Document Foundation is faced with the mixed blessing of having plenty available. In what way is that blessing mixed? As a charity, the foundation is legally required to spend those funds over the course of the following year or so. With the clock moving swiftly on, The Document Foundation has already invested in development infrastructure for testing, the backing of community activities, and the hiring of sysadmin and administrative staff. There’s still a sizeable portion left over, though. Knowing that spending in areas where the community is already intrinsically motivated might well reduce contributions, TDF has decided to focus the remaining funds on development of the Android port, hoping to bootstrap a necessary new community in the process. Now it simply remains to be seen whether anyone will bid to do the work!

Read Simon’s full coverage in his InfoWorld article.

Alice kills trolls

The effects of Alice Corp. v. CLS Bank are beginning to be seen, and they highlight the landmark status of the case. The Court of Appeals for the Federal Circuit, previously considered by many to be strongly pro-patent, has now used the Alice decision to resolve large numbers of patent cases by finding the software patents in question invalid. In fact, even lower courts are beginning to use Alice to strike down patent cases, declaring the patents invalid for claiming unpatentable subject matter.

Of course, the patent trolls are not going to be caught sleeping. Conceding that Alice invalidates many of their cases, pro-patent advocates will now begin hunting for ways to get around the ruling. As software patent consultant Bob Zeidman said, “every time there’s a court ruling it just means that you have to word the patent claims differently.” This might well mean adding function claims to patents, a move which also significantly limits their disruptive power. Perhaps that’s a compromise open source developers could live with, but let’s not let our guard down. In the meantime though, I propose a toast: “to Alice”.

For more on this topic, read Simon’s article on InfoWorld.

All your data are belong to us

If we store… data in a place with public access, it will eventually become public.

Simon’s conclusion might seem a little paranoid, but it seems there’s plenty in the news at the moment to back up his position. From the hacking of celebrity Apple iCloud accounts to the DEA using decades-old phone records (stored by AT&T) as a covert source in investigations, vulnerable data is in the headlines.

The NSA and GCHQ also come under the spotlight. Simon’s contention is that all intelligence agencies worth the name are now actively gathering all online information that can be taken without illegal action. That includes all email, instant messages, Web pages, and social media traffic, just for starters. The harvesting is justified by a whole series of explanations, definitions, and broad interpretations of legal doctrines, which enable such agencies to claim the information gathered is “public”.

Most of this data never gets looked at. Instead it’s cached in gigantic data lakes. According to the NSA’s legal advisers, “wiretapping” or “hacking” starts only at the point a human being actually analyses or interprets the data. Of course, given the availability of the data, as soon as legal permission is given, tools like the NSA’s XKeyscore let them fish the data lake for relevant information. Since you can’t dissociate data gathering from data usage, we all need to account for all the costs of putting information online, not just the ones associated with our primary goals. Don’t just think about what your data is being used for now; think about what it could be used for once it’s no longer in your hands.

Read Simon’s full article in InfoWorld.

An Interview with Simon

After discussing a little history (some of the things that have brought Simon to where he is today), Simon’s interview for Australian Science mostly concentrates on his role at OSI and the work of the Open Rights Group. Catch up on some of the things he’s involved with at the moment, as well as some insight into institutions with an anti-open source bias, by reading the full interview.

Good News Roundup Podcast

The effects of the Alice v. CLS Bank Supreme Court case have been felt again, this time in the Court of Appeals for the Federal Circuit’s recent Digitech case. The court decided not to even consider infringement, as the image processing software in question was deemed not to be a significant improvement to the computer, but merely a computer implementing a non‑patent‑eligible technique.

On an entirely separate but equally positive note, last week the UK government announced that henceforth it will be using an open document format as its standard. To hear (or read) more detail and insight on both these stories, check out Simon’s recent podcast with Red Hat cloud evangelist Gordon Haff.

Formal certification for open source projects, is this progress?

By announcing its new certification process for Linux professionals at LinuxCon, The Linux Foundation made its pro-certification stance pretty clear. It’s not the only open source foundation endorsing peer-verified certification as an effective and useful way for those outside a community to place their trust in an individual’s community credentials. The Document Foundation also offers a certification scheme, in its case for LibreOffice migration professionals.

The two qualifications use slightly different procedures to assess candidates, but the outcome is a similar endorsement of community-recognised skills. How many other projects might be a good fit for this sort of certification? Should this become a more widespread practice? There are some obvious benefits: for a start, it gives those outside the community a concrete criterion to use when making hiring decisions. Both certifications appear to have made an impact in their respective fields, with the TDF certification already a requirement in some recruitment exercises and The Linux Foundation’s introductory $50 certification offer already sold out.

For more details about both certifications as well as more detailed discussion of potential criteria for new qualifications, see Simon’s InfoWorld article.