Interview with Jim Zemlin, Linux Foundation

The Linux Foundation was set up in 2007, and initially occupied a rather marginal position in the open source ecosystem. That’s changed more recently, and the Linux Foundation has become one of the most important forces defending and promoting Linux and associated free software.
The person largely behind that transformation, and a well-known voice in his own right, is Jim Zemlin. During LinuxCon Europe, which took place in Prague at the end of October, I was able to interview Zemlin at some length about his own career and the larger open source world (disclosure: the Linux Foundation paid for my day-return trip).
I started off by asking him about his own background, and why he decided to take on the job of Executive Director at the Linux Foundation, which he joined after working at the pioneering open source company Covalent.
“Covalent was a company that was commercialising open source software fairly early in that cycle. I was previously at Corio, an ASP applications software company, which went public in the summer of 2000, just before the bubble burst. I left the firm at the IPO and started working at Covalent, and got hooked on the concept of open source software. Throughout, Covalent worked with a lot of guys from the Apache Software Foundation, so I got to know people from the Linux community.
After that, I started working with the Free Standards Group [FSG]. I had been approached about trying to standardise the different versions of Linux in order to create a broader application ecosystem, primarily for servers but also for the desktop. This was one of those things where you thought: OK, I’m not going to make any money here, but it’s interesting to pursue in terms of the impact on computing if it were successful. That grew into what today is the Linux Standard Base.”
The Linux Standard Base (LSB) was a key area back when people were worried about fragmentation in the Linux world (I dread to think what they would have said about today’s Android). Here’s the background from the Linux Foundation’s Web pages on the subject:
When targeting Linux as a platform, application developers want to have some assurance that the code they write on one Linux distribution will run on other Linux distributions without having to go through extra effort. This matches their experiences on other popular platforms, such as Windows or Mac OS X.
In addition, application developers want to ensure that the platform as a whole does not diverge. Even if an application works on today’s distributions, will it work on tomorrow’s?
The LSB workgroup has, as its core goal, to address these two concerns. We publish a standard that describes the minimum set of APIs a distribution must support, in consultation with the major distribution vendors. We also provide tests and tools which measure support for the standard, and enable application developers to target the common set. Finally, through our testing work, we seek to prevent unnecessary divergence between the distributions.
Important stuff, if rather forgotten these days.
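For readers who have never run into the LSB in practice, here is a minimal illustrative sketch (mine, not the Linux Foundation’s): a small Python script that asks a distribution what LSB metadata it declares about itself, via the lsb_release command that the standard itself specifies. On many modern or minimal installs the command is simply not present, which the sketch allows for.

    import subprocess

    def lsb_info() -> str:
        """Return the LSB metadata a distribution declares about itself.

        Relies on the lsb_release command; it may be missing on systems
        that no longer ship LSB support, hence the fallback message.
        """
        try:
            # -a prints distributor, description, release and codename,
            # plus an "LSB Version" line listing the modules the distro claims.
            result = subprocess.run(
                ["lsb_release", "-a"],
                capture_output=True, text=True, check=True,
            )
            return result.stdout
        except (FileNotFoundError, subprocess.CalledProcessError):
            return "lsb_release not available on this system"

    if __name__ == "__main__":
        print(lsb_info())

The workgroup’s real tooling went well beyond this, of course; the point of the sketch is just the idea of a declared, testable baseline that application developers can target.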
Zemlin continues:
“Then through a series of different transactions we [the FSG] merged with an organisation that was called OSDL [Open Source Development Labs] and several other organisations [to form the Linux Foundation]. So my initial interest was through that, and as the Linux Foundation grew and matured this became pretty fun; the Linux community became something that exceeded my expectations, and they became my friends.”
Zemlin has a simple set of rules for guiding the Linux Foundation:
“The way we think about what we are doing is to ask three questions. One, are we significantly helping move the needle on the adoption of Linux or Linux-related technology in a variety of industries? That could be server computing or supercomputing or embedded systems.
The second question is: is anybody else doing this? If there is already a market mechanism or a pre-existing consortium or some other initiative that is succeeding in moving the needle on this particular aspect of Linux, why would we want to do it? It’s already being done. Which sort of begs the question: do you need a consortium to do this, do people need to cooperate to do this?
The third question is, can we get the resources to do this – do we have the skills, do we have the knowledge, do we have the financial resources to do this particular effort?”
Despite its promotional activities, Zemlin sees the central role of the Linux Foundation as very practical:
“My sense is that [companies] understand that this idea of the Internet of things is not just a bunch of hype, that a picture frame is at some point going to have a chip in it and be connected to a network, and this light bulb will, and all sorts of different things are going to be successful that way. They don’t quite know what the next iPhone is going to be, or what the next great digital picture frame is going to look like, but they have a pretty good idea that it’s going to be made up of the Linux kernel and some set of libraries that are related to Linux – unless it’s coming from Apple, Microsoft or RIM.
So if we could make that a little easier – the understanding of how to comply with the licences, the ability to have efficient build systems, the efficiency of having a common kernel – I think we would have done our job.”
Amidst all this positivity, I had to bring up MeeGo/Tizen, which has been one of the less glorious episodes in the open source world recently. I asked whether it was really a good thing that every year we seemed to have a new incarnation of this project – but that it never really seemed to go anywhere.
“First of all, I have the fortunate position of being able to say that a new one every year is not necessarily a bad thing for Linux. If you look at each of these projects in succession, do they add to the suitability of Linux in mobile devices? Are they abject failures from a market perspective? They might be, but do they add to Amazon’s Kindle Fire? Yeah, they do: they make the kernel better for it, they make the libraries better. So I don’t view these as abject failures.
Nor do I believe that the fat lady has sung in the mobile marketplace: it’s hypercompetitive. I don’t see the success of one of these efforts as mutually exclusive of another. In other words, I don’t think that MeeGo and now Tizen mean that Android needs to be a failure and that they compete. To a large degree a lot of those efforts are dependent upon a configuration of market players who have reasonable critical mass, the speed at which they can execute on getting code into the hands of developers, and organisations who can produce real products with it. Each of those iterations can build on the next.”
Those were fair points, but I wondered about the poor developers on the ground who are being forced to invest time and energy in one platform, only to have it discarded and replaced by another.
“When you enter an ecosystem you are entering into an implicit futures contract with that ecosystem, and you inherit the risk associated with that. You have to look at what configuration of players is there, what the technology is in terms of how sustainable it will be for the future, and do I think it has a reasonable shot at making a go of it? And you make your bets. The investment in these projects is not trivial, so I do think it’s reasonable for developers to be somewhat confident given the scale of the investment.”
Zemlin went on to make an important point about the role of failure:
“I think communities grow, evolve, learn lessons from mistakes. I’m from Silicon Valley; a CEO who fails is not a scarlet letter person. They will say: that person is better now, they’ve learned their lessons. Every failure of Linux on the desktop each year has made Linux on the desktop better. If you use a modern Linux desktop, it’s pretty good, and that’s the result of a lot of risk-taking over a lot of time.”
In the second part of this interview, Zemlin talks about Linux’s key advantages, and the state of open source in Asia.
Source: ComputerWorld
