Livin’ la vida SaaS

I bit the bullet: effective today, we are giving S3 a spin as our next-gen backup solution. The experience so far has been quite impressive: it took a couple of minutes to set up an account, download a couple of nice tools and start backing up stuff. I must admit that doing backups in an era of virtual machines and LVM snapshots is much easier than it used to be, and having virtually unlimited backup media at your fingertips can really make a difference.
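
For the curious, the whole recipe fits in a handful of lines. Here is a minimal sketch of the idea, assuming the boto3 Python bindings and made-up volume, mount point and bucket names (a real script would want retries and proper error handling):

```python
#!/usr/bin/env python3
"""Minimal sketch: snapshot a live LVM volume and ship it to S3.
Volume, mount point and bucket names are made up; adjust to taste."""
import os
import subprocess
import boto3

VG, LV, SNAP = "vg0", "data", "data-snap"             # hypothetical LVM names
BUCKET, KEY = "acme-backups", "data-20070801.tar.gz"  # hypothetical S3 names

# 1. Take a consistent point-in-time snapshot of the running volume
subprocess.run(["lvcreate", "--snapshot", "--size", "1G",
                "--name", SNAP, f"/dev/{VG}/{LV}"], check=True)
os.makedirs("/mnt/snap", exist_ok=True)
subprocess.run(["mount", f"/dev/{VG}/{SNAP}", "/mnt/snap"], check=True)

try:
    # 2. Archive the frozen snapshot while the real volume keeps serving
    subprocess.run(["tar", "czf", "/tmp/backup.tar.gz",
                    "-C", "/mnt/snap", "."], check=True)
    # 3. Push the archive to S3 (credentials come from the environment)
    boto3.client("s3").upload_file("/tmp/backup.tar.gz", BUCKET, KEY)
finally:
    # 4. Clean up: unmount and drop the snapshot
    subprocess.run(["umount", "/mnt/snap"], check=True)
    subprocess.run(["lvremove", "-f", f"/dev/{VG}/{SNAP}"], check=True)
```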

There are still a lot of i’s to dot and t’s to cross indeed, yet using S3 for backups seems to be very promising. My cost overguesstimate, including bandwidth for incremental backups and the occasional restore, lands around €2/GiB/year, which is definitely not bad for an off-site, fully online, secure, serviced and completely outsourced solution. Assuming we get past the apparent bandwidth limitations (we’re seeing peaks of 3Mbps, but it takes a few parallel backup processes to get there), I see no reason not to provide every Sourcesense employee with an S3 account: no more excuses for failing hard drives, and no more rushes to the stationery cupboard to grab the last DVD left in the office.
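
For those who want to check my overguesstimate, the back-of-the-envelope math goes roughly like this (a sketch where every price and the exchange rate are assumptions of mine; do check Amazon’s price list before trusting any of it):

```python
# Back-of-the-envelope cost per stored GiB per year; every figure is assumed.
storage_usd_gib_month = 0.15   # assumed S3 storage price
transfer_usd_gib      = 0.15   # assumed blended upload/download price
gib_moved_per_stored  = 2      # assumed incremental traffic plus the odd restore
usd_per_eur           = 1.35   # assumed exchange rate of the day

total_usd = storage_usd_gib_month * 12 + transfer_usd_gib * gib_moved_per_stored
print(f"~EUR {total_usd / usd_per_eur:.2f}/GiB/year")   # ~1.6, rounded up to 2
```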

True enough, it’s not like we have stellar requirements when it comes to backups: all we need is storage, and we’ll do the scripts. We don’t have massive amounts of data, we don’t have convoluted applications requiring cumbersome hot backup procedures and tools, and we most definitely don’t see ourselves getting there any time soon, which means S3 fits the bill just fine. I wouldn’t suggest it to a bank or a telco just now, but I do have the feeling it could really be a viable solution for a lot of SMEs and/or vertical scenarios.

While we’re at it, let’s not forget the Open Source side of SaaS. The good news is that the sky is not falling, and there is still a lot of room for hacking: just look at the impressive list of available S3 tools. Building such an amazing plethora of software didn’t require any knowledge of the Amazon internals: we don’t need to know whether Amazon is using some custom Linux storage driver or just moving disks around using elves and hobbits, just as there has never been a real need to know how a chipset is designed in order to hack around it. Open Source just needs APIs and documentation (access to information, that is) to thrive: luckily enough, it seems there are very few Nvidias and Broadcoms in the SaaS world. We’re in for some serious fun!
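
The point is easy to demonstrate: everything a client needs is spelled out in the public API docs. A minimal sketch of an authenticated download, with the 2007-era “v2” request signing done by hand (bucket, key and credentials are hypothetical; error handling omitted):

```python
"""Talking to S3 with nothing but the documented REST API: a minimal
sketch of an authenticated GET, signing the request by hand.
Bucket, key and credentials are hypothetical; error handling omitted."""
import base64, hashlib, hmac, urllib.request
from datetime import datetime, timezone

ACCESS_KEY, SECRET_KEY = "AKIAEXAMPLE", "secret"      # hypothetical credentials
BUCKET, KEY = "acme-backups", "data-20070801.tar.gz"  # hypothetical names

date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
# Sign exactly what the docs prescribe: verb, MD5, type, date, resource
string_to_sign = f"GET\n\n\n{date}\n/{BUCKET}/{KEY}"
signature = base64.b64encode(
    hmac.new(SECRET_KEY.encode(), string_to_sign.encode(), hashlib.sha1)
    .digest()).decode()

req = urllib.request.Request(
    f"https://{BUCKET}.s3.amazonaws.com/{KEY}",
    headers={"Date": date, "Authorization": f"AWS {ACCESS_KEY}:{signature}"})
with urllib.request.urlopen(req) as resp, open("restored.tar.gz", "wb") as out:
    out.write(resp.read())
```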

Let there be badgeware. And more.

It’s official now: the OSI has approved the Common Public Attribution License. Badgeware is now a legitimate citizen of the Open Source ecosystem.

I’m not going to reiterate my rants and restate why this is bad in so many ways. I just have to note how the official introduction of badgeware is going to make life difficult for many of us, as the current wording is far from clear, it’s redundant, and it brings a lot of hassle when it comes to compliance. Just go and read section #14 to get an idea of what I’m talking about: I dare anyone to figure out the hairy mess of borderline situations that will arise.

Unfortunately, when it rains it pours. As if attribution wasn’t enough, the CPAL, with a stroke of evil genius which reminds me of a story of ancient Greeks, a siege and a wooden horse, includes a mostly unnoticed proviso for the dreaded SaaS loophole. Yes, you read it right: the OSI not only gave a green light to attribution, they are even endorsing network use as distribution now, something that even the FSF has been reluctant to do, and something that I can’t help but read as a violation of Freedoms 0 and 1 from the Free Software Definition, not to mention criterion #6 of the OSD. Again, I see no point in going over and over what I’ve been muttering about when it comes to AGPL and friends, but I just have to point out how the OSI managed to shove down our throats the two major issues that were on the table. All this with the usual public discussion and process: none whatsoever, that is.

Sad, sad day for Open Source. By approving the CPAL, the OSI is sanctioning what I like to call defensive Open Source, that is, Open Source seen as a necessary evil by companies realizing that the only way to get to the market with software products nowadays is by using the Open Source marketing weapon; yet also something that needs to be limited so that it doesn’t threaten traditional business models. Get the free marketing ride and forget about the rest. What we see behind this Open Source is cathedrals instead of bazaars, code dumps instead of communities, booby traps instead of participatory processes, upselling instead of leveraging and differentiating, Antarctica instead of the Rain Forest.

Oh well. The JCP is almost gone. ISO is supposedly for sale. The OSI is apparently bending to the powers that be. Am I the only one seeing a problem?

The Sunday post: Harry Potter pizza

Doing good stuff properly requires time and dedication. That’s the case for pizza: besides being a bit about chemistry and alchemy (as with all things baked), it entails quite a bit of time for proper processing. Twenty hours, to be more precise, which is, by the way, the time it took me to read the latest Harry Potter novel.

Let me start by admitting that homemade pizza is sort of a lost cause: very few home ovens are able to reach the 350°C required to cook the dough without burning the mozzarella on top, not to mention that a proper wood oven makes all the difference in flavour. This said, there are a few tricks that make homemade pizza a great experience altogether: you won’t beat some of the best pizza restaurants in Naples, but you will still be able to impress your friends and have a great meal.

The first ingredient you need, and the hardest to find, is time. Pizza is all about dough, and that’s a time-consuming job that knows no shortcuts. The good news is that there’s no need for a lot of effort: you will be all set with little more than an hour of overall work, but you will need to spread that time over nearly a full day. Which is why having a weekend with a good book to read makes the perfect match. You might wonder why it takes so long to just mix some flour and water, especially now that we have industrial yeast and self-raising stuff: just know that the difference between your homemade bread and the fragrance of a professional bakery is all in the process. The not-so-hidden secret of proper baking is called pre-fermentation, which (roughly speaking) translates to providing the smallest possible quantity of yeast with a suitable environment to perform at its best. Roughly speaking again, yeast and time are reciprocal: the more time you have, the less yeast you will need. And you really want to have the least possible yeast in your mixture, as that will buy you better flavour, richer texture, and easier digestion. Just give it a try: I’m sure you will see the difference.

Back to our dough, now. There are many ways to achieve pre-fermentation: what I prefer is the Poolish method, that is, a liquid mixture of equal quantities of flour and water, with some yeast added on top. For my pizza dough I started yesterday at 10PM by dissolving between 1 and 3g of fresh yeast (use 2/3 of that if all you’ve got is the dried variant) in 0.5l of water. Yes, that’s an impressively low quantity of yeast, and no, I didn’t miss a zero: you really don’t need much raising powder for this magic to happen, which is what the Poolish method is all about. Once I was dead sure that all the yeast had dissolved, I threw in 500g of sifted strong bread flour (we call it “Manitoba” over here) and started mixing. It took me just a few minutes to obtain a very sticky and fluid mixture, which needs to be covered with a towel or anything else that allows the mixture to “breathe” (avoid tin foil and cling film). I called it a night and started wandering in Harry Potter land.

Today being Sunday, I was in for a late start, given also that my book got me hooked until very late at night. That was OK anyway, since it roughly takes from 12 to 16 hours for the mix to grow (it mainly depends on the temperature; just know that the whole thing is ready when it looks like a not-so-inviting bubbly mess, roughly 2-3 times the size of the initial mixture): in my case, pre-fermentation was over around noon, and it was time for some late morning exercise, that is, preparing 300g of white durum wheat flour, 100g of semolina (both sifted), 25g of extra virgin olive oil and 25g of salt. It’s important to sieve the flour carefully and add it to the mixture two or three spoonfuls at a time, mixing it in before adding more; add the olive oil and salt before getting on to some serious kneading. Kneading isn’t about sparing effort: it usually takes me 30 to 40 minutes to end up with an elastic silky paste that leaves me proud and satisfied, even though a bit tired. Shaping the dough into a ball and covering it with a wet towel was the last bit of action before getting on to a well deserved lunch treat (a great fresh mozzarella, some green salad and bresaola, in case you were curious).

After an hour, lunch was over, the dough had risen quite a bit, and I wanted to get back to my book. I spent 10 minutes splitting the paste into four smaller balls and whacking them in the refrigerator for another two-three hours of rising at a cold temperature. Roughly 2.5 hours before dinner, I took a very short break from Hogwarts tales, taking the dough out of the fridge for the final two hours of maturation (summing it up, that is 12-16 hours of pre-fermentation, one hour of initial rising, three hours in the fridge and the final two back on the kneading board: 18 to 22 hours overall, which is quite a bit but still much less than it takes for a proper Polyjuice potion). The final two hours were all I needed to finish reading the book and find out how Harry Potter’s tale ends: the oven was pre-heated to maximum temperature, and it was time to shape the balls of dough into pizzas. This is incredibly easy if the paste has been done properly: a couple of slaps, a bit of cool-looking juggling, absolutely no rolling pin (that’s an Unforgivable Curse in pizza land), and the dish-shaped paste is ready to land in a proper pan, which needs nothing more than a sprinkle of flour on the bottom to avoid sticking. Filling is next: my personal choice for homemade pizza is very, very simple, just a bit of tomato sauce (lightly seasoned with some olive oil, salt, pepper, and oregano) and enough mozzarella cheese to coarsely blanket the surface. A drizzle of olive oil, a couple of basil leaves, and it’s time to whack everything in the oven. 15 minutes will be more than enough in my not-so-powerful oven to end up with a nice thin pizza with a thick crust (as it should be), which makes for a rewarding Sunday dinner. I just wish I had Butterbeer to go with it.

Competitive puberty in Open Source land

Facts: a well known player in the Open Source ESB space posts a few benchmarks against competing OSS projects. A prominent blogger, whose company is developing one of those ESBs, suggests that Open Source companies shouldn’t compete amongst themselves, as there are plenty of proprietary alternatives to target first. Matt agrees, and Loopfuse’s Roy Russo goes the extra mile, describing the OSS business model as inherently monopolistic: when a player enters a market sector with an Open Source alternative, the barrier to entry is raised so much that it leaves little to no space for others.

My take: benchmarks stink, to start with. On my personal lie detector, they sit somewhere between seasoned politicians and statistics. Competition by benchmarking is shortsighted to say the least: find me a CIO who considers vendor-driven benchmarks a selling point, and I will easily convince him to buy Rome’s Colosseum. Extra brownie points go to Dave for avoiding the numbers food fight and moving the conversation to the next level.

Having said that, I find myself in significant disagreement with the rest of the conversation. I see competition among Open Source companies as a healthy sign of growth, and a necessary step to a consolidated and level marketplace. Acne and perspiration might be bothersome, but they both lead to adulthood and maturity. Open Source needs to understand that there are no special provisions in the IT marketplace: we must play by the rules, and it’s a wild world out there. Luckily so.

Early commercial Open Source has been based on substitute competition: building alternatives for a commoditized marketplace and avoiding direct confrontation with competitors by providing a different value proposition for substitutes. The commercial Open Source proposition has been margarine for butter so far (which is an unfortunate example, actually: the way I see software history, Open Source – butter, that is – was there first, then the industrial alternative – proprietary margarine – kicked in, pitching a supposedly healthier and more modern alternative. We are now rediscovering that butter wasn’t so bad to start with. But I digress). The problem with substitute competition is that it’s short-lived by design: it doesn’t take much for others to see an opportunity window and chime in, moving markets to the next logical step, that is, direct competition.

By the way, this is why the Open Source model is far from monopolistic: it’s true that the first player gets a nice headstart and a notable free marketing ride, but this is just another variation on the “yet another” concept, a myth easily debunked since all it takes to recover is being better, not to mention that pioneers might end up getting all the arrows. Most of the prominent Open Source players of today have actually been followers back then: old farts like me might remember how Yggdrasil was there before Red Hat, MySQL started as a spin-off of mSQL (OK, not quite Open Source), and we had several Exchange replacements before Zimbra was even conceived.

The field is level: the sooner we realize it, the better we will be able to react. Commercial vendors are entering the Open Source space with mixed propositions and hybrids: there is no point in complaining about competition amongst Open Source companies as if it were an “us” vs. “them” game. It’s not, anymore. I can clearly see a case for co-opetition in the commercial Open Source circle: fair play and joint initiatives can play an important role, but this is just a tactical perspective. What we need to do is focus on strategic innovation: substitute marketing and budget competition are gone for good, and this is excellent news in perspective. We are growing!

My 0.2 eurocents on SaaS

I was there some eight years ago, when ASP was hyped as the Next Big Thing: I saw it fail horribly, I witnessed entire datacenters going belly-up and I danced on the ashes of what was back then a solution looking for a problem, with a wrong proposition and a sheer cultural and infrastructural gap to overcome. Fast forward almost a decade: we are now confronted with ASP v2.0, known by the sexier name of SaaS, and I’m convinced this could just be the right time for a huge success. I can see at least three factors that will drive SaaS to a major breakthrough:

  • Much better connectivity and technology. High-speed Internet access is a reality now, Moore’s law has moved from CPUs to bandwidth, and connectivity is only going to get better (modulo issues of network neutrality, of course). Being online is the default case by now, whereas just a few years ago it was stuff for early adopters. Meanwhile, the browser has become a real platform, with some interesting cross-pollination from the field of Rich Internet Applications and the like. I know I’m stating the obvious, but still this is a major change if you compare it to the Citrix-based ASP offerings we used to see.
  • The business case for real on-demand, scalable, managed and secure IT solutions. Sure, there is a lot of inertia to get the ball rolling, but once the fear of giving data away is gone (see below), there is no real blocker for SaaS to get huge. There are just too many IT scenarios that would be much better served by a service-based solution: it’s just a matter of time before enterprises realize that there is little use in maintaining complex hardware and software infrastructures with little value to the business side. If you disagree, chances are you would have been in the camp that, just a few years ago, didn’t believe organizations would move their servers to colocation facilities, or outsource key business processes to India.
  • The huge cultural shift. The whole concept of data and applications is getting ethereal, bringing an entirely new view of software. SaaS has won by far in our private lives: we are getting less and less software on our hard drives, and we are slowly moving data outside as well. Our relationships happen on social networks, and even our beloved telly is going away. Where is the software? Do we care anymore? In Geir‘s words, do we know or bother about what version of Amazon we are using? And it gets even better: the next generation will have a much different mindset than ours. Youngsters have no idea of where their data are, they use IM much more than email and PCs much more than television. We’re in for a big change once those guys hit the job market.

Just for completeness’ sake, let me add a fourth and somewhat far-fetched ingredient to the mix: the computing industry has a huge carbon footprint, and energy consumption is becoming a major cost for enterprises. It’s easy to imagine that sooner or later some hard decisions will be tabled, such as computing carbon taxes and, in general, increased energy charges. I’m not saying this is going to happen overnight, but I can clearly see how centralized datacenters with optimal load distribution and smart power generation facilities could do a lot to ease carbon emissions and avoid additional pollution costs.

SaaS will bring a cultural shift to the IT industry, and Open Source will be no exception. There will be good news and bad news, but the outlook looks interesting indeed. It will take a very long post (actually several) to try and figure out even just the major implications and challenges, but given how SaaS is going to affect our lives, this is a debate worth having. More later.

The Sunday post: summertime cooking

I’ve been prodded a number of times into getting back to cooking blogs, and who am I not to oblige? Let me start with a clarification, as in the past few months a number of events have kept me away from the kitchen: first of all, my Sundays have been busy with the golfing season. Golf is a strange beast: you walk for 8 miles, and when you get home either you’re just too mad about your poor performance to enjoy anything other than some inventive cursing, or you had the perfect golfing day and it’s time to dine out and celebrate. On top of this, I also started having some back problems that pushed me into some diet attempts, and I thought you wouldn’t be that interested in unseasoned tomato salads and skimmed yogurt.

Last but not least, summer kicked in: even though we haven’t quite experienced a steaming hot climate so far, summertime still tends to keep me away from serious cooking. Ingredients are just too fresh to justify any convoluted recipe, and anyway the last thing you want to do on a scorching summer day is spend time in an overheated kitchen, sweating over a steamy risotto or burning your fingers on some chunky piece of meat. This doesn’t mean I’m not eating: I’m just trying to stay away from the stove, sticking to nice fresh stuff as much as I can and limiting my exposure to anything warmer than an iced tea.

This is why, during the summer, my oven comes to the rescue: baking doesn’t require physical presence, it’s just a matter of slicing, dicing, seasoning, whacking everything in the oven and sipping a freshener while the roaster does the rest. My typical weekends are now mostly about grabbing some fresh food and doing whatever I can to find the shortest path from the grocery bag to the dining plate. I am still cooking, though, and I do have something to share: given that I have to apologize for the long hiatus, this post will contain not one but two suggestions for your summertime meals! How’s that for a deal?

The first proposal is a simplified rendition of a Neapolitan classic, and it’s actually just a starting point to get your imagination in motion, as this is a dish that can easily be adapted to whatever your taste is and whatever your refrigerator contains. We call it “Gateau di patate” (some misspell the French word and use “Gattò” instead), and it’s basically an easy-peasy mashed potato pie. Start by boiling some spuds (remember to leave the skin on and, yes, if you’re brave enough, you could just bake them, as I suggested in a previous recipe). Mash them coarsely, and don’t be overzealous: a few small lumps here and there actually help the texture. Add a handful of Parmesan cheese and season to taste with salt and pepper. Throw in a whole egg and two or three spoonfuls of white flour, carefully mixing it all together.

Get a plum cake mold now. Smear it with olive oil, then add some breadcrumbs. Shake the mold to ensure that the breadcrumbs stick to the bottom and walls, then grab your mashed potatoes, a glass of water and a spoon. Fill the bottom half of the mold, helping yourself with the back of the spoon, which you should keep continuously wet so that the potatoes don’t stick: keep in mind that if you move the mixture too much, the breadcrumbs will just get into the mix instead of forming a nice crust on the outside, so work your way with care. Try to carve some room in the bottom half, as the next step is adding some filling.

The golden rule with filling is “use your imagination”: our favorite is thick-sliced ham, slightly toasted in a frying pan to make it crispy, diced mozzarella cheese and some Parmesan, but don’t let our preferences hold you back. This stuff can be excellent with salami, chorizo, bacon, Emmental, Gorgonzola or goat cheese, and the mashed potatoes are just an excuse to enjoy your creativity. My suggestion would be to always have a mix of meat and cheese, but of course YMMV.

Once you’re done with the filling, just throw in the remaining mashed potatoes, level the top and add some breadcrumbs and grated cheese, which will form a nice crust. Whack everything in the pre-heated oven (180°C) for 20-30 minutes, or until golden brown. Leave it to rest for at least 20 minutes before extracting your gateau from the mold. Cut it into thick slices and serve it with some green salad. Extra kick if you’re lucky enough to have leftovers: get a non-stick pan and a drizzle of oil, and fry the remaining slices over a low heat. Serve them with some warm goat cheese, some nice peppery rocket salad, and a chilled glass of white wine: the world will be a much better place.

The second suggestion is perfect for a summertime dinner on a working day, when you come home exhausted from work and you need something fresh, quick and tasty. A typical summer dish in Italy is carpaccio, that is, thin-sliced raw meat (or fish), seasoned with lemon juice or vinegar, which performs a sort of cooking by acidity. The problem with carpaccio is that you really need to trust your butcher: health issues aside, raw meat needs to be just perfect to be enjoyable, and in these times of grocery stores and supermarkets, finding a great chunk of meat is getting increasingly harder. This is why I find this variation an excellent alternative, as it requires just a little bit of cooking, which is more than enough to guarantee an outstanding result in a few minutes.

The basic ingredient for this dish is (surprise!) carpaccio meat, which might not be that easy to find. If your local store doesn’t know how to prepare it, ask them to get a chunk of lean meat (fillet will be excellent, but you can use something cheaper like thick flank as well) and slice it as thin as a bacon strip or prosciutto. Of course there’s nothing preventing you from slicing it up yourself: a good trick, even if it will somewhat compromise the taste, is throwing your chunk of meat in the freezer for a couple of hours, so that it’s firmer and easier to handle. Finally, you can make this dish with thicker meat as well: just adjust the cooking times accordingly.

Start by pre-heating your oven at the maximum possible temperature, moving the tray as high as you possibly can. Now cut a good portion of ripe cherry tomatoes: depending on their size, you might want to slice them into halves or quarters. Crush a couple of garlic cloves and toss them into a bowl with the tomatoes, some excellent olive oil, and possibly some rosemary and oregano or thyme. Season with salt and pepper (hold the vinegar), and leave everything to rest for a couple of minutes.

Get a large baking pan now, big enough to hold your meat in just one layer. A pizza pan is usually just perfect, given that you want to have your meat as close to the grill as possible. Drizzle a bit of olive oil, and start coating your pan with the meat. Add the tomatoes on top, switch your oven to grill mode, and whack everything in. Mind you, cooking is going to be fast: two or three minutes are usually more than enough to heat the meat and have the tomatoes lose some precious juices. If you’re using a thicker cut of meat, chances are you’ll need more cooking time: in that case, hold the tomatoes until the final two minutes. Serve it right away, maybe throwing some fresh green salad on top, and get ready for a round of applause.

Have a laugh at us…

(Update: the Alexa graph should display correctly now)

… while I’ll be banging my head against the wall for a while.

Today, the Italian Government officially rejected a formal request for information from a gathering of professionals and citizens who would like to understand why we have spent a whopping $75M (yes, that’s millions) on italia.it, our official national gateway for tourism, launched with huge fanfare a few months ago. I’m not surprised: it would be highly embarrassing to explain why it took us the cost of a brand new Boeing 737-800 to come up with such an abominable example of poor web design, even poorer content and questionable usability.

I have to say that every time the stupid Italian portal makes the news, I get shivers down my spine and feel the urge to grab a pickaxe and start a revolt. Just consider how our beloved website (blue) is faring on Alexa when compared to the national tourist sites of Britain (red), France (yellow), Spain (green) and Switzerland (black):

(Yes, even Switzerland – with all due respect – is kicking our butts as of late)

My cardiologist is probably happy about my ignorance of German and Chinese, as I guess my blood pressure would jump to unforeseen levels if the current translations are on par with the following gem fresh from the English version of the site (emphasis mine):

At the first glimpse of sunshine you rush out to buy a new bathing costume? You instinctively spout poetry in front of Parmigiano Reggiano? In that case, you are truly in love with Italy. And that’s an excellent reason for registering on Italia.it. We’ll get to know you know you better and understand what you like best about our country.

(from http://www.italia.it/wps/portal/en_enroll)

Now, never mind my rants: please, do pay a visit to our portal. Given the sheer amount of money we wasted, er, invested, at the very least I would love to be paying less than $1 per page view. Given the numbers, I think it will take quite a while.

French cheese and Open Source

I was in Geneva last week, holding an Open Source seminar with a few selected IT directors of various UN organizations. Bright minds, needless to say, and definitely an intriguing day full of discussions and insights. Lecturing is a great way to learn, and I’ve been thinking a lot about the questions I was asked: one, in particular, got me thinking, even though I guess I’ve answered it a million times already. I was asked about a good value proposition for pitching Open Source to a non-IT CEO: a bread-and-butter question, but the way it was asked, the moment I was questioned about it, the body language or something else just started a flurry of thoughts that kept me busy for most of my way home.

While driving back to Milano, I called my wife and told her that, since I was returning through France, I would stop by a supermarket to buy some cheese. And this is when everything came together. I was going to stop at a randomly picked grocery store, shopping for cheese since, being in France, it had to be good. I had no particular address, no recommended shop, no idea of what I should be buying: I just trusted the idea of French cheese being, on average, good stuff. As I should have expected, my bag full of stinky matter turned out quite good overall, but I had some unpleasant surprises and a few well below-par bits of commercial matter.

I’m starting to think it’s the same for Open Source: just like French cheese, Italian shoes or German cars, what you’ve got is a concept, a perception, a general idea built upon hard facts and experience, creating allure and fascination; but at the end of the day you still have to do your math. Italian shoes are, generally speaking, very well known for their quality and design, but of course we have our share of poor manufacturers, not to mention counterfeits. German cars are by and large reliable mechanical masterpieces, but there are exceptions indeed. French cheese is usually a godsend, but sometimes it can be just some stinky rotten milk.

That’s strikingly similar to Open Source: as a development and distribution methodology, Open Source software has a wealth of experience and a track record along the lines of better oversight, increased standards compliance, less vendor lock-in and so on. The Open Source proposition is attractive and enticing, yet it’s not the Midas touch or a philosopher’s stone. Open Source will not and cannot turn bad software into excellent stuff, transform a proprietary and closed company into a good community citizen, or solve a customer’s IT problem just like magic; but, on average, it will be good stuff. Open Source is a German car, an Italian shoe or a wheel of French cheese. In most cases Open Source can be the best solution, just as expected. But do exercise your judgement, look for recommendations, make sure you’re not dealing with a counterfeit. In a word, be pragmatic. However, don’t resist the lure: sell it to your CEO.

Fighting windmills: Open Source and the public sector

A new research paper on Open Source and government is out, with a few notable comments from Matt and Roberto, among others. Interesting read, but I have to confess that as of late I feel increasingly tired and demotivated when it comes to OSS in the public sector. I’m really fed up with useless forges, forgotten observatories and hopeless committees going nowhere, and I don’t see how the situation is going to change in the near future.

This is ever so frustrating if you consider how, from the peanut gallery, Open Source looks like the best thing since sliced bread for governments: vendor lock-in reduction, better adherence to standards, protection of IT investments, optimization of costs. A killer proposition, yet it’s not taking off, apart from a few exceptions. Why is that?

Maybe we are faced with a cart-before-the-horse problem: instead of looking at what the goals for IT in government should be, most of the current research takes for granted that governments must use Open Source, and struggles to back up this claim. We should probably start asking a few basic questions instead, such as what we really want from IT in the public sector and how we can achieve it. I think that the basic answer to the above question is “more, with less”. We definitely want more and better IT, reducing duplication, providing information and making processes more efficient. As taxpayers, we want every single euro to be spent in the best possible way. It’s as simple as that: add some open standards on top to ensure citizens have equal access to government data, and Bob’s your uncle. Now, is Open Source the best way to achieve that? Short answer: no. Or at least it’s not enough.

Just look at the current e-government landscape: it’s not like there isn’t enough Open Source, it’s just that most Open Source out there, in all honesty, sucks (pardon my French). In Italy we have a plethora of Open Source projects, committees, focus groups, mailing lists and all that. Too bad they’re not talking to each other. The basic pattern is that most public tenders see a bespoke Open Source solution as the winner, which means something gets developed and dumped somewhere. No one ever picks it up for the next tender, because you just don’t build on your competitor’s work, so the next winner comes along with his own Open Source implementation of the same thing. And the wheel keeps on spinning.

I know I’m sounding like a broken record, but this is yet another proof of how Open Source without Open Development can be a hollow proposition with little value. For one, there is little chance of reuse with Open Source alone: if a solution is Open Source in the “code dump” sense of the word, with no openness to external participation, no neutrality and no diversity, it’s just a bunch of code and nothing more, and it will do no public good in terms of reuse and vendor lock-in reduction. Competitors will not contribute outside of a neutral environment (it took Apache to get Sun and IBM to exchange code and collaborate), and lack of diversity will make any project die a thermal death sooner or later.

Please note how I have carefully avoided the term “community” so far: as much as I strongly believe in communities, I also realize that you can’t just shove community participation down someone’s throat and expect it to work: expecting public sector communities to blossom just because Open Source is cool is a bit far-fetched, to say the least. Presuming that Open Source adoption in government will grow just because public employees participate in communities is just a daydream. What we need is more than a community: we’re desperate for a real ecosystem, run by the public sector, with clear participation guidelines under neutral terms, sound technical guidance focused on standards and interoperability, open development processes addressing neutrality and collaboration, and room for different value propositions when it comes to who gets to milk the government cash cow in the next tender. What we don’t need is preferential lanes for Open Source: all we want is for solutions to be measured against specific metrics such as reusability, interoperability and freedom from lock-in. Open Source will then be able to speak for itself and prove its worth in the marketplace.

But I guess we will have to make do with another bunch of going-nowhere committees instead.