Wednesday 26 October 2011

Unix on hypervisors, what have you done for us, lately?

Unix might now be considered a successful operating system, dominating the server-side market via Linux and increasingly present in people's pockets through Android and iOS.

Conceived at the birth of big timeshare systems, Unix contains many design decisions that were acceptable then but now seem somewhat archaic, especially when we compare them with how the system is actually used today. I'll focus on one specific aspect - multiple applications per Unix instance.

A very common (2011) paradigm is to deploy multiple Unix instances per hypervisor, with the hypervisor taking the lion's share of responsibility for scheduling and isolation. The irony is that these are the specific things Unix is supposed to be good at!

I personally lay the blame at the feet of cowardly system administrators unwilling to take a risk with an operating system they claim to love, retreating instead into the mystical protection of a hypervisor. Or at the feet of "best practice", which leads to avoiding anything that approaches taking adult responsibility for what happens on a machine.

A more charitable view is that the nature of workloads actually changed between the 1990s and the 2000s, as did the nature of the machines that ran them and expectations about the cost of those workloads. The fact remains that if we are now using purpose-built hypervisors to offload responsibility from the client OS, what is that client OS now doing?

As alternatives - BEA had a tilt at running Java straight on hypervisor-supervised virtual machines, IBM's VM operating system runs a very lightweight guest OS (CMS) on its hypervisor, and there's no shortage of smaller, lighter implementations of Unix-like OSes. Amazon via Elastic Beanstalk and Google through App Engine have shown us that we probably don't care in the end what is actually running underneath our apps.

I quite like aspects of the Unix architecture - but it's a general-purpose OS that's increasingly being used for a single purpose. If you can't trust it to isolate processes from each other and prioritise those processes, what can you trust it for?
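For what it's worth, the prioritisation and isolation in question are things stock Unix has shipped for decades. A minimal sketch of the circa-2011 toolbox (the jail path and daemon named below are illustrative, not a real deployment):

```shell
# A sketch of the Unix mechanisms for the two jobs we now delegate
# to hypervisors: prioritising processes and constraining them.

# Prioritise: run a CPU-bound job at the lowest scheduling priority,
# so it only consumes cycles nothing else wants.
nice -n 19 sh -c 'i=0; while [ "$i" -lt 1000 ]; do i=$((i+1)); done' &
wait $!

# Constrain: cap resources for subsequent commands in this shell.
ulimit -S -n 256    # soft limit: at most 256 open file descriptors

# Isolate (requires root, shown commented out): give a daemon its own
# filesystem root so a compromise can't see the rest of the machine.
# The path and binary here are hypothetical examples.
# chroot /srv/jail /usr/sbin/httpd
```

None of this needs a hypervisor - which is rather the point of the question above.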

So, Unix on hypervisors, what have you done for us, lately?

Monday 10 October 2011

Quality

I remember rather vividly a particularly poor university lecture on Quality. The lecturer was asking what constituted quality. At that stage I took (and still take) a fairly dim view of the unqualified use of the term quality, since it has a fairly specific meaning in traditional English, roughly akin to "posh". I pointed out that quality, to me, meant a brand like Rolls-Royce. I got laughed at, but you can see the consequences of the unqualified use of the term "quality" all around us in IT.

Quality control is meant to establish a minimum acceptable standard for a given item, with testing at a statistically determined frequency to gain a level of certainty that an item meets those standards. But that isn't what was discussed during the lecture - there was a lot of talk about items being expected to work all the time.

Certainly in the 15 years since then, I've seen quality usually equated with testing, be that manual testing, automated testing or unit test coverage. It is hard to argue that these activities aren't desirable in the production of high-quality code, but it's again somewhat missing the point.

The unqualified use of quality is the topic of Zen and the Art of Motorcycle Maintenance, which did come up in the lecture, though the lecturer confessed to not having read it. If he had, he would have been more open to my Rolls-Royce definition. Each of us has an intrinsic understanding of what quality is, and it extends past listing off a description of the properties of the product.

The most common example nowadays might be Apple products - you could take the same base set of components and build them into an Apple MacBook or a Windows-based laptop. Both will have passed quality control. Both have operating systems that generally work. Both will cost roughly the same amount. One will usually be described as being of higher quality than the other. Not everyone will agree on which one is which.

For a coarser example: given the same set of ingredients and the same recipe, two people may come up with altogether different results. The resulting dishes might be identical when described, but will have an intrinsic difference in terms of unqualified quality.

This leads us to the interesting case of an item that passes all its quality checks but is of undeniably poor quality - often it just doesn't feel "right". In an industry that loves measurable quantities (or at least professes to), this is a difficult message to get across. The question is then: how do we determine quality? I don't believe the answer can be found by spending more time on the work - calligraphy masters take their brush and sweep past the paper once to create a masterpiece. Nor is it to establish a hierarchy of masters and apprentices - the greatest examples of genius can come from outside formal structures.

Perhaps it's as simple as being open about our feelings as to quality, and knowing that by practising on things that aren't masterpieces, without obsessing over the detail of everything, we will be ready to do truly quality work when the call really comes.