In Tracy Kidder’s book “The Soul of a New Machine”, I recall Data General’s Tom West worrying that the design his team had come up with for the MV/8000 minicomputer was too complex. A friend of his had just purchased a first-run Digital Equipment Corp VAX, so Tom went to visit, picked through the VAX main boards, and counted and recorded the IDs of all the components used. He then realised that his design wasn’t so complex after all compared to the VAX, and Tom proceeded to build the MV/8000 with confidence.
In this example, deconstruction of one product helped Tom to understand another product, and to sanity-check that he wasn’t making things too complicated. It didn’t tell him whether the MV/8000 would be better than the VAX, however.
I have many times seen buyers approach a storage solution evaluation using a deconstructionist approach. Once a solution is broken down into its isolated elements, it can be compared at a component level to another, very different solution. In most cases it’s a pointless exercise. It tells us nothing about real value. It simply gives us something to chew on, and makes us feel like we are doing a detailed evaluation and uncovering the truth.
Examples of this deconstructionist approach include comparing the bus bandwidths of two storage systems, comparing the number and speed of the Intel cores used in two different disk systems, or even comparing the number of disk drives.
We can also deconstruct an architectural approach and turn it into a religious war, to no useful purpose, hence some of the arguments over the years about things like in-band versus out-of-band virtualization.
Amount learned about the business value of the solution? Still zero.
It’s important when evaluating a technology solution to listen to what the inventors of the solution have to tell us about its architecture – the way it works. That gives us a real opportunity to embrace the innovation they have tried to deliver.
Within an IBM framework, for example, you cannot usefully compare a DS5000 with an N6000, an XIV and a Storwize V7000 via deconstruction. In deciding between these four architectures, you need to understand how each approaches storage, not what its components are.
When you understand an approach, you have an opportunity to think about whether it might deliver real benefits. For example, if you believe that the primary sin committed by storage solutions is complexity, then really understanding a given architecture lets you think about how it might deliver simplicity in your situation.
Another potentially useful approach to evaluation is reference checking. This can be misused in a similarly comfortable and limiting fashion, but it can also be a much better way to evaluate than any deconstructionist approach.
I’m not talking about a last-minute sanity check before you place an order, nor am I talking about an exclusive requirement to talk to people in one’s own industry, in one’s local state or country, running the exact same setup, and so on. I would want to talk to existing users of the same technology family early in my evaluation. I’d want to understand whether the innovation intended by its inventors is actually delivered, especially in terms of efficiency and simplicity. Referencing in this way will tell you much more than trying to do an evaluation based on technical detail.
When we evaluate technology solutions, do we try hard enough to avoid being driven by prevailing memes, or being overly influenced by our own comfort zones? These are forces that work against innovation. Sometimes it seems we rearrange things a bit to create the illusion of innovation, whilst ultimately rebuilding a situation we already know. Maybe we’re doing this when we use SSDs where once we used 146GB 15K RPM drives, for example.
Ultimately, evaluating storage solutions should be about understanding how you can exploit the intelligence of inventors. It’s not about deconstruction, and it shouldn’t be about comfort zones.