By: Darren House, Technologist-Data Center, Unicom Government
Virtualization has ruined everything. Well, not everything, but it has made a lot of things harder, while making a lot of other things easier. Confused yet?
Let’s start with virtualization because in the modern era, everything starts with virtualization. Effective virtual designs have pressured organizations to standardize on shared infrastructure, which in turn has accelerated business conversations about commoditization cycles for servers, storage and networking. This has created an enormous buzz about the value of commodity in IT, how commodity can be differentiated and how both exert downward pressure on price. Virtualization has done this on the business side while, on the IT side, significantly reducing infrastructure costs through consolidation, increasing provisioning and deployment agility, and improving overall systems management and integration.
Understanding that virtualization drove the discussion about commodity is important. What is more important is understanding that the discussion about commodity isn’t really a discussion about commodity. It’s a discussion about two things. First, it’s about the difference between commodity, specialization and complex systems, and how their evolutionary behavior within a lifecycle of differentiation should influence decision making. Second, it’s about achieving sustained differentiation for your organization and for your customers.
Today, servers, networks and storage are considered commodities, which really means their features and performance are in relative parity, making the real competition solely on price. The opposite is specialty, where technology features and/or performance capabilities differentiate vendors enough that the competition is on features and performance first, and then on price. Technical specialty means there is a perception that the specific features and/or performance provide a needed business value that cannot be achieved by a commodity item.
However, IT commodities differ from traditional commodities for many reasons: partly because of Moore’s Law, and partly because continuing R&D brings constant change to features and performance. Information technology is also different because many of these technologies are really complex systems rather than individual technical components. While they have relative parity in some components, they have specialization in others. In IT, many products and solutions are a hybrid of commodity components and specialty components, and vendors are continually in flux between commodity and differentiation, sustainment and disruption. This mix has created a differentiating lifecycle within commodity technologies.
An example is the Cisco UCS Server System. From the beginning, the blades were in relative parity with other vendors’ blades: they all had main boards, CPUs and memory, and their storage and networking interfaces/options were comparable. However, the UCS System originally had specialties other vendors lacked. The UCS server had innovative technology that broke the bond between the number of CPUs and the total amount of memory a server could recognize. This provided a business value of greater consolidation ratios, meaning less hardware and rack space, and lower power, cooling and licensing costs. It also abstracted the physical hardware from the OS that ran on it through Service Profiles, enabling capabilities that delivered technical and business agility and simplicity that no other vendor offered. It came with the Fabric Interconnects and the UCS Manager, which break out these abstractions into resource-based configurations, profiles and templates. It was sold not as a server, but as a system.

The system as a whole is not a commodity, while a number of its parts are. Initially the UCS system was part commodity, part differentiation and part disruption. Over the years, UCS drove other vendors to create converged solutions that delivered similar features; the followers created solutions that were part commodity and part differentiation. In this arena, the discussions were, and still are, on features and performance first and price second. As this solution area evolves, features and performance will eventually reach relative parity, and the discussion will be driven by price more than anything else. This is a 10,000-foot view of the nuances within commodity and the differentiating lifecycle within commodity technologies.
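To make the Service Profile idea concrete, here is a minimal sketch of the abstraction: a server’s logical identity lives in a profile object, and any compatible blade can assume that identity. The field names and functions below are purely illustrative assumptions, not Cisco’s actual UCS Manager schema or API.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch of the service-profile concept: identity that is
# normally burned into hardware (MAC, WWN, boot order, firmware policy)
# is held in a portable profile instead of on the blade itself.

@dataclass
class ServiceProfile:
    name: str
    mac_address: str        # LAN identity, normally fixed in hardware
    wwn: str                # SAN identity
    boot_order: list
    firmware_policy: str

@dataclass
class Blade:
    slot: int
    profile: Optional[ServiceProfile] = None

def associate(blade: Blade, profile: ServiceProfile) -> Blade:
    # Applying a profile gives the blade its logical identity; moving the
    # profile to a spare blade moves the "server" without reconfiguring
    # the network or storage that recognize it by MAC/WWN.
    blade.profile = profile
    return blade

web_profile = ServiceProfile("web-01", "00:25:B5:00:00:01",
                             "20:00:00:25:B5:00:00:01",
                             ["san", "lan"], "gold")
blade3 = associate(Blade(slot=3), web_profile)
```

If blade 3 failed, the same profile could be associated with a spare blade and the workload’s identity would follow it, which is the agility and simplicity value the article describes.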
The storage industry offers another example. Disk trays are a commodity because all vendors offer SAS, SATA and SSD options in their trays, and the real competition is on the price of the tray at a given form factor. Storage controllers, however, are often specialty goods. The hardware and software architecture of different vendors’ storage controllers enables different features to perform at different speeds, driving business value discussions first and price discussions second. While there is perceived feature parity between storage vendors, the different architectures deliver commodity features at different performance levels and costs.
For example, storage controllers generally provide a Snapshot feature, where a second copy of a data set can be created almost instantly, enabling capabilities such as rapid backup and recovery. Because of the underlying architecture, some vendors can perform this almost instantly, while others require more time and resource allocation. Each vendor also has limits on how many snapshot copies it can maintain, and the number differs because of the architecture.
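The reason snapshots can be near-instant is that most implementations copy only block references (metadata), and the underlying data blocks are shared until a write forces a private copy. The following is a toy sketch of that copy-on-write idea, under assumed, illustrative names; real controllers implement this in firmware with very different structures.

```python
# Toy copy-on-write volume: a snapshot duplicates the block map
# (cheap metadata), not the data blocks themselves.

class Volume:
    def __init__(self, blocks):
        # blocks: mapping of logical block number -> immutable data
        self.blocks = dict(blocks)

    def snapshot(self):
        # Near-instant: cost is proportional to the block map,
        # not to the amount of data stored.
        return Volume(self.blocks)

    def write(self, lbn, data):
        # Copy-on-write: only the written block diverges; all other
        # blocks remain shared with existing snapshots.
        self.blocks[lbn] = data


vol = Volume({0: b"alpha", 1: b"beta"})
snap = vol.snapshot()      # copies two map entries, zero data blocks
vol.write(0, b"gamma")     # parent diverges; snapshot still sees b"alpha"
```

The vendor differences the article mentions (speed, snapshot limits) come down to how efficiently an architecture manages these shared-block maps as snapshots accumulate.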
As features and performance continually improve, storage technologies will move through the lifecycle from relative parity, to differentiation, to unequal competition, and back to relative parity. At some point there will be disruption as well, but after a period of time the technology will settle back into this lifecycle.
Why does it matter? It matters because technology is pervasive in both our consumer and business lives. When perceived and managed properly, it is an enabler of business value. Organizations cannot compete without leveraging technology to achieve desired business outcomes, which means decisions about how to leverage technology are increasingly important to the business outcome. When addressing technology decisions through the business value lens, one realizes that commodity isn’t always commodity. Don’t get caught up in the Cloud marketing mindset that all infrastructures are commodity, that one vendor is indistinguishable from another, and that the decision rests solely on the lowest price. That mindset can lead to strategic missteps as you follow your business plan.
Lastly, understanding that every technology sits somewhere within the differentiation lifecycle can help in planning how long a given technology will provide a differentiated value to you over your competition, or to your customers over theirs. It can assist both in strategic planning for which technologies to adopt long term and in choosing which technologies to focus on for a fiscal year or a couple of quarters for short-term differentiation. In the end, differentiation is achieved and maintained through a multi-layered approach combining short- and long-term strategies. Understanding the nuances within the commodity discussion, and the location of technologies in the differentiation lifecycle, will help organizations make better business decisions about the technology that supports their strategic objectives.
Mr. House brings 17 years of technology leadership, vision, partner relationships and business alignment to organizations. Currently, he is a Technologist at Unicom Government, focused on business-aligned IT solutions that empower information technology to be an enabler of business services. He has been the lead architect at the firm for many large projects and GWAC captures, authored numerous whitepapers and developed proof-of-concept solutions for customers that align technology solutions with business objectives. He was a member of the 2012 TechAmerica Big Data commission and has been interviewed by Federal News Radio and Government Executive on Big Data and data center technologies. He has held several roles, including Federal Solutions Architect, IT Architect in the healthcare sector and Senior Engineer for the USMC in Quantico, VA.