Where Simplicity and Technology Really Intersect
The prevailing wisdom says that technology is supposed to be getting simpler all the time. This belief is pervasive among corporate leaders, many of whom insist that the need for expensive technology experts should diminish over time, since technology is becoming simpler and less error-prone.
I touched on this fallacy briefly in a recent article about Information Security investments. While technology is easier to use in many ways, the fact is that the complexity of technology has actually increased – significantly – over time. This has also led to greater challenges in managing technology infrastructure and technology teams.
We are still in the relative infancy of the technology industry. There is much growth yet to be attained before this industry has fully matured. It is not unreasonable to suggest that the current pace of change will continue unabated for at least another 15-20 years, by which time we may find viable ways to spread simplicity throughout the entire infrastructure. But that time is not now. For now, all we can do is move the complexity from one place to another.
For example, let’s take a look at the microprocessor.
In 1999, Intel released the 32-bit Pentium III microprocessor at speeds of 450 and 500 MHz, with 9.5 million transistors on a 128 mm² die, using a 0.25 micron (250 nm) fabrication process.
By 2009, Intel was selling mainstream desktop processors such as the 64-bit quad-core Core i5 family, which was released at speeds of 2.66 GHz, with 774 million transistors on a 296 mm² die, using a 45 nm (nanometer) fabrication process. That’s not an increase in simplicity by any stretch of the imagination.
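A quick back-of-the-envelope calculation, using only the figures cited above, shows just how far from "simpler" the decade's progress was:

```python
# Back-of-the-envelope comparison using the figures cited in the text.

# 1999: Pentium III
piii_transistors = 9.5e6   # 9.5 million transistors
piii_die_mm2 = 128         # die area in mm^2

# 2009: Core i5
i5_transistors = 774e6     # 774 million transistors
i5_die_mm2 = 296           # die area in mm^2

# Growth in raw transistor count, and in transistors packed per mm^2
transistor_growth = i5_transistors / piii_transistors
density_growth = (i5_transistors / i5_die_mm2) / (piii_transistors / piii_die_mm2)

print(f"Transistor count grew roughly {transistor_growth:.0f}x")   # ~81x
print(f"Transistor density grew roughly {density_growth:.0f}x")    # ~35x
```

Roughly 81 times the transistors, packed about 35 times more densely – by any measure, an order-of-magnitude jump in what designers and manufacturers must manage.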
Over the years, what technology has done is facilitate simplicity for end users. It has enabled people who know little about technology to harness power and functionality that once required far greater knowledge. It has let them perform simple jobs more rapidly, and perform complex jobs without understanding the complexity behind the work. We have taken the massive computing power available at the desktop and added more code to make things user-friendly.
This simplicity of use has come at a price, however. We have taken the complexity that the user previously faced and transferred it from the front-end (where the user is) to the back-end (where the professionals maintain the system). The complexity has not been removed – it has merely been displaced. Now it must be effectively managed if end users are to realize any benefits.
Server virtualization is one popular area in which we can see this transfer of complexity from one place to another. Basically, server virtualization allows you to take one large physical machine, and carve discrete server instances out of it, so that applications and processes can be isolated from each other at a logical level. Some of the benefits that come from using this technology include:
- Faster provisioning of new servers
- More precise allocation of resources
- More Disaster Recovery options
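The "carving up" described above can be sketched as a toy model. Everything here is illustrative – the class and resource names are hypothetical and do not correspond to any real hypervisor's API – but it captures how discrete, isolated guests draw on one physical pool:

```python
# Toy model of server virtualization: one physical host, many isolated guests.
# All names are illustrative; real hypervisors expose far richer controls.

class PhysicalHost:
    def __init__(self, cpu_cores, ram_gb):
        self.cpu_free = cpu_cores
        self.ram_free = ram_gb
        self.guests = {}

    def provision(self, name, cpu_cores, ram_gb):
        """Carve out an isolated server instance from the shared pool --
        fast, because no new hardware needs to be purchased or racked."""
        if cpu_cores > self.cpu_free or ram_gb > self.ram_free:
            raise RuntimeError(f"insufficient resources for {name}")
        self.cpu_free -= cpu_cores
        self.ram_free -= ram_gb
        self.guests[name] = {"cpu": cpu_cores, "ram": ram_gb}

host = PhysicalHost(cpu_cores=16, ram_gb=64)
host.provision("web01", cpu_cores=4, ram_gb=8)    # precise, per-guest sizing
host.provision("db01", cpu_cores=8, ram_gb=32)
print(host.cpu_free, host.ram_free)  # remaining headroom: 4 cores, 24 GB
```

Note the flip side the same sketch makes visible: every guest depends on that one `host` object, so a failure of the physical machine takes all of its guests down with it.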
Virtualization also brings other things to the table, including:
- Greater dependency on network performance
- More single-point-of-failure scenarios
- Greater collateral damage from a system, security, or storage problem
- Increased training requirements for staff
An environment where virtualization is used extensively requires technologists who can manage not just networking or servers or storage, but all three. If the staff responsible for the virtualized infrastructure are skilled in only one or two of the needed disciplines, multiple team members must get involved in every troubleshooting effort to solve problems effectively. Needless to say, this takes more time and effort, and is a clear indication that complexity has not been reduced across the board.
While technology has made air travel commonplace for ordinary people, and has made automobile travel virtually ubiquitous, there is still considerable complexity and cost in flying a plane, to say nothing of repairing one. (Becoming a pilot is still neither trivial nor inexpensive.) Likewise, the sophistication of automobiles has actually made it harder for individuals to work on all aspects of their own vehicles than it once was.
As technology improves, the lower end of the food chain becomes commoditized and can be easily outsourced, because it no longer adds competitive advantage. We see this with call centers and desktop support, where the cost to provide the service is much lower for external entities than for in-house staff. Meanwhile, the higher end of the food chain is becoming more complex and sophisticated. Software-as-a-Service (SaaS), Cloud Computing, Grid Computing, and High-Performance Computing (HPC) require more expertise and skill in planning, architecture, implementation, and integration across a variety of technology disciplines. And no matter how you slice it, this means increased costs (for increased benefit).
You do pay more, whether in higher wages per IT professional or in a need for more professionals to cover all the specialized areas. But the benefits are also substantial when you implement these technologies correctly. The investments made in IT professionals (including training) will pay dividends in increased flexibility, improved productivity, additional revenue opportunities, and reduced business risk.
As the saying goes, it takes money to make money. This is no less true of technology management and deployment than of any other area. Proper investments in people, tools, and processes will yield solid, sustainable results. Conversely, an excessive focus on cost-cutting will strip the organization of vital capabilities and flexibility, and ultimately undermine its competitive advantage.