When virtualized servers finally became affordable for small and medium-sized businesses, many in the industry assumed the technology would simply "take off" and that all previous physical server infrastructure would be cast aside. As the industry quickly discovered, however, this was not the case.
It began with stories of poorly integrated virtual systems that left IT staff and the broader employee population at odds. Virtual servers were often deployed with a "one size fits all" mentality, giving companies little choice in how their servers were set up and leading to conflicts among staff.
Beyond that, virtual servers changed the way both general staff and the IT department worked with their technology. For the IT department, virtualization meant that servers were no longer under its direct control, raising fears that the department would become redundant or unnecessary to the company. In many cases, IT staff had made custom "fixes" to the existing operating system and believed a new system would not support what they had built.
Employees at large worried about a number of problems with virtual servers, including the fact that the technology was not as "invisible" as advertised. In many cases, programs and even basic cut-and-paste operations across platforms worked differently or not at all, leading to general dissatisfaction.
Then there was the matter of cost. Physical servers and desktops were already bought and paid for, and even if a virtual environment promised lower costs in the long term, it demanded spending up front, something many companies were unwilling to bear. Fortunately, both server provisioning and overall effectiveness have continued to improve, leading to broader acceptance of a highly efficient technology.