According to executives at IBM, one of the most common mistakes companies make when moving to the private cloud is leaping before they are ready, and before they have the storage efficiency to handle the infrastructure the cloud brings.
To some extent, this seems backward: many companies think of the cloud as a way to improve their storage capabilities. But private cloud storage suffers from a number of problems that physical storage does not, including the fact that running multiple OS instances on a single server can cause serious I/O contention and data retrieval problems. Companies that switch to the private cloud hoping for a magical resolution to their storage issues will often find themselves stuck with many of the same problems they started with.
IBM recommends that companies use technologies like thin provisioning, data reduction and storage virtualization locally before they consider a move to the cloud, public or private. Without the latest storage management features in place, and without an IT department that is on board and developing SLAs before the move is made, a company may end up paying far more for storage than it needs to while getting less out of it than it expects.
According to Dan Galvan, VP of marketing and strategy at IBM, companies can see up to a 30% increase in utilization by applying storage virtualization at the local level, a figure consistent with the gains seen from server virtualization on a broader scale. The idea IBM is pushing is that with products like its Easy Tier software, companies will be able to vastly improve their data storage and access times before they make the jump to the cloud, making the move that much more cost-effective.
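To put that 30% figure in rough context, a quick back-of-the-envelope sketch shows how a relative gain in storage utilization translates into raw capacity a company no longer has to buy. The baseline numbers below are illustrative assumptions, not IBM's figures:

```python
# Back-of-the-envelope: what a 30% relative gain in storage utilization
# means for raw capacity. All figures are assumptions for illustration.

used_data_tb = 100            # data actually stored (assumed)
baseline_utilization = 0.40   # assumed pre-virtualization utilization
improved_utilization = baseline_utilization * 1.30  # 30% relative increase

capacity_before = used_data_tb / baseline_utilization
capacity_after = used_data_tb / improved_utilization

print(f"Raw capacity needed before: {capacity_before:.0f} TB")
print(f"Raw capacity needed after:  {capacity_after:.0f} TB")
print(f"Capacity avoided: {capacity_before - capacity_after:.0f} TB "
      f"({(1 - capacity_after / capacity_before) * 100:.0f}% less)")
```

Under those assumed numbers, the same 100 TB of data needs roughly 192 TB of raw capacity instead of 250 TB, about a 23% reduction, which is the kind of saving that makes a later cloud migration cheaper to provision.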