HP Technology at Work

The must-read IT business eNewsletter
Defining the software-defined data center

November 2013

The Software-Defined Data Center (SDDC) allows you to control all of a data center’s underlying hardware—servers, networks and storage—from one unified logical abstraction. This abstraction not only helps you manage current services more efficiently, it also lets you reconfigure on the fly, roll out new projects quickly using programmatic, automated provisioning, and achieve a level of scalability that traditional approaches can’t match. Here is a look at how the SDDC is being defined and some questions to ask when considering whether an SDDC is right for your organization.
 
Legacy vs. software-defined
All data center hardware components are designed with a “control plane” that determines which actions to perform, and a “data plane” that executes those actions. In legacy components, the control plane is baked into the device’s circuits. These components perform specific functions and deliver high performance, but they’re hard to manage all at once and often have to be reconfigured manually.
 
In a software-defined device, essential functions remain in the data plane in the hardware, while the control plane’s operational logic is placed in a controller application that can oversee many physical devices at once. The hardware can be swapped out as needed, providing greater resiliency, flexibility and agility. This model of virtualized abstraction of the control plane is at the heart of software-defined computing, networking, and storage, and the SDDC is the unification of all three.
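The split described above can be sketched in a few lines of code. This is a toy model for illustration only—the class and method names are hypothetical, not any vendor’s real API—but it shows the essential idea: one controller program (control plane) makes decisions for many devices (data plane), which only execute what they are told.

```python
class Device:
    """Data plane: executes forwarding rules, makes no decisions."""
    def __init__(self, name):
        self.name = name
        self.rules = {}          # destination -> action

    def apply_rule(self, destination, action):
        self.rules[destination] = action

    def handle(self, destination):
        # Unknown traffic is dropped until the controller says otherwise.
        return self.rules.get(destination, "drop")


class Controller:
    """Control plane: one application overseeing many physical devices."""
    def __init__(self):
        self.devices = []

    def register(self, device):
        self.devices.append(device)

    def push_policy(self, destination, action):
        # One decision, propagated to every device at once --
        # no box-by-box manual reconfiguration.
        for device in self.devices:
            device.apply_rule(destination, action)


controller = Controller()
for name in ("switch-1", "switch-2", "switch-3"):
    controller.register(Device(name))

controller.push_policy("10.0.0.0/24", "forward:port2")
print(controller.devices[0].handle("10.0.0.0/24"))    # forward:port2
print(controller.devices[2].handle("192.168.1.0/24")) # drop
```

Because the decision logic lives entirely in the controller, any registered device can be swapped out and immediately receive the same policy—the resiliency and agility the model promises.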
 
Building SDDC on open standards
The development of open protocols is making it possible to build an SDDC that can control virtualized infrastructures from a mix of vendors that all conform to a common application programming interface (API). OpenStack, a key open-source project for controlling pools of computing, storage, and network resources, offers a common platform so private cloud-based services can seamlessly extend to and from any public cloud that provides the OpenStack APIs. OpenStack began as a collaboration between Rackspace and NASA, and now benefits from a large list of contributors including HP.
 
Another important open standard is OpenFlow, the software-defined networking API. OpenFlow provides a model for remotely controlling switches and routers. OpenFlow is vendor neutral, so any OpenFlow-enabled controller can control any OpenFlow-enabled device. Like OpenStack, OpenFlow has the backing of many companies, including HP. By contributing to and supporting these initiatives, participating companies ensure that their hardware and data-center services remain viable in an SDDC environment.
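At the heart of the OpenFlow model is a flow table of match/action entries that a remote controller installs on each switch; incoming packets are matched against entries in priority order, and a packet that matches no entry is referred back to the controller. The sketch below illustrates that lookup model in simplified form—it is not the OpenFlow wire protocol, and the field names are illustrative.

```python
class FlowTable:
    """Toy model of an OpenFlow-style match/action flow table."""
    def __init__(self):
        self.entries = []        # (priority, match_fields, action)

    def install(self, priority, match, action):
        # A real controller would send a flow-modification message
        # over the network; here we just keep highest priority first.
        self.entries.append((priority, match, action))
        self.entries.sort(key=lambda e: -e[0])

    def lookup(self, packet):
        for priority, match, action in self.entries:
            if all(packet.get(k) == v for k, v in match.items()):
                return action
        return "send-to-controller"   # table miss


table = FlowTable()
table.install(10, {"dst_ip": "10.0.0.5"}, "output:3")
table.install(100, {"dst_ip": "10.0.0.5", "tcp_port": 80}, "output:1")

print(table.lookup({"dst_ip": "10.0.0.5", "tcp_port": 80}))  # output:1
print(table.lookup({"dst_ip": "10.0.0.5", "tcp_port": 22}))  # output:3
print(table.lookup({"dst_ip": "10.9.9.9"}))                  # send-to-controller
```

Vendor neutrality follows from this design: as long as a device implements the flow-table semantics, any controller can program it.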
 
DevOps and cloud management
One driver of the SDDC model is the development and deployment process for new software. A traditional IT infrastructure typically has separate stacks for development, testing, pilot programs and full deployment. Syncing the configuration of these stacks is challenging, and often serious issues are discovered only after an application is fully deployed.
 
In the SDDC, the same underlying abstraction is used from initial code check-in all the way through deployment. This makes it possible to deploy bug fixes or new features continuously, multiple versions a day, rather than waiting weeks or months for changes to be tested and deployed across the various stacks. And if changes do cause unexpected errors, backing out to previous versions is simple.
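The deploy-and-roll-back cycle can be sketched as a simple versioned pipeline. This is an illustrative toy, not a real deployment tool; in an SDDC, the `deploy` step would programmatically provision the same virtual infrastructure used in development and testing.

```python
class DeploymentPipeline:
    """Minimal sketch of versioned, automated deployment."""
    def __init__(self):
        self.history = []            # released versions, in order

    def deploy(self, version):
        # Against one shared abstraction, releasing is a recorded,
        # repeatable step rather than a manual reconfiguration.
        self.history.append(version)
        return version

    def current(self):
        return self.history[-1] if self.history else None

    def rollback(self):
        # Back out the latest release and return to the prior one.
        if len(self.history) > 1:
            self.history.pop()
        return self.current()


pipeline = DeploymentPipeline()
pipeline.deploy("v1.4.0")
pipeline.deploy("v1.4.1")        # a bug fix shipped the same day
print(pipeline.current())        # v1.4.1
print(pipeline.rollback())       # v1.4.0
```

Because every release is recorded, backing out an unexpected error is a single step rather than a multi-stack untangling exercise.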
 
A related issue faced by many organizations is the need to rein in the “shadow IT department.” In recent years, individual departments have independently sought out pay-per-use IT services from cloud-based service vendors. This has led to “cloud sprawl,” where no one department can effectively monitor who is using which cloud services.
 
The SDDC model makes it easier for internal IT teams to provide on-demand resources in house through service catalogs, or to act as the broker for public cloud services, while maintaining awareness of all IT resources in use throughout the organization. It also makes it easier to enforce privacy and security policies consistently across internal and cloud-hosted resources.
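The service-catalog idea can be illustrated with a short sketch. The names here are hypothetical: the point is simply that every resource request flows through one broker, so usage stays visible and cloud sprawl is curbed.

```python
class ServiceCatalog:
    """Toy model of IT brokering internal and public cloud services."""
    def __init__(self):
        self.offerings = {}      # service name -> provider
        self.usage = []          # (department, service) requests granted

    def publish(self, service, provider):
        self.offerings[service] = provider

    def request(self, department, service):
        if service not in self.offerings:
            raise KeyError(f"{service} is not in the catalog")
        # Every grant goes through IT, so who uses what stays visible.
        self.usage.append((department, service))
        return self.offerings[service]


catalog = ServiceCatalog()
catalog.publish("vm.small", "internal-private-cloud")
catalog.publish("object-storage", "public-cloud-broker")

print(catalog.request("marketing", "object-storage"))  # public-cloud-broker
print(len(catalog.usage))                              # 1
```

A request for a service outside the catalog fails loudly, which is exactly the visibility that independent, department-by-department cloud purchasing lacks.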
 
Are you ready for SDDC?
While the SDDC is a promising operational model, deploying a full SDDC doesn’t necessarily make sense for every enterprise. The decision to deploy an SDDC should be driven by business goals:

  • How much agility do you require now and in the near future? 
  • Are you supporting mobile devices?
  • Do compliance issues constrain where you can store client data?
  • How much of your data center is currently virtualized?

Some organizations may opt to adopt SDDC over time. They may virtualize only their networking, or create a software-defined storage array for a specific project. SDDC is still a work in progress, but it provides a template on which to map your future virtualization efforts and data center growth for maximum return.
