Discover Performance | HP Software's community for IT leaders // September 2012
An easy on-ramp for Hadoop
The pressure to tame big data makes Hadoop attractive, but its open-source complexity has kept the powerful platform from being enterprise-ready. That’s changing.
Big data is a golden opportunity. To exploit it, though, you must capture a tremendous volume and velocity of business data, in myriad forms, and hold it in a contained system capable of running comprehensive analysis.
The cost of continuing to scale traditional systems has grown prohibitive. The pressure to find a new solution to these challenges has led every information management executive to stalk the same watering hole: Hadoop. Apache Hadoop is a distributed storage and processing platform that lets organizations keep raw, high-fidelity data in a boggling array of formats—and combine it all in comprehensive analysis. It is designed to take advantage of massively parallel processing (MPP) on scale-out x86 hardware to sort and analyze data in a cost-effective manner.
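Hadoop’s processing model is MapReduce: a map step runs on each node against its local slice of data, a shuffle groups the intermediate results by key, and a reduce step combines them. A minimal sketch of that model (in plain Python for illustration—real Hadoop jobs use the Java MapReduce API or tools layered on it) is the classic word count:

```python
from collections import defaultdict

def map_phase(records):
    """Map step: each worker emits (word, 1) for every word in its records."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle step: group intermediate values by key across all workers."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

# Two "records" standing in for data blocks spread across the cluster.
records = ["big data is a golden opportunity", "big data at scale"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts["big"])  # prints 2
```

Because map tasks touch only local data and reduce tasks only grouped keys, the same program scales from one machine to thousands—which is what makes commodity x86 scale-out cost-effective.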
Stymied by complexity
Unfortunately, many of the enterprises that have been looking at Hadoop haven't yet made a move. While Hadoop has proven to be a good solution to today's big data problem, it has a few deficiencies: most notably, its complexity.
“While Hadoop is actively crossing the chasm to the enterprise, its platform and application complexities can be a challenge for many companies not familiar with open source, Linux and distributed computing,” says Steve Watt, chief technologist for Hadoop at HP.
Hadoop is going through the same maturation cycle Linux did about 10 or 15 years ago, with a community around it and lots of distributions. The platform grew out of the pure coder's world of open-source software, making everything about its implementation and support a challenge for the enterprise. The time-consuming effort to manually search online mailing lists, forums and wikis runs counter to enterprise IT’s efficiency culture. For many companies, the on-ramp to Hadoop is simply too steep.
To this end, HP has developed a Hadoop appliance called the HP AppSystem for Apache™ Hadoop™—an end-to-end technology stack, including hardware and software, that lets organizations quickly reap the analytic value of Hadoop without first tackling its significant acquisition hurdles.
Ready for prime time
A look at HP’s Hadoop appliance offers a strong perspective on why a skillful packaging of the open-source platform can work in the enterprise environment.
In addition to the enterprise-ready appliance itself, which delivers market-leading performance, HP provides reference architectures for all three major Hadoop distributions.
Through Vertica’s and Autonomy’s Hadoop connectors, the HP AppSystem for Hadoop lets customers easily add advanced analytics from either product—putting real brains behind Hadoop’s muscle with analytics capabilities that aren’t available within Hadoop itself.
Adding speed to power
For all the innovation that Hadoop has applied to the problem of analyzing massive data of varying types, the one thing Hadoop can’t guarantee is speed. Hadoop’s strength lies in making sure that data-intensive computations will always finish, but those computations can run for a long time. Organizations that need a rapid analytics feedback loop will need to pair Hadoop with a complementary technology.
“There isn’t really an open-source alternative that gives you the same high-performance and scale to do what, at Vertica, we call real-time analytics,” says Shilpa Lawande, vice president of engineering at Vertica.
The HP appliance’s seamless integration with Vertica's real-time analytics engine makes maximum analytics power and efficiency possible with minimal effort and disruption to IT.
"When you really need your answer to come fast, so that you can operationalize the results and consume them in real time, that's what Vertica is built for," Lawande says. "You can build an interactive dashboard that lets people browse around in their data, right at their fingertips."
Tackling big data problems
Like many open-source projects, Hadoop is growing and maturing, but commercial Hadoop distributions still leave much for enterprises to hash out, including provisioning, deployment and optimization. The appliance form factor is a natural fit for organizations that want to immediately act on, understand and manage 100 percent of their data. HP’s solution gives information management leaders a way to address their big data challenges without taxing IT, database administrators or data scientists.
After all, getting better and easier access to potential insights from structured and unstructured information is why so many organizations are looking to Hadoop in the first place.
To learn more about how HP can help you easily tackle big data problems using Hadoop, go to http://www.hp.com/go/hadoop.