Platform Computing Announces Commercial Support for Apache Hadoop Distributed File System (HDFS) In Conjunction with the Release of Platform MapReduce
Company Joins HDFS Project and Open Source Community to Enable Support for its Enterprise-Class Distributed Runtime Engine
SAN JOSE, Calif., June 28, 2011 /PRNewswire/ -- Platform Computing, the leader in cluster, grid and cloud management software, today announced it has signed the Apache Corporate Contributor License Agreement, allowing the company to contribute to the Apache Software Foundation's development of the open-source Hadoop Distributed File System (HDFS). Platform Computing and its developers will contribute to the ongoing development of HDFS, focusing on commercial support for the recently announced Platform MapReduce, an enterprise-class, distributed runtime engine for MapReduce applications. For more details on Platform MapReduce's key features and capabilities, please see today's related announcement.
"Platform is committed to the growth of a robust enterprise-ready Hadoop ecosystem and excited to be an active member of the open source development community," said Rohit Valia, Director, HPC & Analytics Solutions, Platform Computing. "The Hadoop community is leading the charge in big data computing, and Platform intends to support these efforts, drawing from our 18-year history of superior support for workload management in high performance computing environments, including complex clusters for government and some of the largest financial services, retail, life sciences and manufacturing companies worldwide."
The data explosion has shifted the computing paradigm: applications must now move closer to the data in order to effectively support analytics and other line-of-business functions. Platform Computing recognizes that global enterprises need the support of the IT development community to tackle this challenge. This has led to Platform Computing's agreement with the Apache Software Foundation, which ensures that the development of open-source HDFS retains full compatibility with the company's commercial products, such as Platform MapReduce, and enables the company to provide the best-in-class solutions IT needs to effectively manage and capitalize on the data explosion. When integrated with Apache HDFS, Platform MapReduce optimally manages the movement of applications closer to the data, delivering enterprise-class manageability and scale, high resource utilization and availability, ease of operation, and support for multiple applications.
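The "move the application closer to the data" principle can be illustrated with a toy scheduling sketch. This is purely illustrative: the node names, block mappings, and function are hypothetical and do not reflect Platform MapReduce's or Hadoop's actual scheduler. It shows only the general idea that a MapReduce runtime prefers to run each map task on a node that already holds a replica of that task's HDFS block, falling back to a remote read when no data-local slot is free.

```python
# Illustrative sketch only -- a toy data-locality scheduler, not the actual
# implementation of Platform MapReduce or Apache Hadoop. All names are
# hypothetical.

def assign_tasks(block_locations, free_slots):
    """Assign each block's map task to a node, preferring data-local nodes.

    block_locations: dict mapping block id -> list of nodes holding a replica
    free_slots:      dict mapping node -> number of available task slots
    """
    assignments = {}
    slots = dict(free_slots)  # working copy of per-node capacity
    for block, replicas in block_locations.items():
        # First choice: a replica host with a free slot (data-local task).
        local = [node for node in replicas if slots.get(node, 0) > 0]
        # Fallback: any node with capacity (task reads the block remotely).
        target = local[0] if local else next(
            (node for node, s in slots.items() if s > 0), None)
        if target is not None:
            assignments[block] = target
            slots[target] -= 1
    return assignments

blocks = {"blk_1": ["nodeA", "nodeB"], "blk_2": ["nodeB", "nodeC"]}
slots = {"nodeA": 1, "nodeB": 1, "nodeC": 1}
print(assign_tasks(blocks, slots))  # both tasks land on replica hosts
```

In this toy run, each task is placed on a node that already stores its block, so no block data crosses the network, which is the payoff the paragraph above describes.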
"Platform MapReduce, coupled with HDFS integration, is a powerful new tool allowing the enterprise to not only manage the data explosion but also maximize ROI," said Dave Henry, Senior Vice President, Enterprise Solutions, Pentaho Corporation. "Pentaho enables businesses to effectively leverage Apache Hadoop with solutions for end-to-end business intelligence and data integration, and we're thrilled about the synergies made possible through Platform Computing's addition to the community."
- For more information on Platform Computing's MapReduce solutions, visit: http://www.platform.com/mapreduce
- For today's announcement regarding Platform MapReduce, visit: www.platform.com/press-releases/2011/PlatformBringsMapReducetotheEnterprise
About Platform Computing
Platform Computing is the leader in cluster, grid and cloud management software – serving more than 2,000 of the world's most demanding organizations. For 18 years, our workload and resource management solutions have delivered IT responsiveness and lower costs for enterprise and HPC applications. Platform has strategic relationships with Cray, Dell, Fujitsu, HP, IBM, Intel, Microsoft, Red Hat, and SAS. Visit www.platform.com.
SOURCE Platform Computing