The Open Data Platform (ODP) initiative is an industry effort to simplify enterprise adoption of Apache Hadoop and to enable big data solutions to flourish through improved ecosystem interoperability. It relies on the Apache Software Foundation community's governance to innovate and deliver the Apache project technologies included in the ODP core: Apache™ Hadoop® 2.6 (including HDFS, YARN, and MapReduce) and Apache Ambari.
The Open Data Platform aims to provide the following benefits to the big data industry:
- Promote interoperability within and beyond the ODP Core, to drive a broad set of use cases
- Reduce industry fragmentation for customers and partners
- Reduce R&D costs across the industry, by delivering one shared qualification effort
- Share technical marketing materials such as industry benchmarks, reference architectures, and white papers
As a common industry standard, the ODP Core also aims to provide additional benefits to Hadoop users and developers:
- Help users avoid vendor lock-in
- Simplify development of interoperable third-party technologies
- Provide consistent documentation to simplify development efforts
- Allow developers to integrate easily with existing systems
Panel Discussion: Upcoming First Release of the ODPi!
The first release will include:
• Runtime specification
• Management specification
• Reference implementation
• Validation test suite for compliance with the specifications
Panelists:
• Richard Pelavin – Co-Founder and CTO, Reactor8, Inc.
• Sumit Mohanty – Senior Engineering Manager at Hortonworks and Ambari PMC Member
• Alan Gates – Co-Founder at Hortonworks
• Roman Shaposhnik – Director of Open Source at Pivotal and Founder of Apache Bigtop
• Satheesh Bandaram – Director, Hadoop, Spark and Cloud at IBM
• Raj Desai – Software Engineer at IBM and Member of the ODPi Technical Steering Committee
The moderator is Gregory Chase, Director of Big Data Community Marketing at Pivotal.
The Open Data Platform will:
- Accelerate the delivery of Big Data solutions by providing a well-defined core platform to target.
- Define, integrate, test, and certify a standard “ODP Core” of compatible versions of select Big Data open source projects.
- Provide a stable base against which Big Data solutions providers can qualify solutions.
- Produce a set of tools and methods that enable members to create and test differentiated offerings based on the ODP Core.
- Reinforce the role of the Apache Software Foundation (ASF) in the development and governance of upstream projects.
- Contribute to ASF projects in accordance with ASF processes and Intellectual Property guidelines.
- Support community development and outreach activities that accelerate the rollout of modern data architectures that leverage Apache Hadoop®.
- Help minimize fragmentation and duplication of effort within the industry.