Pentaho 8.1 Release – Doubling Down on its Core

by Zachary Zeus
July 2, 2018

At BizCubed, we are proud to be a Hitachi partner organisation. A key reason is that Hitachi has remained consistently committed to interoperability of data processing and analytics across complex data ecosystems, a cause we are also driving.

Through the projects we have been involved in, we have found that customers rarely fully implement the cloud and big data strategies drawn up as part of the solution architecture. The reality on the ground is that customers run complex mixes of on-premise, private cloud and public cloud environments, which can make delivering analytics and insights on time extremely challenging.

Here are a couple of customer examples to help illustrate this:

One of our customers has more than a dozen Hadoop clusters as part of their ecosystem. Yet not all of their data resides in those environments.

Another customer runs distinct physical infrastructure at each site, which makes their networking extremely complicated. Their IoT devices operate on their own network, communicate over the cloud, process data on GCP (Google Cloud Platform), and integrate the results back into an on-premise Oracle data warehouse (private IoT network -> public cloud -> on-premise warehouse).

These examples are not outliers or exceptions; we find this to be the norm.

Our customers need to be able to process, monitor and maintain data processes and insights across these ecosystems. There is an abundance of tools out there, but very few provide what we think are the keys to managing this level of complexity.

At BizCubed, we help organisations make better decisions by giving them fast access to information drawn from all of their data sources.

We leverage the Hitachi Pentaho business intelligence software for its data integration, OLAP services, reporting, information dashboards, data mining and extract, transform, load (ETL) capabilities.

Recently, Hitachi released v8.1 of Pentaho. Although it is a minor follow-up to v8.0, it delivers exciting new features and improvements, including:

  • Pentaho Data Integration (PDI) features several improvements to its streaming support, including new streaming steps.
  • PDI transformations can now run on the Spark engine using the improved steps.
  • You can now connect seamlessly to Google Cloud Storage and Google Drive through a VFS browser for importing and exporting data.
  • Increased AWS S3 security
  • New and updated big data steps
  • Further data integration improvements
  • Business analytics improvements

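To make the VFS support a little more concrete: PDI lets transformations address cloud objects directly by URL. The sketch below is a hedged illustration only, assuming a hypothetical bucket, transformation file and parameter name; it prints the kind of `pan.sh` command you might run, with the Google Cloud Storage object addressed via a `gs://` VFS URL, rather than executing it against a real Pentaho installation.

```shell
# Minimal sketch: invoking a PDI transformation whose input is a
# Google Cloud Storage object addressed by a gs:// VFS URL.
# The bucket, transformation file and parameter name below are
# hypothetical assumptions, not part of the Pentaho release itself.
GCS_INPUT="gs://example-bucket/exports/sales.csv"

# Print the command rather than running it, since pan.sh and the
# .ktr file only exist inside an actual Pentaho installation.
echo "pan.sh -file=load_sales.ktr -param:INPUT_URI=${GCS_INPUT}"
```

The same URL style applies inside the PDI client: wherever a step accepts a file path, a VFS URL can point it at a cloud object instead of a local file.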
So yes, there are some mammoth improvements in Pentaho 8.1.

I want to revisit something I mentioned earlier: our customers need to process, monitor and maintain data processes and insights across these ecosystems. Among the many tools out there, Pentaho stands out for several reasons:

First off, we believe data processing tools must be open. This means they need to work across all cloud deployment models (private, hybrid and public). They should also be pluggable, allowing users to take advantage of native capability. Finally, they should have an open-source based licensing model, which enables future flexibility and prevents lock-in.

Next, they should be scalable, meaning they work virtually anywhere: on your desktop, in the cloud, on-premise, across multiple machines and inside clusters.

Last, they should be easy to use. Look for a tool that has an easy drag-and-drop interface, is metadata-driven (not a code generator), deploys what you see, and lets you watch your data build up over time.

So, there you go. That was a high-level intro to the 8.1 release.

These key considerations will help you accelerate your adoption and extend your data reach. Interested in learning more about the Pentaho 8.1 release? Chat with us.

Zachary Zeus

Zachary Zeus is the Co-CEO & Founder of BizCubed. He brings the business more than 20 years' engineering experience and a solid background in providing large financial services organisations with data capability. He maintains a passion for applying engineering solutions to real-world problems, lending his considerable experience to enabling people to make better data-driven decisions.
