by Zachary Zeus | Jul 20, 2017 | BizCubed, How To, Main Blog, Pentaho, Resources
What is the “Filling the Data Lake” blueprint? The blueprint for filling the data lake refers to a modern data onboarding process for ingesting big data into Hadoop data lakes that is flexible, scalable, and repeatable. It streamlines data ingestion from...
by Zachary Zeus | Jul 17, 2017 | BizCubed, How To, Main Blog, Pentaho
Recently Pentaho announced the release of Pentaho 7.1. Don’t let the point release numbering make you think this is a small release. This is one of the most significant releases of Pentaho Data Integration! With the introduction of the Adaptive Execution Layer (AEL)...
by Zachary Zeus | May 29, 2017 | Analytics at Operational Scale, How To, Main Blog, Pentaho, Resources
Data preparation is the foundation of any analytics project, but it has become more complex than ever as organizations struggle to handle new, messy data types and growing data volumes. Teams that want to build competitive advantage through analytic insight must...
by Andrew Cave | Apr 10, 2017 | How To, Main Blog, Resources
Pentaho offers a step that handles the process of connecting to an SFTP server and putting and getting files, and (as always) orchestrating the steps leading up to and after the file transfer. The step relies on the Java implementation of the SSH2 connectors done by...
by Andrew Cave | Apr 4, 2017 | How To, Main Blog, Resources
As a programmer (and the lonely Windows hold-out at BizCubed), I have a strong habit of hitting ctrl+s to save whatever I am working on, early and often. As a wobbly typist, I also have a tendency to clip the wrong key. So what’s next to the...
by Zachary Zeus | Mar 6, 2017 | Analytics Products, How To, Main Blog, Resources
The CTools are a set of community-driven tools installed as a stack on top of the Pentaho BA server. They extend the Pentaho platform’s capabilities by providing a framework for more flexible development. This blog covers a few tips and tricks for developing CCC...