September 26, 2016
Spark and its Potential for Big Data Over the last 18 months or so, we at Pentaho have witnessed the hype train around Spark crank into full gear. The huge interest in Spark is of course justified. As a data processing engine, Spark can scream because it leverages in-memory computing. Spark is flexible – able to handle workloads like streaming, machine learning, and batch processing in a single application. Finally, Spark is developer-friendly – equipped to work with popular programming languages and simpler than traditional...
September 23, 2016
We recently published research commissioned from iGov revealing that despite 76% of respondents believing that Big Data could benefit their organisations, only 26% regard it as a top priority. This 50-point adoption chasm exists despite 79% of survey participants stating they believed big data could help improve efficiencies and reduce costs, and 74% recognising the potential for big data to help deliver personalised services. For a big data pioneer like Pentaho, this doesn’t come as a surprise. We’ve acknowledged the fact that Hadoop is hard...
September 16, 2016
Today, we are excited to announce the winners of the 2016 Pentaho Excellence Awards! Now in its third year, the annual award program recognizes our customers for the most innovative and impactful implementations of Pentaho technology. The submissions were evaluated based on the creativity, impact and business value of their Pentaho deployment, and choosing this year’s winners was no easy task. In fact, we received double the number of submissions compared to last year. Our customers represent a wide variety of vertical markets, including government,...
September 12, 2016
If you are in an IT role responding to data prep needs from the business, you have likely seen some variation of a citizen data scientist emerge at your company. Viewed as the most plausible answer to a shortage of Ph.D. data scientists, citizen data scientists work with data but may not have a formal education in business intelligence and statistics, as this article in Forbes notes. But as 451 Research recently reported, “the role might need some careful handling in order to live up...
September 2, 2016
1) Tell us about yourself. My name is Pedro Alves; I’m the SVP for Community, Pentaho Product Designer and GM of Webdetails (aka Pentaho Portugal office). It may sound like a lot of stuff to do, but it’s actually very useful; I often tell people I have to focus on one of my “other” jobs and just go to the beach to enjoy the amazing Portuguese weather, food and drinks! 2) How are customers contributing to the community today? Customers and partners are actually some...
August 26, 2016
For a long time, car insurance companies based insurance premiums on aggregated data such as age, gender, and type and color of vehicle. As a result, if you were a young male driver, you were probably paying a higher premium, as this demographic had traditionally been involved in more accidents. But what about young male drivers who drive perfectly safely and have never caused an accident? Usage-Based Insurance (UBI), a somewhat recent innovation by auto insurers, closely aligns driving behaviors with premium rates for auto...
August 22, 2016
I speak to a lot of customers who are all facing the same issue – they have a limited IT staff, they have a shrinking budget, they are using legacy tools to manage their growing data needs, and they just don’t have enough time to accomplish it all. We have all heard the statistic from Ventana Research that organizations spend 46% of their time preparing data and a whopping 52% of their time checking for data quality and consistency. That means IT groups responsible...
August 19, 2016
A blueprint for big data success What is the “Filling the Data Lake” blueprint? The blueprint for filling the data lake refers to a modern data onboarding process for ingesting big data into Hadoop data lakes that is flexible, scalable, and repeatable. It streamlines data ingestion from a wide variety of data sources and business users, reduces dependence on hard-coded data movement procedures, and simplifies regular data movement at scale into the data lake. The “Filling the Data Lake” blueprint provides developers with...
August 16, 2016
The time has come again to recognize the outstanding achievements of our customers using Pentaho solutions for big data analysis and integration. That’s right, the 2016 Pentaho Excellence Awards are upon us once again, and we just sent this year’s submissions to our judges for review. By the time our nomination process closed, we had received more than double the number of submissions compared to last year. The entries come from around the world and represent a wide variety of vertical markets, including government, healthcare and retail...
August 12, 2016
Modern Data Preparation Roadblocks by Kevin Haas - Partner, Inquidia Consulting Now that we’re in the summer heat of election season, with political rhetoric filling the air, all eyes are focused on the potential results of the democratic process. I’m not writing, however, to discuss the power of the people to drive candidates, but rather the power struggle happening inside businesses around using technology to solve problems. In particular, I am referring to a growing trend wherein business analysts have more control than ever to prepare and...