Some Decisions AI Can’t Make; For Everything Else, There’s Quality Data

Bake and serve frozen dough: $5 

Package of pepperoni: $4 

Shredded mozzarella cheese: $3 

Knowing not to add glue to your sauce: Priceless 

There are some things artificial intelligence can’t tell you; for everything else, we turn to common sense.  

In all seriousness, though, raise your hand if you have been experimenting with some of the trending new open-source AI (artificial intelligence) models. I know I have. In fact, a quick, nonscientific poll shows that plenty of people are finding ways to test the so-called “intelligence” of AI. It isn’t always going well, including the previously mentioned pizza recipe that suggested glue for stickier cheese.
It’s no wonder that, even with all the advancements in technology, 92% of executives are concerned about the negative impacts of data and analytics on their company’s reputation. 

How did we get to this? 

The Decline of Trust 
According to a 2018 study conducted by FMI, construction companies were producing over 2.5 quintillion bytes of data daily. And those are 2018 numbers; with data growth where it is today, it would be no surprise if that figure had doubled by 2024. Yet, with all this incredible information at our fingertips, 96% of all data captured continues to go unused. Why is that?

For starters, information remains difficult to find. Have you ever been asked to produce a “quick” report analyzing the historical trend of steel prices per ton over the last few years? Is a report like this ever that quick? Going back to the FMI study, construction teams are spending 1.5 hours or more per day chasing down project data. Other studies report that up to 11 hours per week are wasted trying to track down the right data.

On top of that, additional studies reveal a lack of quality data. Whether you look at construction specifically and hear that 14% of all rework is caused by bad data, or you look across industries and see a $3 trillion per year price tag on poor data quality, the numbers speak for themselves. We see story after story every week of a project that is over budget or way behind schedule, and yet teams seem surprised by the revelation.

At the heart of these issues is data segregation. Despite the advent of project management platforms over the last few decades, companies are struggling to consolidate their tech stack. In fact, they’re moving backward. The average digital worker of 2023 relied on 11 different applications each day, up from 6 in 2019.  

Lower productivity, increased headcount and higher overall cost of usage are all negative impacts of juggling multiple tools. Add to that the realization that many of these applications don’t integrate data at all, and data quality issues begin to compound.

Suddenly it becomes easy to see why 65% of executives just don’t trust their use of data. 

The Search for Truth 
So, what does it take to consolidate the data silos and eliminate the suggestions of glue in our sauce?

It begins with the understanding that true data standardization requires a centralized platform, and platform is the key word. As we already discussed, even with the advent of popular “project management platforms” across the industry, many companies are left using a plethora of systems that may or may not integrate.

How, then, is “platform” also the answer? Well, simply put, because those aren’t platforms. They are software as a service (SaaS) products built on a foundation that was locked in years ago. Any future changes or additional functionality must happen through a partner network or acquisition. That makes data quality not only difficult, but also costly.

Kahua, which means platform in Hawaiian, was built to solve this problem. By offering a SaaS solution out-of-the-box that is built on a platform as a service (PaaS), companies can begin centralizing data collection from day one with the freedom to extend and grow the application set as they mature. This true consolidation of solutions reduces the challenges of fragmented technology and data by creating a seamless thread of business intelligence that is accessible from any device.  

More importantly, though, this is a collection of your data. Not industry-wide metrics. Not best-case-scenario cost libraries. Not random user comments from Reddit. This is your information, collected on your platform, to deliver your analytics. 

And when your folks are empowered to move past simply trying to connect the dots to dealing with a single, coherent and optimized interface, that’s when high quality data delivers insights into true project trends and potential risks.  

The Value of Quality Data 
As regulations and budgets continue to tighten and the late delivery of high-profile projects litters the news, how do you put a price on high quality data? What if you could identify declining trends on a project early enough to take corrective action? What would that mean for your organization? Look at the data; it could mean delivering that next mega project on time and on budget. Imagine that. 

Total cost of a PMIS: Yeah, it varies based on the need. 

Total cost of change management and implementation: Sure, there might be a bit of a learning curve. 

The value of quality data to inform your project decisions: Priceless. 

*** 

Discover how Kahua can increase the value and accessibility of your data: Raising the Bar for Quality Data. 

 

About the Author

AJ Waters is the Chief Evangelist at Kahua, leveraging his extensive experience as Vice President of Industry Solutions at InEight and as a program manager at Google to champion innovative solutions in the construction industry. With a background as a structural engineer at Kiewit, AJ combines technical expertise with a passion for advancing customer profitability and agility.
