4 Comments
Jeff

So many dbt models feeding so many Power BI models, yet the business can't answer simple questions without talking to an engineer, a manager, and three analysts.

8Lee

Sadly, so many of the issues that come up in the supposed paradox can be easily mitigated if the right principles and practices are in place.

For instance, sure, these tools will allow increased individual speed, but if the whole team is working from the same ruleset, then they will all move at the increased speed together, with very predictable failure modes.

The issue is when one person is moving at 100mph, another is at a variable speed between 60 and 80mph, and the rest of the team is at 40mph. Imagine if everyone was moving at 70mph, together. It's like how cycling teams win as a team by riding at a shared pace, even if one rider has intrinsically more horsepower.

It's the delta between humans, not just the models or agents they use (or refuse to use).

Maury Carollo

"We gave more people access to data, yet many companies are still struggling to answer basic questions and in turn struggling to make decisions." There is a difference between data and creating information to make decisions. Have we forgotten to deliver usable data that answers the business questions? I'm seeing more tools removed due to budget constraints and lack of demonstrated value. Also, and unfortunately, I'm seeing layers removed and operational data just plopped into a data lake with no regard for delivering useful information that can be connected and analyzed quickly. Reduce the engineers, add more business people who, instead of doing their daily business jobs, have to build very complex, unmaintainable, resource-intensive Power BI reporting models.

Peter Andrew Nolan

Hello Ben, this whole idea of "data pipelines" was doomed from the start, just as much as the ideas of Hadoop or "schema on read" were doomed from the start. Ideas like "medallion architecture" were nothing but marketing slogans.

We figured out how to build data warehouses in the mid-90s, and we have improved on those processes ever since. In the early 90s we could map maybe 100-200 fields to the target data warehouse in a work month. Heck, in 1994-95 I spent 18 months building two fact tables and about a dozen dimension tables with fewer than 1,000 fields in the lot of them.

Today we can map 12K to 15K fields in one work month, and we can maintain that ETL for pennies. We can now send the source data for one large operational system, installed at many accounts, into one target mega model, so that we have only one set of ETL per N customers.

And we know how to use this data to make millions, tens of millions, hundreds of millions, and, in the case of one of my customers, more than a billion dollars in new profit per year.

Ben, it is not that we do not know how to deliver long-term, sustainable profit improvement to enterprises using the data inside the enterprise as well as data from outside it.

It is that almost no one really wants to do that. Companies are much more concerned with "how do we hire more DEIs?" than "how do we make more profit?".

One day that will change and men will actually ask me for my help to make more money again.