I can honestly say that my favorite musical group is Talking Heads. I’ve always been drawn to their catchy riffs as well as their insightful and sometimes quirky lyrics. Songs such as “Once in a Lifetime” have several lines that I find myself muttering at times.
What, might you ask, does this have to do with Enterprise Information Management?
Without a formal data strategy to guide the work of your data team (Business Steering Group, Data Office, and IT), “you may find yourself behind the wheel of a large automobile,” asking yourself, “Well, how did I (we) get here?”
In past Data Warehousing efforts, organizations copied data into a co-located environment and worried about connecting the tables and fields as needed. This created a proliferation of data copies rather than faithful reproductions: the copies fell out of sync over time or were individually manipulated with additional derived fields. These tables lacked good documentation and metadata, so when someone didn’t understand what they were looking at, they created their own version to fit their requirements.
If You Don’t Start to Move Away from Data Warehousing and Take Advantage of the Improved Technology, Your Organization Will Be Stuck in the Past
Over time, without a data strategy and while “Letting the days go by,” the data warehouses grew in size and complexity, and only those with the proficiency, gumption, or concern for job security held the keys to unlock the warehouse. Allowing this to continue kept the data warehouses in the hands of a select few, interpretable by only a small population.
Enter the formal Data Strategy and the evolving technology to manage/automate the approach.
The pathway to understanding what business problems you are trying to solve, whether they are growth, efficiency, or risk-management issues, is all about understanding the requirements of your Business Partners and their Initiatives. (Think of this as the “Why?”)
These are the primary users, manipulators, and explorers of the data: Data Scientists, model builders, business analysts, or report generators. They are trying to get information and answers (Who, Where, and When) into the hands of the decision-makers in the organization, preferably in a way that is easy to understand. This could take the form of a spreadsheet for interpretation or a visualization that tells a graphical story.
One of the most exciting capabilities that has been maturing at an increasing rate is Mastering, or data integration. This potentially real-time integration of data at a level that looks across all customers, products, objects, or locations, for example, allows for an understanding of cross-functional or interdisciplinary information. Having a repository that integrates these views delivers a consistent single version of the truth, access to the most current information, accuracy across several lenses, secure information, and role-based access, to name just a few benefits.
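To make the idea of mastering concrete, here is a minimal sketch of merging customer records from two hypothetical source systems into a single “golden record” per customer. The source names, field names, and the email-based match key are illustrative assumptions; real mastering tools use far richer matching and survivorship rules.

```python
# Minimal sketch of record mastering: merging customer records from two
# hypothetical source systems (CRM and billing) into one golden record.

def normalize_email(email: str) -> str:
    """Use a lowercased, trimmed email address as the match key."""
    return email.strip().lower()

def master_records(sources: list[list[dict]]) -> dict[str, dict]:
    """Merge records that share a match key, letting later sources win on conflicts."""
    golden: dict[str, dict] = {}
    for source in sources:
        for record in source:
            key = normalize_email(record["email"])
            merged = golden.setdefault(key, {})
            # Real tools apply survivorship rules (trust scores, recency, etc.).
            merged.update(record)
    return golden

crm = [{"email": "Ada@Example.com", "name": "Ada Lovelace", "phone": "555-0100"}]
billing = [{"email": "ada@example.com ", "name": "A. Lovelace", "address": "1 Analytical Way"}]

masters = master_records([crm, billing])
print(masters["ada@example.com"]["phone"])    # 555-0100
print(masters["ada@example.com"]["address"])  # 1 Analytical Way
```

Note how normalization lets two superficially different records resolve to the same customer; that cross-source resolution is what enables the single version of the truth described above.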
With their requirements in hand, you now understand what data needs to be integrated, as well as how frequently. (This would be the “How?”) The proliferation of new ETL and Mastering tools leveraging Machine Learning (ML) and Artificial Intelligence (AI) has lent insightful capabilities and created integration paths for data within a methodology supported by new workflow techniques. Some vendors are using ML to identify unknown data fields and map them into known data models, accelerating access to data and improving consistency in classification.
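Vendors typically use trained models for this field classification; as a toy stand-in, the sketch below infers which known model field an unlabeled column maps to by profiling a sample of its values. The pattern rules and field names are assumptions for illustration only.

```python
import re

# Toy stand-in for ML-based field classification: guess which known model
# field an unlabeled column belongs to, by scoring a sample of its values
# against simple value patterns.

PATTERNS = {
    "email":       re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "phone":       re.compile(r"^\+?[\d\-\s()]{7,}$"),
    "postal_code": re.compile(r"^\d{5}(-\d{4})?$"),
}

def classify_column(values: list[str]) -> str:
    """Return the known field whose pattern matches the most sample values."""
    scores = {field: sum(bool(p.match(v)) for v in values)
              for field, p in PATTERNS.items()}
    best_field, best_score = max(scores.items(), key=lambda kv: kv[1])
    # Require a majority of values to match before trusting the label.
    return best_field if best_score > len(values) / 2 else "unknown"

print(classify_column(["a@b.com", "c@d.org", "e@f.net"]))  # email
print(classify_column(["02138", "10001-0001", "94103"]))   # postal_code
print(classify_column(["red", "blue", "green"]))           # unknown
```

A real ML classifier would learn these patterns from labeled columns rather than hand-written rules, but the payoff is the same: unknown fields land consistently in the known data model.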
To understand what data we can bring together, we have to go back upstream to the Acquisition layer. At this layer, an inventory needs to be taken of what you have and have access to (What).
What is accessible (free vs. purchased), what is the value and accuracy of the data, and where is it stored (cloud vs. on-prem)? These are just some of the questions that will need to be answered about what data you have and whether it is worth bringing together.
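One lightweight way to capture those acquisition-layer answers is a structured inventory entry per dataset. The schema below is a hypothetical sketch, not a standard; the field names simply mirror the questions above.

```python
from dataclasses import dataclass, asdict

# Hypothetical data-inventory entry capturing the acquisition-layer questions:
# access, location, accuracy, and business value. Field names are assumptions.

@dataclass
class DatasetInventoryEntry:
    name: str
    source: str           # system of record the data comes from
    access: str           # "free" or "purchased"
    location: str         # "cloud" or "on-prem"
    accuracy_pct: float   # estimated accuracy of the data
    business_value: str   # e.g. "high", "medium", "low"

entry = DatasetInventoryEntry(
    name="customer_master",
    source="CRM",
    access="free",
    location="on-prem",
    accuracy_pct=92.5,
    business_value="high",
)
print(asdict(entry)["location"])  # on-prem
```

Even a simple catalog like this makes the “worth bringing together” decision explicit instead of tribal knowledge.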
Another area of technology that is helping is virtualization: the capability to connect to data sources, combine different data types, and consume the data through multiple platform and delivery systems. We have found that virtualization is a key component of continually contributing value to the organization while you execute on the enterprise data strategy.
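The essence of virtualization is one query interface over sources that are never copied into a warehouse. The sketch below unifies an in-memory “table” and a JSON-style API payload behind a single query function; the source names and row shapes are illustrative assumptions.

```python
import json
from typing import Iterator

# Minimal sketch of data virtualization: one query interface over two
# different source types, streamed on demand rather than copied.

WAREHOUSE_ROWS = [
    {"customer": "Ada", "region": "EMEA"},
    {"customer": "Grace", "region": "AMER"},
]
API_PAYLOAD = json.dumps([{"customer": "Alan", "region": "EMEA"}])

def virtual_customers() -> Iterator[dict]:
    """Stream rows from every source in one common shape."""
    yield from WAREHOUSE_ROWS
    yield from json.loads(API_PAYLOAD)

def query(region: str) -> list[str]:
    """Filter across all sources as if they were a single table."""
    return [row["customer"] for row in virtual_customers() if row["region"] == region]

print(query("EMEA"))  # ['Ada', 'Alan']
```

Because nothing is copied, the virtual layer always reflects the sources as they are now, which is exactly why it keeps delivering value while the broader data strategy is still being executed.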
As we continue down this journey of evolving technology capabilities, we know that there will always be new tools and capabilities available. My suggestion is to find a suite of tools that already have some level of integration across your data strategy components and stick with that toolset (at least until you get your ROI out of the program).
This will save you time in trying to integrate them yourselves and keep you from veering off the path.
Additionally, I suggest establishing a cloud strategy and creating policies you can reuse as you expand. If your organization’s cloud efforts are immature, you will learn the hard way unless you put guidelines together to keep everyone aligned. Finally, find yourself a team of data warehouse experts interested in change and in investing in new personal capabilities.
They are out there, and they are the lifeblood of the evolving data organization. They know what data can be trusted and where it can be found.
If you don’t start to move away from data warehousing and take advantage of the improved technology, your organization will be stuck in the past. It will be surpassed by your competition’s capability to access and use data. And your organization could end up “Same as it ever was…” or worse.