How many times have we been challenged by a knowledge gap in our personal or professional lives? We need to learn a few words in a different language to welcome guests from a foreign country. We need to invest our hard-earned millions to earn the best return at minimum risk. We need to follow some abstruse instructions to assemble a new piece of furniture. We need to apply a mandated new process, procedure or methodology at work. We need to navigate a detour in a construction zone to get to our destination. We need open heart surgery. We need to figure out a new or changed technology. And so it goes.
Life is filled with these know-how gaps, and we have a few options for addressing them. We can acquire the needed information on our own. We can rely on someone who already has the knowledge and experience to educate and guide us. Or we can hand the job off to someone else. Which option we choose depends on a myriad of factors, from how big or critical the challenge is and the time and cost of the options available, to our personal and organizational strengths, interests and inclinations.
In this story, we’ll see how a major financial services organization responded to a mandated regulatory change that required it to identify and track the source and pathway of data deemed critical to accurate and timely financial reporting. We’ll discover that it’s the mastery that counts.
Thanks to Peter Szirmak for the details on this story. Full disclosure here – Peter Szirmak has been a colleague and friend for over a decade. His accolades for the Project Pre-Check practice appeared on the back cover of my first book on the subject. I was not aware of the project covered in this article until recently and had no personal involvement in its conduct.
The global financial crisis revealed shortcomings in the ability of banks to provide timely, complete and accurate aggregation of risk exposures. Inaccurate reporting data proved to have severe consequences not only for individual banks but for the entire financial system. In response to these shortcomings, the Basel Committee on Banking Supervision (BCBS) issued the “Principles for Effective Risk Data Aggregation and Risk Reporting” (RDARR), outlining key areas to strengthen risk data aggregation capabilities and internal risk reporting practices.
One of the requirements of RDARR related to what is called data lineage. Data lineage tracks data from its origin to its destination, along with the different processes involved in the data flow and their respective dependencies, to ensure that its final presentation in financial reports is accurate and transparent.
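To make the idea concrete, here is a minimal sketch of data lineage as a directed graph, with each report field mapped to the upstream elements it is derived from. The element names (`gl_balance`, `ledger_feed`, and so on) are hypothetical, not taken from the project described here.

```python
# Data lineage as a directed graph: each entry says "this element is
# derived from these upstream elements". Empty list = origin system.
# All names are illustrative, not from the actual project.
upstream = {
    "report.total_exposure": ["warehouse.gl_balance"],
    "warehouse.gl_balance": ["ledger_feed.balance", "fx_rates.usd_cad"],
    "ledger_feed.balance": [],      # origin: source system of record
    "fx_rates.usd_cad": [],         # origin: external market data feed
}

def trace_origins(element, graph):
    """Walk the lineage graph back to the elements with no upstream source."""
    sources = graph.get(element, [])
    if not sources:
        return [element]            # no upstream dependency: an origin
    origins = []
    for src in sources:
        origins.extend(trace_origins(src, graph))
    return origins

print(trace_origins("report.total_exposure", upstream))
# ['ledger_feed.balance', 'fx_rates.usd_cad']
```

Once every report field can be traced to its origins this way, the accuracy and transparency the regulation demands become verifiable rather than asserted.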
Our company of interest, a large international financial services organization, faced the same challenge as its peers: ensuring the accuracy and transparency of the data used in its financial reports. In 2014, it launched a project to bring its data lineage into compliance with the new regulations.
The goal was to track the data lineage for 125 different pieces of information and ensure their use in the company’s financial reports was accurate and transparent. The target date for completion of the project was December 2016. The budget allocated to the endeavour was $10 million, based on what the organization felt was a worthwhile investment. Accountability for the project was assigned to the Chief Data Officer (CDO), head of the Enterprise Data Office.
The CDO assigned the project to a senior IT project manager along with three senior data analysts from his organization. Together they considered options for how to tackle the challenge and, after due consideration, recommended a team of 20 data analysts to manually trace each piece of information back to its source.
The organization had a standard practice in place to submit every project to the Enterprise Architecture Group for an assessment of the planned approach and possible alternatives. The practice, called an Initiative Assessment or IA, was a collaborative and iterative exercise developed with the organization’s Project Management Office. Its intent was to build a comprehensive and holistic view of a project’s impact on the organization and beyond. Project managers and teams used the practice to get feedback and insight on a multitude of fronts as the work progressed.
The first IA review about a month after project launch received full support from the review committee. The consensus was to continue exploring and building an understanding of the project’s breadth and depth using the IA practice.
Three months in, the scope was much better understood. The 125 information items had burgeoned to at least 275 data elements in over fifty systems, involving more than ten programming languages and a multitude of scripts and fourth-generation tools. The IA review at that point was able to consider a number of viable alternatives for achieving the project’s goals.
One of the alternatives proposed was to use the data lineage expertise offered by Mapador International Inc. A member of the review team had had previous experience with Mapador and their approach to what they called application cartography. She was most impressed with their mastery in the data lineage domain and believed their expertise would help deliver a comprehensive, high quality solution on time and within budget.
After Mapador’s offerings were reviewed and their references checked, they were invited to bid on the project. With their quote and projected time frame well within the allocated budget, Mapador was welcomed aboard and the work progressed.
Mapador’s approach was to work backwards from presentation to origination. They relied on the company’s data analysts and business owners to provide the necessary context and access. The work was divided into subprojects by information groupings so the work could progress in parallel, reducing the time required. When a dead end was encountered – for example, no obvious source for a data element – the Mapador staff worked with the organization’s business and data resources to solve the problem. A number of external information sources were identified in the process.
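The backward walk described above can be sketched in a few lines: start at the presentation layer and flag any element whose upstream source is not yet known, so the dead ends can be queued for follow-up with the business and data analysts. This is an illustrative sketch, not Mapador’s actual tooling; the element names are hypothetical.

```python
# Hypothetical lineage map built so far, working backwards from a report
# field. An element absent from the map is a dead end still to be traced.
known = {
    "report.credit_risk": ["mart.exposure_agg"],
    "mart.exposure_agg": ["staging.loan_extract", "ref.counterparty"],
    "staging.loan_extract": [],          # traced all the way to its origin
    # "ref.counterparty" has no entry yet: a dead end to investigate
}

def find_dead_ends(start, graph):
    """Return elements reached from `start` that have no lineage entry yet."""
    dead_ends, stack, seen = [], [start], set()
    while stack:
        element = stack.pop()
        if element in seen:
            continue
        seen.add(element)
        if element not in graph:
            dead_ends.append(element)    # unknown source: needs investigation
        else:
            stack.extend(graph[element])
    return dead_ends

print(find_dead_ends("report.credit_risk", known))
# ['ref.counterparty']
```

Partitioning the report fields into independent subprojects, as the team did, works precisely because each backward walk like this touches only its own slice of the graph.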
Mapador’s suite of tools and the staff’s parsing talents incrementally revealed the required data lineages. As part of that process, a knowledge repository evolved to manage the incremental results from the lineage analysis. The repository, accessible through a web browser, allowed the data analysts and business staff to review and approve the results on the go.
And so the work continued.
The project was completed by September 2016 for a total of $4.2 million, including Mapador’s costs. Over 500 data elements were ultimately traced to the 125 information items. Those data elements involved some 97 internal systems and 7 external sources using 17 programming languages, plus the aforementioned scripts and code generation tools.
Mapador’s process also delivered the knowledge repository covering the targeted data and applications. That allowed the company to refresh the information at will to identify possible or actual changes to the lineage profiles.
What a Great Team Learned
There is often a tendency to think inside the box in situations like this. The project didn’t look particularly challenging. It appeared the organization had the skills and resources needed in house. Why look anywhere else? In fact, another large financial services organization chose to hire a boatload of analysts and tackle the analysis themselves. After spending in excess of $80 million, they pulled the plug and called Mapador. Fortunately, the organization we’re concerned with had the IA practice to force teams to think outside the box.
Four key factors provided the foundation for a very successful undertaking:
- A Roadmap – The IA practice was the catalyst for exploring alternatives. In fact, considering both business and technology alternatives along with a myriad of other potential impacts was a prerequisite for any project to get approval and follow-on funding.
- Mastery – This assignment was in Mapador’s sweet spot. That’s what they do, day in and day out. They are masters of code and documentation navigation and exploration. They had the practices, tools and know-how from the get-go. There was no learning curve. They hit the ground running and brought the organization’s staff up to speed on the job. As well, the organization’s IA practice demanded mastery on all fronts for the exercise to work effectively. It was a match well made.
- Collaboration – Culture can play a huge role in an organization’s success or failure, as well as in the outcomes of individual projects. In this case, collaboration was institutionalized with the IA. It wasn’t a formal checkpoint review. It didn’t involve senior executives ruling from on high. It was peers – IT staff, business folks, contractors, architects, data analysts, programmers, whoever was needed – sharing information, ideas and strategies on the best way to achieve success for the team and the organization.
- Iteration – Two of my last three posts in this blog have covered massive project failures that delivered nothing of any substance for years. This organization’s culture embraced iteration through the IA’s process of creeping commitment. When Mapador joined the fray, it was just business as usual. They proceeded on an information item by information item basis, organized sub-projects operating in parallel, vetted results, built the knowledge repository incrementally and proceeded accordingly. That reduced the risks, accelerated delivery and imbued everyone involved with a confidence built on achievement.
These four building blocks enabled this project to deliver the goal set two years prior, with minimal risk, moderate costs and a reasonable time frame. In fact, the IA was really the progenitor of the other three factors: it couldn’t work without mastery, collaboration and iteration, and it sustains that culture.
I haven’t done an exhaustive study on this thought, but I expect you’ll find the presence of these four factors in most successful projects and the absence of one or more in failed ventures. Let me know what you think.
Finally, be a Great Leader. Put these points on your checklist of things to consider so you too can acquire and build the mastery you need to achieve project success. Also remember, use Project Pre-Check’s three building blocks covering the key stakeholder group, the decision management process and Decision Framework best practices right up front so you don’t overlook these key success factors.
Thanks to everyone who has willingly shared their experiences for presentation in this blog. Everyone benefits. First time contributors get a copy of one of my books. Readers get insights they can apply to their own unique circumstances. So, if you have a project experience, good, bad and everything in between, send me the details and we’ll chat. I’ll write it up and, when you’re happy with the results, we will post it so others can learn from your insights. Thanks.
Drew Davison is the owner and principal consultant at Davison Consulting and a former system development executive. He is the developer of Project Pre-Check, an innovative framework for launching projects and guiding successful project delivery, the author of Project Pre-Check – The Stakeholder Practice for Successful Business and Technology Change and Project Pre-Check FastPath – The Project Manager’s Guide to Stakeholder Management. He works with organizations that are undergoing major business and technology change to implement the empowered stakeholder groups critical to project success. Drew can be reached at