Automation, and especially re-automation, of business functions can provide huge benefits to any company. Business efficiencies and cost reduction are maximized if the opportunity is taken to re-engineer the business processes, not just to upgrade legacy processes that were captured years ago.
One component of the re-engineering process that is often overlooked is how data will be sourced and integrated into the new application. Ideally, data should come from what is called a “golden source”, defined as the accurate, authoritative source of data for the application.
The problem is that few firms have implemented a “golden source” data strategy.
Thus, compromises are made to access data from the easiest source, so that a system can “go live” and realize the business benefits as soon as possible. This is pragmatic, and I would agree with it, as long as a second step is committed to: a defined phase 2 of data integration that allows time to establish the “golden source” for the data being processed.
At an extreme level, the phrase “garbage in, garbage out” comes to mind! While not all situations are that extreme, inaccuracies or inconsistencies arise when data is not sourced from a verified “golden source”, and they can lead to poor decision-making. A great example of a “golden source” would be a direct feed of an exchange drop copy for orders and trades. Once this “golden source” has been defined, it can be used by other applications that need that data. Over time, the murky waters of most databases become clear and output becomes more reliable.
We implement our KRM22 Risk Cockpit initially for operational risk managers, to provide management oversight of business processes. The first implementation of our Risk Cockpit absorbed data by scraping emails and integrating spreadsheets, so that early wins could be achieved with the use of the Cockpit. Once the business purpose was understood, we advanced to the “golden source” through API integration to the underlying system. Open API integration is simple, but defining the specific “golden source” can take time. The results delivered short-term wins via validation exercises, and then a long-term win by reducing complexity and cost while improving the accuracy and timeliness of enterprise risk information.
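The validation exercises mentioned above amount to reconciling the interim source (scraped emails, spreadsheets) against the golden source. A minimal sketch of that idea, in generic Python (the field names and `reconcile` helper are illustrative assumptions, not the Risk Cockpit’s actual implementation):

```python
def reconcile(interim, golden, key="trade_id"):
    """Compare records from an interim source (e.g. spreadsheet rows)
    against the golden source, returning ids that are absent from the
    golden source and ids whose fields disagree."""
    golden_by_id = {row[key]: row for row in golden}
    missing, mismatched = [], []
    for row in interim:
        g = golden_by_id.get(row[key])
        if g is None:
            missing.append(row[key])       # interim record with no golden counterpart
        elif g != row:
            mismatched.append(row[key])    # same id, different values
    return missing, mismatched

# Hypothetical sample data: one matching trade, one quantity discrepancy.
interim = [{"trade_id": "T1", "qty": 100}, {"trade_id": "T2", "qty": 50}]
golden  = [{"trade_id": "T1", "qty": 100}, {"trade_id": "T2", "qty": 55}]
missing, mismatched = reconcile(interim, golden)
# missing is empty; mismatched flags "T2" for investigation
```

Each flagged id becomes an exception to investigate, which is exactly where the short-term wins of a phase 1 rollout come from.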
KRM22’s Global Risk Platform is unique in that “golden sources” of data are absorbed once and then promulgated across multiple risk functions and applications. For example, a FIX drop copy feed from an exchange can be absorbed once, then used to manage everything from real-time P&L and margin management to complex stress models, VaR assessments and market surveillance.
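The absorb-once, use-many-times pattern is essentially publish/subscribe: one ingest path normalizes each drop-copy event, and every downstream risk function consumes the same record. A minimal sketch under that assumption (the class and consumer names are illustrative, not KRM22 APIs):

```python
class DropCopyHub:
    """Absorb a (simulated) exchange drop-copy feed once, then fan each
    normalized execution event out to every registered risk consumer."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def on_execution(self, event):
        # One ingest path; every downstream function sees the identical record,
        # so P&L, surveillance, etc. can never disagree about the source data.
        for handler in self.subscribers:
            handler(event)

# Hypothetical downstream consumers fed from the single golden source.
pnl = []
surveillance = []

hub = DropCopyHub()
hub.subscribe(lambda e: pnl.append(e["qty"] * e["price"]))   # real-time P&L
hub.subscribe(lambda e: surveillance.append(e["order_id"]))  # market surveillance

hub.on_execution({"order_id": "A1", "qty": 100, "price": 50.25})
```

Because each consumer subscribes to the hub rather than to its own copy of the feed, adding a new risk function is one `subscribe` call, not a new integration.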
While there is no one path to technical achievement, we know we can achieve success if we take a pragmatic approach to data in a phase one roll-out of new technology, but stay committed to leveraging our “golden sources” of data in subsequent phases.
Take this from a verified, golden source!