Procurement data quality (III): the must-have to optimize spend management
Last week, we saw how much work is required to fully harmonize the data. Indeed, because information comes from so many different sources, analysing spend is tedious work. So, are there any proven solutions for cleaning the data?
Episode 3: Achieve a consolidated and unique vision through a Procurement solution
An alternative approach, says Amar Ramudhin, professor and director of the University of Hull’s Logistics Institute, is to use the ‘Rosetta Stone’ dictionary for spend analysis purposes, but continue to allow individual business units to retain their own coding conventions for transaction purposes. This has the merit of being a lighter-touch approach, he explains, but with an ever-present danger that ‘data drift’ will emerge as business units create new and incompatible codes when they buy new items and engage with new suppliers. “From an implementation perspective, it’s easier to work with, but it requires more maintenance,” he says. “There needs to be discipline when creating new codes, otherwise the mapping to the shared dictionary will be lost, with a consequent degradation of data quality.” All of which serves to underscore the difficulty facing spend analytics projects in businesses where distributed data and data quality issues are part of the landscape. There’s a vital need for a single coherent view of the business’s spend data, yet no simple way of achieving it. What’s more, it’s a problem where outsourcing providers have little to offer: critical pieces of the puzzle are held by people inside the business – and generally, inside the heads of people within the procurement function.
No wonder, then, that a number of businesses take the approach adopted by Vodafone: accepting that enterprise-wide recoding and data governance is an unattainable objective, and instead providing the procurement function with business intelligence tools to manually build that single version of the truth, in whatever form seems likely to offer a cost-effective and timely solution. It’s a recognition that is not only pragmatic, but one that acknowledges the huge strides made by business intelligence technologies in recent years – self-service reporting, ‘fuzzy logic’, cloud-based data warehouses and analytics libraries, and in-memory processing. Moreover, it’s an approach which – as at Vodafone – at least sets out to consolidate all the enterprise’s spend data, rather than just a small subset deemed to be more strategic. The narrower, subset-only alternative may be superficially appealing, but it is rarely a good idea in practice, say insiders. “By attempting to redefine the data consolidation challenge so as to make the task simpler and more achievable, there’s a temptation to concentrate only on the very largest areas of spend and on the most important categories,” says Cranfield School of Management’s Saghiri. “The danger with this is that less strategic areas of spend don’t get analysed at all – despite potentially offering some valuable insights and savings.”
Austria’s Erste Group Bank, which operates in eastern and central Europe, is one organisation that has taken just such a business intelligence-driven approach, says Christof Grossfurtner, the bank’s head of process management and IT. Being, as he puts it, “more of a group of banks than a banking group,” it was important for the Erste data quality project to avoid imposing a central data governance model on individual banks. Yet equally, he stresses, the potential gains on offer through spend analytics were too important to ignore. The solution: a manual approach, spearheaded by the procurement function, to allocate procurement spend into one of 160 categories of spend set up inside the bank’s QlikView business intelligence tool. The goal, he explains, was to extend this to a minimum level of 96% of spend at the third level of disaggregation, where ‘facilities management’ might be the first level of disaggregation, ‘building management’ the second, and ‘cleaning services’ the third. Aggregating data from around 40 sources, the process took three years to reach the 96% level, recalls Grossfurtner. Coverage has since increased to 98%, yielding a number of benefits aside from the usual procurement-specific advantages. Improved visibility into compliance and sustainability, for instance, has been a significant – if initially unexpected – win for the project.
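Grossfurtner’s 96% target is, in effect, a coverage metric: the share of total spend classified all the way down to the third level of the category tree. A minimal sketch of how such a figure might be computed (the transactions, amounts, and category paths are invented; only the three-level structure comes from the example above):

```python
# Hypothetical spend records: (amount, category path level1>level2>level3).
# A path of None, or a path with a missing level, counts as uncovered.
transactions = [
    (120_000, ("Facilities management", "Building management", "Cleaning services")),
    (80_000,  ("Facilities management", "Building management", "Maintenance")),
    (45_000,  ("IT", "Software", None)),   # classified only to level 2
    (5_000,   None),                        # unclassified spend
]

total = sum(amount for amount, _ in transactions)
covered = sum(
    amount
    for amount, path in transactions
    if path is not None and all(level is not None for level in path)
)

coverage = covered / total
print(f"Level-3 coverage: {coverage:.1%}")  # 200,000 of 250,000 -> 80.0%
```

Weighting coverage by spend value, rather than by transaction count, matches the way the 96% and 98% figures are quoted in the article.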
Even so, not every organization will want to wait three years to build a single consolidated view of its procurement spend. Inevitably, then, the question turns to how to consolidate and clean spend data on an accelerated timescale, but without losing the rich insights into procurement spend that come from having procurement expertise engaged on the project. At Germany’s Fresenius Medical Care, a manufacturer and therapy provider that specializes in renal care, global procurement performance management director Pascal Zuber can point to one such solution: the company’s use of a data tool from the digital procurement solution provider SynerTrade. This, he explains, is useful in helping to harmonise, validate, and clean master data from Fresenius’s various ERP systems around the world. Built around an in-memory database for high-performance processing, the solution iteratively applies several dozen business rules and algorithms, gradually building an ever-cleaner picture of spend, typically until accuracy reaches 99.9%. Once established, with accurate mappings in place, the rules and algorithms can then be re-run as required, enabling companies to conduct spend analytics at will – but without requiring recoding or data governance exercises within individual business units. “We had all the classic problems of a distributed procurement organisation working on distributed systems,” says Zuber. “We had the same suppliers supplying different sites, but didn’t know it. We were buying the same things under different codes, but didn’t know it. And we were paying different prices for the same product – even from the same vendor. And with no way of bundling demand together, there was no way of going to suppliers to solicit collective bids at advantageous prices and terms.”
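SynerTrade’s actual rules and algorithms are proprietary, but the kind of problem Zuber describes – the same supplier appearing under different spellings across ERP systems – can be illustrated with a simple fuzzy-matching pass using only Python’s standard library (the supplier names, suffix list, and similarity threshold are all illustrative, not the vendor’s method):

```python
from difflib import SequenceMatcher

def normalise(name: str) -> str:
    # Strip punctuation and common legal-form suffixes before comparing,
    # so "Acme GmbH" and "ACME Gmbh." reduce to the same key.
    cleaned = name.lower().replace(".", "").replace(",", "")
    for suffix in (" gmbh", " ag", " inc", " ltd", " llc"):
        cleaned = cleaned.removesuffix(suffix)
    return cleaned.strip()

def group_suppliers(names: list[str], threshold: float = 0.85) -> list[list[str]]:
    # Greedy single-pass grouping: each name joins the first existing
    # group whose representative it resembles closely enough.
    groups: list[list[str]] = []
    for name in names:
        for group in groups:
            ratio = SequenceMatcher(None, normalise(name), normalise(group[0])).ratio()
            if ratio >= threshold:
                group.append(name)
                break
        else:
            groups.append([name])
    return groups

records = ["Acme GmbH", "ACME Gmbh.", "Acme Inc", "Globex Ltd", "Globex LLC"]
for group in group_suppliers(records):
    print(group)
```

A production pipeline would layer many such rules – address and tax-ID matching, code mappings, currency normalisation – and, as the article notes, re-run them iteratively as the picture of spend gets cleaner.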
Today, the SynerTrade solution has become central to the company’s procurement activities at both a strategic and tactical level, drawing together data from four instances of SAP systems and five other non-SAP solutions. It particularly helps create transparency and bring “unknowns” to the surface, so that efforts can be focused more effectively. “The original goal was – and still is – a single version of the truth, across the company,” Zuber says. “What we wanted was global transparency into our current contracts so we could see who was buying what and where. We didn’t have this transparency. Using one SynerTrade solution globally has helped us to achieve that goal.”