Versions of truth

The move toward paperless offices and the consequent digitising of data has led to the amalgamation of vast amounts of information. For government agencies, corporations, not-for-profits and service organisations, data has amassed to the point that we specifically reference its magnitude. Data ‘warehouses’, ‘big’ data and data ‘lakes’ are all terms coined to describe the sheer scale of the data we are gathering and accessing.

But how do you know whether the data you rely on is valid and mutually acknowledged as a truthful source? How can you be certain the data used to report a metric in one department is the same as that used in the next, when each department keeps its own databases and rarely collaborates? You turn to a single source of truth (SSOT).

Although difficult to achieve, more and more organisations, including government agencies, are now considering implementing a single source of reliable, verified data on which to base decision making.

This whitepaper discusses the concept of SSOT, the challenges and barriers to adopting it, and the factors an organisation must consider when deciding to adopt SSOT as a solution to the data deluge.

What is a single source of truth?

“Single source of truth” is a concept whereby organisations develop a system design that stores each data element only once. That means one employee name, one employee address, one list of external training suppliers, and so on.

At first glance you might believe SSOT refers to having only one database within which everything is stored once. For many this is the gold standard, but in reality organisations usually have multiple databases, each holding some, but not all, of the data. Although data items may be spread across multiple databases, an SSOT solution seeks to ensure that no piece of data is duplicated across them. So, where you have databases holding information specific to HR, Finance or Sales, the SSOT concept holds that there will be only one instance of any given data point, eg a supplier name, across all of them. That data point is then understood to be the truth.
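
As a rough illustration of the idea only (the schema, table names and values below are hypothetical, not a prescribed design), a shared record can be stored exactly once and referenced everywhere else by key:

```python
import sqlite3

# Minimal sketch: the supplier's details live in exactly one table;
# HR and Finance records reference that single row by key rather than
# storing their own copies of the name.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE supplier (
    supplier_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL UNIQUE      -- stored once, for everyone
);
CREATE TABLE hr_training_course (
    course_id   INTEGER PRIMARY KEY,
    title       TEXT NOT NULL,
    supplier_id INTEGER NOT NULL REFERENCES supplier(supplier_id)
);
CREATE TABLE finance_invoice (
    invoice_id  INTEGER PRIMARY KEY,
    amount      REAL NOT NULL,
    supplier_id INTEGER NOT NULL REFERENCES supplier(supplier_id)
);
""")

conn.execute("INSERT INTO supplier (supplier_id, name) VALUES (1, 'ABC Training Inc.')")
conn.execute("INSERT INTO hr_training_course VALUES (10, 'Safety Induction', 1)")
conn.execute("INSERT INTO finance_invoice VALUES (500, 2400.00, 1)")

# Both departments resolve to the same, single supplier name.
row = conn.execute("""
    SELECT s.name, c.title, i.amount
    FROM supplier s
    JOIN hr_training_course c ON c.supplier_id = s.supplier_id
    JOIN finance_invoice i    ON i.supplier_id = s.supplier_id
""").fetchone()
print(row)   # ('ABC Training Inc.', 'Safety Induction', 2400.0)
```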

What created the problem?

The requirement for SSOT has been driven by advances in technology. In the early days, data was stored on tape attached to a single, typically large, computer. As technology advanced and cost ceased to be a barrier to entry, organisations invested in two or more computers. Once there were two sources of data, there came a requirement to share data between them. A new concept, ETL (Extract, Transform and Load), was born. Using ETL, multiple databases could be created, copied and shared.
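
To make the term concrete, the sketch below shows the general shape of an ETL step in Python. The file name, column names and cleanup rules are hypothetical; real ETL pipelines are considerably more involved.

```python
import csv
import sqlite3

def extract(path):
    """Extract: read raw rows from a department's CSV export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: coerce types and normalise field names and values."""
    return [
        (row["supplier"].strip().upper(), float(row["amount"]))
        for row in rows
    ]

def load(records, conn):
    """Load: write the cleaned records into the target database."""
    conn.execute("CREATE TABLE IF NOT EXISTS spend (supplier TEXT, amount REAL)")
    conn.executemany("INSERT INTO spend VALUES (?, ?)", records)

# Usage, assuming a hypothetical export called finance_export.csv:
# conn = sqlite3.connect("warehouse.db")
# load(transform(extract("finance_export.csv")), conn)
```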

Fast-forward a few decades and organisations capture data from innumerable sources. Typically, data tends to be stored in silos specific to a function, such as sales, HR or finance. Informally, even more data might be stored in spreadsheets, local databases or other documents. This siloed and informal data adds to the complexity of the SSOT concept because, with the exception of finance, there are few controls around the data itself.

Where an HR department could have a list of suppliers that includes ABC Training Inc., a finance team could hold a record for the same supplier stored as ABC Training Sydney, ABC Training London, or just ABC Training.
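
A toy example of why such variants are hard to reconcile automatically: the cleanup rules below are purely illustrative and far cruder than real entity-resolution tooling.

```python
import re

# Naive canonicalisation: strip common company suffixes and location
# tokens so the variants collapse to one key. Real matching is much
# harder (legal entities, typos, genuinely different branches).
SUFFIXES = {"INC", "INC.", "LTD", "PTY", "SYDNEY", "LONDON"}

def canonical(name: str) -> str:
    tokens = re.split(r"\s+", name.upper().replace(",", ""))
    return " ".join(t for t in tokens if t not in SUFFIXES)

variants = ["ABC Training Inc.", "ABC Training Sydney",
            "ABC Training London", "ABC Training"]
print({canonical(v) for v in variants})   # {'ABC TRAINING'} -- one key, four spellings
```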

Siloed databases were not typically integrated, but the advent of Business Intelligence made that seem like less of a problem: using simple tools, a user could extract data from some or all company databases and present it as truth. Cloud computing, mobility and the rise of social media have also contributed to the deluge of data. The culture of on-demand output has been assisted by the breadth of data sources on which people can draw, and Artificial Intelligence can derive truths from disparate sources, further challenging the validity of a given data insight. Together, these forces have created what are often known as versions of truth.

A version of truth is an insight based on a data source configured to support a particular view, eg sales data compared to HR data. Each may store data under a common term, but what is stored is not necessarily the same thing. Consider, for example, a measure as apparently simple as revenue.

Just as the Inuit people have multiple words for ‘snow’, most companies have many words for revenue. They typically use terms such as ‘gross revenue’, ‘net revenue’ or ‘invoiced revenue’. But of these, which revenue is actually revenue? Within finance departments, the answer may be ‘all of them’, depending on context, but to wider business teams revenue will probably mean something else. So imagine the difficulty when HR presents to the executive team and introduces a version of truth with ratios such as revenue per FTE or labour cost as a percentage of revenue.

After finance has finished qualifying the data on which the ratio was calculated, the insight has lost credibility amongst the team it sought to influence. Indeed, versions of truth that are based on siloed data represent one of the biggest challenges to data-driven decision making.
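
A worked example with purely hypothetical figures shows how much the choice of ‘revenue’ matters:

```python
# Hypothetical figures only: the same ratio, computed from two
# different definitions of 'revenue', tells two different stories.
fte = 200
gross_revenue = 50_000_000      # before discounts and returns
net_revenue = 41_000_000        # after discounts and returns

print(f"Revenue per FTE (gross): {gross_revenue / fte:,.0f}")  # 250,000
print(f"Revenue per FTE (net):   {net_revenue / fte:,.0f}")    # 205,000
```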

Why are versions of truth challenging?

Every organisation is data-driven. Decision makers need to know they are basing strategic decisions on accurate data that is universally accepted as true and representative. Without data on which it can rely, an organisation faces potential problems, including:

Confidence

Decision makers must have confidence that the data on which they act is accurate and correct; if it is not, then depending on the nature of the decision, the results could be disastrous. A 2013 study, The State of Data Quality: Current Practices and Evolving Trends, conducted by Gartner found that businesses were losing US$14.2 million per annum due to incorrect data. The same study predicted a 40% p.a. increase in corporate data growth, compounding the risk year on year.

If even 20% of that data is incorrect, organisations must endeavour to understand and regulate data quality or risk relying on data sources with little integrity.

According to KPMG’s 2016 Global CEO Outlook, 84% of CEOs lack confidence in the quality of the data on which they base decisions. Without confidence in the data on which they rely, executives will be less prepared to support new ideas or new investments in technology or quality initiatives.

Lost Growth

Where people do not trust the quality of their data source, they may well opt not to decide at all. The preference for no decision over a wrong decision can lead to stagnation: no decisions lead to no actions, which hampers business growth and performance and fosters an unwillingness to pursue new opportunities. Alternatively, decision processes are slowed as executives seek verification of the data on which they rely.

Time spent verifying data can often translate into lost business. Organisations with better data insights can identify opportunities that others will miss. Consequently, one organisation may be proactively exploiting intelligence suggesting the need for a new product or service that another organisation hasn't even recognised as a trend.

Compliance

Every industry is subject to regulation. Organisations that do not maintain good quality data are at risk of non-compliance and the associated penalties, and as regulatory environments evolve and change, organisations must be able to keep pace. Single Touch Payroll in Australia, Payday Filing in New Zealand and the Real Time Information process in the UK all require organisations to implement strict controls over data relating to HR and payroll.

Organisations must ensure their data complies or risk significant penalties.

Reputation

Small inconsistencies in data can damage corporate reputations. The media regularly reports on telephone companies that refuse to cancel the contracts of deceased customers, or on bills sent with incorrect personal details. More seriously, transacting with organisations that are sanctioned or known to be involved in nefarious acts can cause significant reputational damage.

Why isn’t SSOT adopted by everyone?

With the growing realisation that siloed data is impacting the bottom line, many organisations are charging their CFOs with the task of developing an SSOT. Recent research by software developer Adaptive Insights helps clarify the challenges organisations face when they begin their SSOT journey. In a survey of more than 430 CFOs around the world, it found that:

  • 41% of respondents had three to five distinct data sources within their organisations
  • 30% of respondents had five or more distinct data sources within their organisations
  • 59% of respondents expected, over the next five years, a 50% increase in the amount of data they will gather and manage
  • 68% said they were, at best, “average” at integrating operational data with analytics or dashboards
  • 47% manually aggregate data and 3% don’t try at all
  • 27% rely on a consolidated data warehouse or a BI tool

When respondents were asked what the barriers to adopting an SSOT were, the survey found that:

  • 37% identified legacy technology and data structures
  • 27% identified too many data sources to review and consolidate
  • 27% identified lack of collaboration within the organisation
  • 27% cited cost

What is required to implement an SSOT?

Organisations seeking to implement SSOT need to appreciate the complexity of the task they are contemplating. Every organisation is different, with its own technology architecture, so there is no single solution that all can adopt. What follows is a discussion of the factors organisations need to consider when planning an SSOT solution.

Commitment and buy-in

The decision to create an SSOT architecture requires significant change management. To change how an organisation gathers, stores and uses its data, commitment and buy-in are required from its executive team. Hence, the executive team needs to understand the concept and the benefits delivered by the SSOT approach to data management. For further information on Change Management, refer to Frontier Software’s e-book and article on the topic.

Define the metrics

If you don’t know which revenue figure to use in your HR ratio, then you don’t have an SSOT. One of the critical tasks is to identify the key metrics your organisation uses to make its decisions. Consensus among executives and decision makers is crucial. Having several data points for ‘revenue’ is fine, but the executive team must agree on which revenue data point informs each insight.

Do not underestimate the importance or magnitude of this task. You must understand where every data point resides within your system and then determine which data points will inform which insights. Depending on your performance metrics, this could be a difficult and time-consuming process.

Once agreed, however, the fear of versions of truth can be allayed. If the version of truth relied on by sales uses only the mutually agreed figure for revenue, then there is no need to question the validity of the insight. Multiple versions of truth are useful for line management to add insight to the raw data. Senior management and the Board, however, must determine the set of metrics, sourced from the SSOT, that are acceptable for use throughout the organisation.
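
One lightweight way to make that agreement explicit (a sketch only; the metric names, sources and owners below are hypothetical) is a shared metric dictionary that records, for each approved metric, the single agreed source and definition:

```python
# A minimal, hypothetical metric dictionary: every organisation-wide
# metric names its single agreed source and definition, so any report
# that uses 'revenue' is using the same revenue.
METRIC_DICTIONARY = {
    "revenue": {
        "source": "finance.general_ledger",
        "definition": "invoiced revenue net of discounts and returns",
        "owner": "CFO",
    },
    "fte": {
        "source": "hr.payroll",
        "definition": "full-time equivalent headcount at month end",
        "owner": "Head of HR",
    },
    "revenue_per_fte": {
        "source": "derived",
        "definition": "revenue / fte",
        "owner": "CFO",
    },
}

def describe(metric: str) -> str:
    m = METRIC_DICTIONARY[metric]
    return f"{metric}: {m['definition']} (source: {m['source']}, owner: {m['owner']})"

print(describe("revenue_per_fte"))
```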

Work in stages

SSOT projects are complex, and attempting to corral all data in one stage would likely create significant problems. Instead, consider breaking the task into stages, perhaps by silo: start with finance, then HR, marketing, procurement, sales and so on. Successful projects often begin with the smallest business unit. Remember to define the business value each stage will bring to your SSOT project and use it to describe what success will look like.

Be mindful of momentum. Once you have the buy-in of the broader team, deliver the project without breaks or delays. One of the quickest ways to derail an SSOT project is a lack of collaboration.

Be outcome focussed

Although the process of moving to SSOT is largely driven by technology, your primary focus should be on the outcomes you expect to achieve. SSOT is designed to enable organisation-wide decision making derived from mutually agreed and validated data sources.

Your goal is to deliver the right data to users, how and when they need it, in a format that they can consume. This approach does not remove normal controls around data access. The goal of the project is to enable the organisation to answer the questions it must ask to make business-specific decisions and to set measurable benchmarks of what success looks like.

Conclusion

Every organisation works in a competitive environment and amasses more and more data year after year. Those who aren’t focussed on identifying key performance metrics, and the data that supports them, lack critical business and competitive insight. Without insight, businesses make incorrect, ill-informed decisions, or no decisions at all. By adopting an SSOT architecture, organisations can manage the data lakes they have created and harvest them for competitive advantage. Although adopting SSOT is an initially daunting proposition, the consequences of not doing so are equally daunting and potentially devastating.
