Today, data is more valuable to a company than it’s ever been. It can help you make data-driven decisions that improve business performance, boost revenue, and increase efficiency.
But almost all industries across the world face the same challenge: they aren’t sure if their data is accurate and consistent, which means it’s not trustworthy. This can cause anything from day-to-day issues to significant business problems and risks.
On top of this, we’re living through the age of big data, where organisations are processing and storing more information than ever while also having to comply with regulations.
For many companies in the EU, the introduction of GDPR forced them to manage their data more carefully, bringing in policies and processes to handle their customers’ personal data. But in the four years since it came into force, have companies reached their full potential for data integrity?
At Alation, we think there’s more that can be done. But firstly, we need to look at how we define data integrity.
What is data integrity?
Data integrity refers to the access, maintenance, storage, and quality of data throughout its life cycle, from creation through to archiving or deletion. It is essential to consider data integrity when designing, implementing, and using any system that stores, processes, and retrieves data.
Many confuse data integrity with data quality. However, there are differences between the two; first and foremost, data quality is a part of an organisation’s wider data integrity strategy. This strategy should also consider data security.
Data quality refers to the processes that are put in place to measure your data’s age, relevancy, completeness, accuracy, and reliability at that point in time. Data quality can be broken down into two categories: objective and subjective.
- Objective: a set of rules that determines what the data should look like. Once these rules are executed, each data record can be marked as a pass or a fail.
- Subjective: a person’s judgement on whether a specific set of data is correct. This covers data that is missing, or obviously wrong in a way that the objective rules don’t flag.
To ensure good data quality, an organisation needs to address both objective and subjective data quality, as part of their wider data integrity strategy.
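To make the objective category concrete, here is a minimal sketch of rule-based quality checks. The field names, rules, and thresholds are hypothetical examples, not a prescribed rule set: each rule is a simple predicate, and every record gets a pass or fail result per rule.

```python
# A minimal sketch of objective data quality rules: each rule is a
# predicate, and every record is marked pass/fail against each rule.
# The field names and rules below are hypothetical examples.

records = [
    {"email": "anna@example.com", "age": 34},
    {"email": "", "age": -5},
]

rules = {
    "email_present": lambda r: bool(r["email"]),
    "age_in_range": lambda r: 0 <= r["age"] <= 120,
}

def run_rules(records, rules):
    """Return a pass/fail result for every (record, rule) pair."""
    return [
        {name: rule(record) for name, rule in rules.items()}
        for record in records
    ]

results = run_rules(records, rules)
# results[0] -> {"email_present": True, "age_in_range": True}
# results[1] -> {"email_present": False, "age_in_range": False}
```

Subjective quality, by contrast, cannot be reduced to predicates like these; it relies on a person reviewing the data that the rules pass.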
Having processes in place that look at both integrity and quality helps organisations ensure that data can be recovered, searched, traced, and connected across the organisation. This in turn gives you confidence that the data you are using to make decisions is correct.
A data integrity strategy can also help companies comply with regulations, like GDPR in the EU, under which personal data may only be accessed by certain people and only stored for as long as it’s needed.
Is integrity a universal truth?
Data integrity is a concept that exists across the world. And it’s easy to see why. Wherever we are based, we need trustworthy data to make business decisions and minimise organisational risk. What does differ is the perception of integrity across industries.
Because it involves data access, some see data integrity as an IT problem to solve with password protection and logins. However, others see it as a business-process problem, as data needs to be inputted correctly at the start of the process based on a series of rules.
Data quality is also well accepted across the world as a general principle. But what good data looks like varies, because the data quality dimensions people use to judge data differ: professional bodies, professions, and departments each have slightly different takes on what counts as good or bad.
Is data integrity still needed?
In the EU, many companies have looked at the defensive side of integrity since GDPR was introduced. Processes and policies have been put in place to ensure that, where necessary, access is restricted. In this region, the GDPR has normalised new processes. For example, when data is first added to a system it must be labelled with what it is and who should be able to access it.
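As a hedged sketch of that labelling step (the field names and function are hypothetical, not any particular product’s API), attaching classification and access metadata at the point of ingestion can look like this:

```python
from datetime import date, timedelta

# Hypothetical sketch: attach classification and access metadata to a
# record when it first enters a system, so later processes know what
# the data is, who may access it, and when it should be deleted.

def label_record(data, classification, allowed_roles, retention_days):
    """Wrap raw data with labelling metadata at ingestion time."""
    return {
        "data": data,
        "classification": classification,  # e.g. "personal" under GDPR
        "allowed_roles": allowed_roles,    # who may access this record
        "delete_after": date.today() + timedelta(days=retention_days),
    }

record = label_record(
    {"name": "Anna", "email": "anna@example.com"},
    classification="personal",
    allowed_roles=["support", "dpo"],
    retention_days=365,
)
```

Recording the retention deadline alongside the access roles is what lets downstream processes enforce both restricted access and time-limited storage automatically.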
This might leave you wondering: is it still important for companies to be looking at their data integrity? Simply put, yes. But there’s still more work to be done.
The policies that have been introduced across the UK and EU are just the first step. It’s now time for companies to focus on data quality, one of the aspects of data integrity. As new data is gathered, there must be processes and systems in place to ensure that it is accurate and compliant from day one, with measures that safeguard quality over time.
3 data integrity and quality best practices
1. Involve teams across your organisation
Different departments treat data differently, and will have different data needs and perspectives. So, to ensure a consistent approach to data integrity, you need to speak to people across your organisation.
2. Decide what good looks like to you
There are many ways you can ensure your data is good quality, but it’s ultimately down to you and your business to decide what you want to get out of your data. For advice to help you make that decision, take a look at our blog on data quality.
3. Invest in systems that can help
Our data catalog can help you with both integrity and quality. It can capture the roles and permissions that govern who should be able to access data, and its flags let you mark whether data is fully checked and trustworthy or still needs some work.
Alation’s Open Data Quality Initiative
With the Open Data Quality Initiative, Alation customers can now choose the data quality vendor that works for them, with the confidence that the tool will integrate with our Data Catalog software and Data Governance application. This way you can see your data alongside its quality, so you can instantly tell how trustworthy it is.
To learn more, request a free demo to see how Alation can support your data quality and integrity throughout its life cycle.