In managing master data initiatives, leading or participating in data quality efforts can feel like a “yo-yo diet”: the same ups and downs as the unsuccessful diets many of us commit to at the start of a new year. In our case, the cycle is the recurring push to improve data quality and start the year off fresh.
How can we transform this yo-yo thinking into a progressive process that addresses data issues sustainably and improves them over the long term? Here are four best practices that can help.
Develop and Communicate Metrics and Goals
Before launching a data quality initiative, ad hoc or otherwise, record the purpose of your efforts and the results you expect. Goals and objectives should be specific and clear. For example, the overarching organizational goal might be “We want to enrich 80% of our contact data with title and email attributes by the end of Q2.” Follow this up with specific baselines: How much of the data will the initiative need to cover to hit that 80%? Will it be every record, or only those created before last quarter? Be specific so everyone understands. Then provide your stakeholders with an ongoing schedule of metric updates, along with intelligence on prioritization and target dates. Once you have committed to that schedule, send the updates at the promised time and cadence – RELIGIOUSLY. That transparency keeps your stakeholders engaged with your progress.
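As a concrete illustration, a metric like the one above can be tracked with a simple query that is run on the promised cadence. This is a minimal sketch only: the contact table, the title, email, and created_at columns, and the baseline cutoff date are hypothetical and would need to match your own schema and scoping decision.

```sql
-- Share of contact records already enriched with both title and email.
-- Hypothetical schema; adjust names and the baseline cutoff to your case.
SELECT
  ROUND(100.0 * SUM(CASE WHEN title IS NOT NULL
                          AND email IS NOT NULL THEN 1 ELSE 0 END)
              / COUNT(*), 1) AS pct_enriched
FROM contact
WHERE created_at < DATE '2024-04-01';  -- example baseline scope only
```

Running the same query on the agreed schedule gives stakeholders the same number, computed the same way, every time.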
Take Advantage of Technology
This best practice is all about efficiency: Technology can exponentially grow your data management superpowers. Manual work may be necessary for proving out tactics, and certainly for some portion of the data universe you are treating, but technology lets you replicate those proven tactics and scale them efficiently. It can be as simple as choosing SQL over spreadsheets for data manipulation. Once you have a proven process, you can submit your code to the technology group to systematically append, alter, and enrich data, and then execute updates on a regular schedule. Relying on spreadsheets, by contrast, keeps you at a snail’s pace and leaves you prone to manual error. Another example is using an application programming interface (API) as a gateway connecting your data universe to a reference source, which removes the need to manually download batches of data and upload them to your system. Applied properly, technology reduces errors and adds stability and sustainability to any data initiative.
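To make the SQL-over-spreadsheets point concrete, below is a minimal sketch of a repeatable enrichment step that a technology group could schedule. PostgreSQL-style syntax is assumed, and the contact and reference_contact tables, the duns_number match key, and the columns are hypothetical placeholders for your own schema and reference source.

```sql
-- Fill missing title/email values from a reference source in one pass.
-- Hypothetical tables and match key; assumes PostgreSQL UPDATE ... FROM.
UPDATE contact AS c
SET    title = COALESCE(c.title, r.title),
       email = COALESCE(c.email, r.email)
FROM   reference_contact AS r
WHERE  c.duns_number = r.duns_number
  AND (c.title IS NULL OR c.email IS NULL);
```

Because the logic lives in code rather than in a spreadsheet, the same step can be rerun on every refresh without anyone retyping values.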
Institute Data Standards
To increase data dependability, you need a framework for quantifying completeness and process compliance. Instituting data standards provides that capability. With these standards, data creators, users, and stewards share a common understanding of the data at hand and of what data the business needs to function, and the work that remains to be done can be quantified more straightforwardly. For example, decide whether the phone number field may be left blank; when it is populated, it should contain only non-repeating numeric characters and be more than seven digits long. The absence of standards like this will continuously bring unneeded distractions to your data processes. Data standards help reduce the amount of garbage introduced into your environment, and they can also bring efficiencies to your metrics and technology efforts.
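As an illustration, a standard like this can be expressed as a query that quantifies how much remediation work remains. This is a minimal sketch assuming PostgreSQL regular expressions and a hypothetical contact table; it reads “non-repeating” as “not one digit repeated throughout,” which is one plausible interpretation of the rule above.

```sql
-- Count phone values that violate the standard: populated values must be
-- all numeric, longer than seven digits, and not a single repeated digit.
-- Hypothetical table/column; PostgreSQL regex operators assumed.
SELECT COUNT(*) AS non_compliant_phones
FROM contact
WHERE phone IS NOT NULL
  AND NOT (
        phone ~ '^[0-9]+$'            -- numeric characters only
    AND length(phone) > 7             -- more than seven digits
    AND phone !~ '^(.)\1+$'           -- not one digit repeated throughout
  );
```

The same predicate can later be promoted to an entry-time check once existing records have been cleaned up.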
Establish Data Habits
Successful diets aren’t once-and-done efforts every few weeks – they require ongoing lifestyle changes. Similarly, data management requires ongoing, committed processes to be successful. Ad hoc data quality initiatives are necessary at times, but use them as opportunities to fold your findings and lessons back into your data practices. For example, if the business asks you to enrich the organization’s records with URL domain values, performing this manually a single time provides only a myopic, short-term benefit to your stakeholders, giving the illusion that the issue is fixed. But what about new records coming in? It would be only a matter of time before the gap widens again. Ensure that domain data is populated when new records are introduced, and require it to follow an agreed format before a record is accepted. Add the URL domain field to your completeness report and, lastly, establish a regular cadence of data enrichment that includes the new field, so accuracy is maintained proactively rather than repaired after the fact. Build practices like these into your data platform as new requirements unfold.
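As an illustration of folding the new requirement into record entry, the format rule could be captured declaratively. This is a minimal sketch assuming PostgreSQL and a hypothetical organization table with a url_domain column; the pattern is illustrative, not a complete domain-name grammar.

```sql
-- Require url_domain, when present, to look like a bare domain (e.g. example.com).
-- Hypothetical table/column; PostgreSQL regex syntax assumed.
ALTER TABLE organization
  ADD CONSTRAINT chk_url_domain_format CHECK (
    url_domain IS NULL
    OR url_domain ~ '^[a-z0-9-]+(\.[a-z0-9-]+)+$'
  );
```

With the rule enforced at entry and the field added to the completeness report, new records keep the gap from reopening.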
Change Is Hard Work
Challenge your data quality efforts to be pervasive. Make them repeatable, efficient, and effective. Manual corrections and touches are very tempting default moves. Why? Because we all want “quick wins” – the results are immediate, and the pain goes away in the short term. But this doesn’t really solve anything – it keeps your data efforts in a cyclical nuisance, like a yo-yo diet. The reality is that manual stewardship and ad hoc data efforts are not scalable. Data errors are merely symptoms indicating that procedural and policy adjustments need to take place, and that is what you must address. To rid yourself of this condition, bring these four practices together to increase data quality. Just like starting a diet, you’ll be shedding pounds, increasing energy, and getting healthier, as long as you stay vigilant and don’t regress to old habits.
(Source: https://www.dnb.com/perspectives/master-data/data-quality-yo-yo-diet.html)
Please contact us for more detailed advice.
CRIF D&B VIETNAM LLC
- Address: Floor 15, Minh Long Building, 17 Ba Huyen Thanh Quan, Ward 6, District 3, HCMC, Vietnam
- Hotline: 02839117288
- Email: csvietnam@crif.com
- Website: https://dnbvietnam.com