Can a Data Manager ever raise a hand in front of their CTO or DGO and say they have achieved perfect data quality? Unfortunately, the answer is a big ‘NO’.
Why? Because there is no such thing as perfect data; it is a mirage which, if you start chasing it, will leave you stranded in the middle of a desert. So, is it worth trying to remediate bad data? Yes, of course, but we have to take a selective approach. The Pareto principle, the 80-20 rule, comes to the rescue here: identify the 20% of issues whose fixes clean up 80% of the data. Easier said than done, but at least it is something achievable that can be aspired to.
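To make the 80-20 idea concrete, here is a minimal sketch of how such a prioritization might look in practice: rank data quality issues by how many records each one affects, then keep the smallest set of issues whose fixes would cover roughly 80% of the bad records. The issue names and counts below are invented purely for illustration.

```python
# Hypothetical sketch: pick the few issues whose fixes cover
# ~80% of the affected records (the Pareto cut).

def pareto_issues(issue_counts, target=0.80):
    """Return the issues to fix first, covering `target` of bad records.

    issue_counts: dict mapping issue name -> number of affected records.
    """
    total = sum(issue_counts.values())
    selected, covered = [], 0
    # Work through issues from highest impact to lowest
    for issue, count in sorted(issue_counts.items(),
                               key=lambda kv: kv[1], reverse=True):
        selected.append(issue)
        covered += count
        if covered / total >= target:
            break
    return selected

# Invented numbers purely for illustration
issue_counts = {
    "missing_postcode": 52_000,
    "duplicate_customer": 31_000,
    "invalid_email": 9_000,
    "stale_phone": 5_000,
    "bad_country_code": 3_000,
}

print(pareto_issues(issue_counts))
# -> ['missing_postcode', 'duplicate_customer']
```

In this toy example, fixing just two of the five issues already covers 83% of the bad records, which is exactly the kind of selective remediation the 80-20 rule argues for.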
Even more important is to realize that fixing the process matters far more than fixing the data. I am not trying to imply that data remediation is unimportant. It is, and it needs to be done. But it should be a correction or fix for exceptions, or a one-time effort when you are laying down the foundations at the onset of your Data Quality program.