There’s an old computer science adage about the dangers, and costs, of bad data: “Garbage in, garbage out.” With e-government technology in use and being rolled out at every level, from local to national, U.S. society has never been so exposed to the potential costs of bad data.
In a comprehensive survey of data quality in the public sector, researchers for the online magazine Governing found that bad data costs taxpayers hundreds of millions of dollars, though no precise total could be established. Data entry errors uncovered during one audit in California, for example, reportedly cost the state $6 million.
Governing surveyed state-level managers who work with data directly, conducting more than 75 interviews with auditors and other data managers in 46 states. More broadly, seven in ten officials said bad data is “frequently” or “often” a problem, and not a single respondent said data quality problems are rare.
The most commonly cited bad data issues were inaccuracy, inconsistency, and incomplete or missing data. Seventeen percent of officials said technology issues were the leading cause of bad data problems; 14 percent cited mismanagement or a lack of accountability, and 12 percent pointed to a lack of planning.
Human services, economic development, and public safety and corrections agencies were said to have the biggest data quality problems, while agencies subject to strict federal reporting requirements were said to have the best data quality.
The full report and a summary infographic are available on Governing’s website.