PUBLIC COMMENT ON THE CRE DRAFT DATA QUALITY RULE
Commenter:
Daniel M. Byrd III, Ph.D., D.A.B.T.
Affiliation:
President, Consultants in Toxicology, Risk Assessment and Product Safety
Comment:
July 7, 2000
The Center for Regulatory Effectiveness (CRE)
Suite 700
11 Dupont Circle, NW
Washington, DC 20036
CRE has prepared a hypothetical rule about data quality, in principle resembling one that the Office of Management and Budget (OMB) would propose in response to a pending Congressional bill. CRE deserves plaudits for the draft data quality regulation, because it will help the public think about what the legislation might accomplish. In addition, the idea of a standard for data quality will interest the scientific community.
Passage of the statute apparently would require OMB to establish (1) the standard a federal agency would apply to achieve a minimum quality of data used in its decisions and (2) the procedural mechanism an agency would follow to correct defects in its data brought to its attention by outside parties. Each agency would set up its own version of OMB's standard and mechanism through rulemaking. Individuals from the private sector could then petition the agency for correction of information that is incorrect or misleading, subject to judicial review, although arguably this remedy already exists under OMB Circular A-130 (Management of Federal Information Resources).
Under the Paperwork Reduction Act of 1995 (PRA), OMB already reviews agency inquiries of private citizens. The private sector views OMB's oversight as a generally good idea. Government often imposes burdensome paperwork requirements on the public, which are in some ways worse than paying taxes. For example, filling out a thirty-page form to collect three dollars in taxes may make sense to a bureaucracy charged with generating revenues, but the time required to fill out the form prevents the affected parties from engaging in other activities that probably would generate more tax revenues. In addition, poor-quality information (or information that will not meet objectives) is not even in the government's interest.
The mere existence of a formal process that requires written documentation makes agency staff stop and think about their information needs and the public burden of providing this information. Notice and an opportunity to comment on an agency submission to OMB also let the interested public see what is going on. Examples of unsound inquiries that OMB has prevented include (1) a questionnaire designed to stimulate support for an agency's mission rather than to elicit information and (2) a survey design that could not possibly answer the questions posed by an agency. Thus, OMB has acquired some experience with problems of data quality through the PRA.
Yet, when viewed from the perspective of all data generated by, or for, federal agencies, the PRA is limited in scope. It does not cover, for example, data generated within an agency for its own use. OMB also has discretion under the PRA not to prevent frivolous paperwork burdens placed on the public. Thus, efforts to broaden OMB's oversight, compel OMB action, or improve data quality through other means make good sense.
Relying on a petition process will have substantial benefits. Government would not expend resources to improve data quality when doing so does not matter to the affected public. The vast preponderance of federal information is financial or econometric in nature. However, toxicologists generate a different kind of data, and they would experience some problems under CRE's draft data quality regulation.
Toxicologists have experience with data generated by private organizations but used by federal agencies to make regulatory decisions. Part of the proposed rule's insensitivity is that it does not address the source of data in a specific way. From this perspective, a data quality rule should apply differentiated procedures to (1) data that an agency obtains through surveys or reporting requirements imposed on the public, (2) data that an agency requires from specific private parties for the agency's own use, and (3) data generated by an agency for its own use without involving the public.
In addition, CRE's draft data quality regulation does not recognize different categories of poor data quality with sufficient specificity. The misuse of good-quality data, the wrong kind of data to answer a question, misleading context, inadequate numbers of samples or subjects, measurement errors, unreliability, insufficient specification of uncertainty, and fraudulent or sloppy data generation are all different problems from the perspectives of response and remedy. The proposed rule also does not recognize the difference between data material to human health and safety and other kinds of data, a distinction that the courts have no difficulty understanding.
The scientific community ultimately relies on the replication of experiments or surveys to evaluate data quality. This criterion means that the provision of adequate information to reproduce an experiment or survey is a key aspect of data quality. In addition, scientists rely on experience with similar experimental or survey conditions, correspondence with similar data, quality assurance (particularly useful in preventing fraud), scrutiny of internal contradictions, reviews of study design by review boards, peer review, and published comments as stopgaps short of replication.
Independent data analysis, another activity not adequately covered by CRE's draft data quality regulation, often consists of culling out obviously poor-quality data and selecting the "best" data (sometimes called "the critical study"). It is not clear how a petitioner would proceed if the error consisted of selecting the wrong study rather than of poor quality in the selected study's data. Data reanalysis also might prove a problem. For example, what would happen if a toxicologist discovered that a company had mistakenly filed an 8(e) notice under the Toxic Substances Control Act? Would this independent activity confer standing? What if the Department of Health and Human Services miscalculated the average of otherwise good-quality data? Overall, CRE's draft data quality regulation would improve some, but not resolve all, data quality problems within the federal government.
Instead of trying to expand the draft data quality regulation to cover all eventualities, which might prove infeasible, if not impossible, CRE could simplify the data correction process by (1) relying more on the discretion of the designated senior official in each agency with responsibility for data quality and for the correction of data errors (the Chief Information Officer, or CIO) and (2) imposing OMB oversight on this official. OMB has more relevant skills and more experience with data quality than the judicial system. If an agency denied a petition, the petitioner would first appeal to OMB. OMB review would be more accessible to the public than judicial review.
In addition, the criteria for standing to petition seem too narrow. For example, a scientist who generated some data, which an agency then misused, would have to rely on the possibility that agency action would affect a reputation or interest, a circumstance hard to document. If the scientist wanted to appeal the denial of a petition, under CRE's current proposal the courts would look closely at standing and could well deny it. In addition, scientists seldom read or even know about the Federal Register, so previous comment on a notice would not make a good criterion for them. If a scientist has independently reanalyzed another scientist's data and reached different conclusions that bear on an agency's use of the data, the agency eventually will have to arbitrate a conflict between two scientists.
Circumstances become even hazier if scientists have good reason to believe that reanalysis would yield different conclusions but have been denied access to the primary data. Similarly, engaging in scientific studies is seldom found unlawful, but precious few laws expressly authorize scientists to engage in studies; the necessity of engaging in a lawful activity might therefore prove difficult to establish. Instead, suppose that CRE's draft data quality regulation simply authorized any U.S. citizen to file a petition for review. The fear may be that frivolous petitions would deluge an agency. However, CRE has no knowledge of whether this circumstance would ever occur. If the decision whether to respond is left to the agency's CIO, with OMB oversight in instances where the petitioner disagrees, an agency could easily deflect a flood of irrelevant petitions.
Sincerely,
Daniel M. Byrd III, Ph.D., D.A.B.T.
President,
Consultants in Toxicology, Risk Assessment and Product Safety
Washington, DC