The Data Quality Act: A revolution in the role of science in policy making or a can of worms?

If you’ve never heard of the Data Quality Act, you’re not alone. What is being called the Data Quality Act (and the Information Quality Act) was enacted without discussion or debate as Section 515 of the Treasury and General Government Appropriations Act of 2001 (P.L. 106-554, H.R. 5658). The section directs the Office of Management and Budget (OMB) to issue government-wide guidelines that “provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies.” OMB published final guidelines to implement Section 515 in the Federal Register on September 28, 2001 (66 Fed. Reg. 49718 - https://www.whitehouse.gov/omb/fedreg/reproducible.html). Between then and October 1, 2002, federal agencies issued hundreds of pages of agency-specific implementing guidelines that include “administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency that does not comply with the OMB guidelines.”

Some regulatory analysts believe that the information/data quality guidelines will “revolutionize the role of science in policy making by ensuring and maximizing the quality, objectivity, utility and integrity of scientific information.” Others believe that the guidelines will be a “central battleground for reshaping or repeal of environmental laws and regulations.” Still others say that the effect of the guidelines will be determined by the number and quality of petitions filed to challenge information and the vigor with which OMB oversees and enforces the requirements. (Dr. John Graham, Administrator of OMB’s Office of Information and Regulatory Affairs, has stated that the Bush administration is “committed to vigorous implementation of the new information quality law.”) One observer has predicted that the information quality initiative could degenerate into a “stakeholder food fight.”

Background
Whatever the genesis of the Data Quality Act, the guidelines issued to implement it have raised a good many legal and practical questions, many of which are of significant importance to the research community.

Requirements of the OMB guidelines
OMB defines “quality” as the encompassing term, of which “utility,” “objectivity,” and “integrity” are the constituents. “Utility” refers to the usefulness of the information to its intended users. “Objectivity” focuses on whether the disseminated information is presented in an accurate, clear, complete, and unbiased manner and is, as a matter of substance, accurate, reliable, and unbiased. “Integrity” refers to security: the protection of information from unauthorized access or revision, to ensure that the information is not compromised through corruption or falsification.

“Dissemination” is defined to mean “agency initiated or sponsored distribution of information to the public.” Of particular interest to the research community is sponsored distribution of information to the public. Sponsored distribution refers to situations in which an agency has directed a third party to disseminate information, or in which the agency has the authority to review and approve the information before release. For example, if an agency, through a grant, provides for a person to conduct research and then directs the person to disseminate the results (or reviews and approves the results before they may be disseminated), the agency has “sponsored” the dissemination of this information. By contrast, if the agency simply provides funding to support research, and the researcher (not the agency) decides whether to disseminate the results and determines the content and presentation of the dissemination, then the agency has not “sponsored” the dissemination, even though it has funded the research and may even retain ownership or other intellectual property rights. (According to the OMB guidelines, “To avoid confusion regarding whether the agency is sponsoring the dissemination, the researcher should include an appropriate disclaimer in a publication or speech to the effect that the ‘views are mine, and do not necessarily reflect the view’ of the agency.”) However, if the agency subsequently disseminates information from sponsored research, that information must adhere to the agency’s information quality guidelines.

The OMB guidelines state that, as a general matter, scientific and research information that has been subjected to formal, independent, external peer review is regarded as presumptively objective. An example of a formal, independent, external peer review is the review process used by scientific journals.
However, in its discussion of the guidelines in the Federal Register notice, OMB notes, “Although journal peer review is clearly valuable, there are cases where flawed science has been published in respected journals.” Consequently, the guidelines provide that the presumption of objectivity “is rebuttable based on a persuasive showing by the petitioner in a particular instance.” The OMB guidelines also require that agency-sponsored peer review be “transparent,” meaning, among other things, that peer reviewers be selected primarily on the basis of necessary technical expertise.

“Influential” information

The OMB guidelines apply stricter quality standards to the dissemination of information that is considered “influential.” In regard to scientific, financial, or statistical information, “influential” means that “the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions.” Each agency is authorized “to define ‘influential’ in ways appropriate for it, given the nature and multiplicity of issues for which the agency is responsible.”

If an agency disseminates “influential” scientific, financial, or statistical information, that information must meet a reproducibility standard. Analytic results related to influential scientific, financial, or statistical information must generally be sufficiently transparent about data, methods, models, assumptions, and statistical procedures that an independent reanalysis (or, more practically, tests for sensitivity, uncertainty, or robustness) could be undertaken by a qualified member of the public. The guidelines direct agencies to consider in their own guidelines which categories of original and supporting data should be subject to the reproducibility standard and which should not. In cases where public access to data and methods cannot occur because of privacy or proprietary issues, agencies are directed to apply especially rigorous robustness checks to analytic results and to document what checks were undertaken. If an agency wishes to rely, for important and far-reaching rulemaking, on previously disseminated scientific, financial, or statistical studies that were not considered “influential” at the time of dissemination, those studies would have to be evaluated to determine whether they meet the “capable of being reproduced” standard.
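To make the reproducibility and robustness language more concrete, the sketch below (in Python) shows the kind of sensitivity checks an analyst might document when the underlying data cannot be released. It is a minimal, hypothetical illustration: the data, the dose-response framing, and the particular checks (leave-one-out re-estimation and bootstrap resampling) are assumptions chosen for the example, not requirements drawn from the OMB guidelines.

```python
# Hypothetical sketch of robustness checks for an "influential" analytic result
# whose underlying data cannot be made public. All data and numbers here are
# illustrative, not taken from the OMB guidelines or any agency analysis.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: exposure levels and an observed health-effect score.
exposure = np.linspace(0.0, 10.0, 50)
effect = 2.0 * exposure + rng.normal(0.0, 1.5, size=exposure.size)

def fit_slope(x, y):
    """Estimate the dose-response slope with ordinary least squares."""
    slope, _intercept = np.polyfit(x, y, deg=1)
    return slope

baseline = fit_slope(exposure, effect)

# Check 1: leave-one-out re-estimation -- how much can any single
# observation move the headline result?
loo_slopes = [
    fit_slope(np.delete(exposure, i), np.delete(effect, i))
    for i in range(exposure.size)
]

# Check 2: bootstrap resampling to gauge sampling uncertainty.
boot_slopes = []
for _ in range(1000):
    idx = rng.integers(0, exposure.size, size=exposure.size)
    boot_slopes.append(fit_slope(exposure[idx], effect[idx]))

print(f"baseline slope estimate: {baseline:.3f}")
print(f"leave-one-out range:     {min(loo_slopes):.3f} to {max(loo_slopes):.3f}")
print(f"bootstrap 95% interval:  {np.percentile(boot_slopes, 2.5):.3f} "
      f"to {np.percentile(boot_slopes, 97.5):.3f}")
```

The point of documenting such checks, in the spirit of the guidelines, is that a qualified member of the public could judge how sensitive the reported result is to individual observations and to sampling variation even without access to the raw data.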
Risk analyses

Among other things, the guidelines call for risk information to specify (i) each population addressed by any estimate [of applicable risk effects]. Information quality requirements can be temporarily waived in cases of imminent threat to public health or homeland security.

Challenges to information

EPA assessment factors for third-party information
Of additional interest is EPA’s draft document Assessment Factors for Evaluating the Quality of Information from External Sources (https://www.epa.gov/oei/qualityguidelines/af_assessdraft.pdf). EPA receives a huge amount of information from third-party sources. Pesticide companies seeking registration, for instance, must provide hazard information. Trade groups and environmental groups provide information during the rulemaking process. State agencies implementing EPA rules provide monitoring data, laboratory results, and analytic studies. EPA also gathers information from outside sources, such as scientific journals and academic and research institutions, for use in developing policy and regulatory decisions. EPA does not apply quality controls while such information is being generated, but it must apply them when it uses or disseminates the information. Although EPA has not yet announced an intention to require those who provide information to follow the information quality guidelines, the agency has published the draft cited above for review. In the draft, EPA states that its goal is not to impose legally binding obligations but to make the assessment factors broadly known to those who generate information.

Current challenges
Resources

Anderson, Frederick R. 2002. “Data Quality Act.” The National Law Journal, October 14, 2002 (https://www.nlj.com/).