The Data Quality Act: A revolution in the role of science in policy making or a can of worms?

If you’ve never heard of the Data Quality Act, you’re not alone. What is being called the Data Quality Act (and the Information Quality Act) was enacted with no discussion and no debate as Section 515 of the Treasury and General Government Appropriations Act of 2001 (PL 106-554, H.R. 5658). The section directs the Office of Management and Budget (OMB) to issue government-wide guidelines that “provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies.”

OMB published final guidelines to implement Section 515 in the Federal Register on September 28, 2001 (66 Fed. Reg. 49718 - https://www.whitehouse.gov/omb/fedreg/reproducible.html). Between then and October 1, 2002, federal agencies issued hundreds of pages of agency-specific implementing guidelines that include “administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency that does not comply with the OMB guidelines.”

Some regulatory analysts believe that the information/data quality guidelines will “revolutionize the role of science in policy making by ensuring and maximizing the quality, objectivity, utility and integrity of scientific information.” Others believe that the guidelines will be a “central battleground for reshaping or repeal of environmental laws and regulations.” Still others say that the effect of the guidelines will be determined by the number and quality of petitions filed to challenge information and the vigor with which OMB oversees and enforces the requirements. (Dr. John Graham, Administrator of OMB’s Office of Information and Regulatory Affairs, has stated that the Bush administration is “committed to vigorous implementation of the new information quality law.”) One observer has predicted that the information quality initiative could degenerate into a “stakeholder food fight.”

Background
What prompted the rider that enacted the Data Quality Act? Because there is little legislative history, the genesis of Section 515 is obscure. One widely accepted view is that Section 515 was a reaction by business and industry to “regulation by information.” The idea behind “regulation by information” is that agencies can accomplish regulatory goals by publishing information more easily than by enacting regulations. For instance, industries are required to provide information to EPA for the Toxics Release Inventory (TRI), which EPA disseminates on the Internet (https://www.epa.gov/tri//). Dissemination of this information may embarrass industries into actions they are not required by regulation to undertake. The fact that much important information dissemination occurs outside the procedural safeguards of the Administrative Procedure Act, some say, led business and industry to ask for “regulation OF information.”

Whatever the genesis of the Data Quality Act, the guidelines issued to implement it have raised a good many legal and practical questions, many of significant importance to the research community.

Requirements of the OMB guidelines
OMB’s guidelines direct Federal agencies “to develop information resources management procedures for reviewing and substantiating (by documentation or other means selected by the agency) the quality (including the objectivity, utility, and integrity) of information before it is disseminated. In addition, agencies are to establish administrative mechanisms allowing affected persons to seek and obtain, where appropriate, correction of information disseminated by the agency that does not comply with the OMB or agency guidelines. Agencies must apply these standards flexibly, and in a manner appropriate to the nature and timeliness of the information to be disseminated, and incorporate them into existing agency information resources management and administrative practices.”

OMB defines “quality” as the encompassing term, of which “utility,” “objectivity,” and “integrity” are the constituents. “Utility” refers to the usefulness of the information to the intended users. “Objectivity” focuses on whether the disseminated information is being presented in an accurate, clear, complete, and unbiased manner, and as a matter of substance, is accurate, reliable, and unbiased. “Integrity” refers to security – the protection of information from unauthorized access or revision, to ensure that the information is not compromised through corruption or falsification. “Dissemination” is defined to mean “agency initiated or sponsored distribution of information to the public.” 

Of interest to the research community is sponsored distribution of information to the public. Sponsored distribution refers to situations in which an agency has directed a third party to disseminate information, or in which the agency has the authority to review and approve the information before release. For example, if an agency, through a grant, provides for a person to conduct research, then directs the person to disseminate the results (or the agency reviews and approves the results before they may be disseminated), then the agency has “sponsored” the dissemination of this information.

By contrast, if the agency simply provides funding to support research and the researcher (not the agency) decides whether to disseminate the results and determines the content and presentation of the dissemination, then the agency has not “sponsored” the dissemination even though it has funded the research and may even retain ownership or other intellectual property rights. (According to OMB guidelines, “To avoid confusion regarding whether the agency is sponsoring the dissemination, the researcher should include an appropriate disclaimer in a publication or speech to the effect that the ‘views are mine, and do not necessarily reflect the view’ of the agency.”)

However, if the agency subsequently disseminates information from sponsored research, the information must adhere to the agency’s information quality guidelines. 

OMB guidelines state that as a general matter, scientific and research information that has been subjected to formal, independent, external peer review is regarded as presumptively objective. An example of a formal, independent, external peer review is the review process used by scientific journals. However, in its discussion of the guidelines in the Federal Register notice, OMB says, “Although journal peer review is clearly valuable, there are cases where flawed science has been published in respected journals.” Consequently, the guidelines provide that the presumption of objectivity “is rebuttable based on a persuasive showing by the petitioner in a particular instance.”

The OMB guidelines also require that agency-sponsored peer review be “transparent,” meaning that

(a) peer reviewers be selected primarily on the basis of necessary technical expertise,
(b) peer reviewers be expected to disclose to agencies prior technical/policy positions they may have taken on the issues at hand,
(c) peer reviewers be expected to disclose to agencies their sources of personal and institutional funding (private or public sector), and
(d) peer reviews be conducted in an open and rigorous manner.

“Influential” information
The OMB guidelines apply stricter quality standards to the dissemination of information that is considered “influential.” In regard to scientific, financial, or statistical information, “influential” means that “the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions.”  Each agency is authorized “to define ‘influential’ in ways appropriate for it, given the nature and multiplicity of issues for which the agency is responsible.” 

If an agency disseminates “influential” scientific, financial, or statistical information, that information must meet a reproducibility standard. Analytic results related to influential scientific, financial, or statistical information must generally be sufficiently transparent about data, methods, models, assumptions, and statistical procedures that an independent reanalysis (or, more practically, tests for sensitivity, uncertainty, or robustness) could be undertaken by a qualified member of the public. The guidelines direct agencies to consider in their own guidelines which categories of original and supporting data should be subject to the reproducibility standard and which should not.

In cases where public access to data and methods cannot occur because of privacy or proprietary issues, agencies are directed to apply especially rigorous robustness checks to analytic results and document what checks were undertaken. 

If agencies wish to rely, for important and far-reaching rulemaking, on previously disseminated scientific, financial or statistical studies that at the time of dissemination were not considered “influential,” then the studies would have to be evaluated to determine if they meet the “capable of being reproduced” standard.

Risk analyses
Agencies that maintain or disseminate information on analysis of risks to human health, safety and the environment must either adopt or adapt (indicating some degree of flexibility) the quality principles applied by Congress to risk information used and disseminated under the Safe Drinking Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)). These quality principles require that in communicating risk in support of a regulation, agencies make available to the public a document that, to the extent practicable, specifies

“(i) each population addressed by any estimate [of applicable risk effects];
(ii) the expected risk or central estimate of risk for the specific populations [affected];
(iii) each appropriate upper-bound or lower-bound estimate of risk;
(iv) each significant uncertainty identified in the process of the assessment of [risk] effects and the studies that would assist in resolving the uncertainty; and
(v) peer-reviewed studies known to the [agency] that support, are directly relevant to, or fail to support any estimate of [risk] effects and the methodology used to reconcile inconsistencies in the scientific data.”

Information quality requirements can be temporarily waived in cases of imminent threat to public health or homeland security.

Challenges to information
Under the Section 515 guidelines, “affected persons” can legally challenge any information disseminated by a federal agency at any stage of development, including draft form. Petitioners must clearly demonstrate that a specific dissemination of information does not meet the applicable quality standards. OMB, many business and industry groups, and some legal experts interpret Section 515 to apply to rulemaking, although a few legal analysts disagree. Agencies must respond to requests to correct information according to timeframes established by their own guidelines and must provide a process for appeal. Most legal analysts say that judicial review of final decisions is available, although some disagree.

EPA assessment factors for third-party information
Section 515 guidelines apply to all federal agencies subject to the Paperwork Reduction Act, but implementation of the guidelines by the U.S. EPA is of most interest to readers of this newsletter. EPA guidelines to implement Section 515 (Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility and Integrity of Information Disseminated by the Environmental Protection Agency) can be found on the EPA website at https://www.epa.gov/oei/qualityguidelines/. The guidelines spell out which information is subject to the guidelines, what quality standards apply, and how to submit a request for correction of information.

Of additional interest is EPA’s draft document Assessment Factors for Evaluating the Quality of Information from External Sources (https://www.epa.gov/oei/qualityguidelines/af_assessdraft.pdf). EPA receives a huge amount of information from third-party sources. Pesticide companies seeking registration, for instance, must provide hazard information. Trade groups and environmental groups provide information during the rulemaking process. State agencies implementing EPA rules provide monitoring data, laboratory results and analytic studies. And EPA gathers information from outside sources—such as scientific journals and academic and research institutions—for use in developing policy and regulatory decisions. EPA does not apply quality controls when such information is being generated but must apply quality controls when it uses or disseminates this information. Although EPA has not yet announced an intention to require use of the information quality guidelines by those who provide information, the agency has published for review the draft cited above. In the draft, EPA states that its goal is not to impose legally binding obligations but to make the assessment factors broadly known to those who generate information.

Current challenges
If the potential of the Data Quality Act is indicated by the nature of challenges to information filed, then a look at some of those challenges suggests the act will be used not just by business and industry but by an array of parties for an array of reasons. Requests for correction of information filed with EPA can be found at https://www.epa.gov/oei/qualityguidelines/af_req_correction_sub.htm. Included are:
 

  • A request by the Center for Regulatory Effectiveness, the Kansas Corn Growers Association and the Triazine Network for correction of information in EPA’s Atrazine Environmental Risk Assessment. The petitioners challenge EPA’s published reference to a scientific study suggesting that atrazine has endocrine-disrupting effects on frogs (See WRRI News March/April 2003). The petitioners allege that there are no validated endocrine-effects tests for atrazine. 
  • A petition by Senators Jim Jeffords, Paul Sarbanes, Barbara Boxer and Frank Lautenberg challenging information upon which EPA based a decision to postpone a permit deadline for a storm water Phase II regulation for the oil and gas industry. The Senators had the General Accounting Office conduct an evaluation of the information, and they say the evaluation found some critical data to be out of date.
  • A request by the Competitive Enterprise Institute for EPA to remove Internet links to (and thereby stop disseminating) the Climate Action Report of 2002 because data in the report fail to meet the standards of the Data Quality Act.
  • Requests by the Ohio EPA that U.S. EPA correct formatting problems in a document on its website that is relied on for information on reasonably available control technology (RACT) for volatile organic compound (VOC) emissions, and that it make another document relating to control of VOCs available in electronic form and correct confusing writing in that document.
  • A request from the U.S. Chamber of Commerce, which says it has been involved in a hard-fought battle to see the Data Quality Act passed and implemented, that the minutes of an October 1, 2002, meeting of the EPA Science Advisory Board be corrected to include a statement made by SAB Chairman Dr. William Glaze regarding EPA’s failure to validate a sizeable number of models used by the agency. (EPA says minutes of meetings are not covered by the guidelines.)
  • A request from BMW asking for correction of EPA’s Enforcement and Compliance History Online database and other databases showing BMW in Significant Non-Compliance with RCRA.


As a further indication of the can of worms the Data Quality Act opened, on April 3, 2003, Public Employees for Environmental Responsibility (PEER) announced that it had lodged a Freedom of Information Act request with the Corps of Engineers asking for its Information Quality Guidelines and had asked OMB to take action to ensure compliance with the Data Quality Act by Pentagon agencies. PEER wants to prepare a series of requests for correction of information regarding Corps water development project studies that it says understate environmental consequences or overstate development benefits, but it cannot do so because the Corps has not issued its guidelines.

Resources

Anderson, Frederick R. 2002. “Data Quality Act.” The National Law Journal, October 14, 2002. (https://www.nlj.com/)

Davis, Joseph A. 2003. “Industry Test-Fires New Secrecy Weapon.” Metcalf Institute Environment Writer, December/January 2002-2003. (https://www.environmentwriter.org/resources/articles/1202_dataquality.htm)

Pielke, Roger, Jr. 2002. “Flying Blind: The Data Quality Act and the Atmospheric Sciences.” WeatherZine No. 33, April 2002. University of Colorado Center for Science and Technology Policy Research. (https://sciencepolicy.colorado.edu/zine/archives/33/editorial.html)

“Little law could block major government decisions.” Environmental Science & Technology Online. November 7, 2002. (https://pubs.acs.org/subscribe/journals/esthag-w/2002/nov/policy/rr_epaguidelines.html)

Noe, Paul, Frederick R. Anderson, Sidney A. Shapiro, James Tozzi, David Hawkins and Wendy E. Wagner. 2003. “Learning to Live with the Data Quality Act.” ELR 33:10224-10236. (https://www.eli.org/)

National Research Council. 2003. Ensuring the Quality of Data Disseminated by the Federal Government: Workshop Report. Washington: National Academies Press.

“Corps Violates Data Quality Act.” April 3, 2003. Public Employees for Environmental Responsibility website. (https://www.peer.org/press/326.html)

“The Data Quality Act.” N.d. U.S. Chamber of Commerce. (https://www.uschamber.com/isr/dqa.htm)

“Data Quality.” N.d. The Center for Regulatory Effectiveness website. (https://www.thecre.com/quality/20030211_cei.html)
