Original url: http://www.ombwatch.org/article/archive/231?TopicID=2
 
Democracies die behind closed doors
 
The Data Quality Act passed through Congress in Sec. 515 of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658). The guidelines, implemented in 2002, could be misused to delay, manipulate, and unfairly affect the outcome of federal agencies' activities.
The Data Quality Act

OMB's guidelines were required by an appropriations rider, sometimes referred to as the "Data Quality Act," contained in Sec. 515 of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658). There were no hearings on the rider -- which was added at the last second by Rep. Jo Ann Emerson (R-MO), who serves on the Appropriations Committee -- and no debate on the floor, leaving little to judge congressional intent besides the legislative language.

Specifically, the law directed OMB to issue, by Sept. 30, 2001, "policy and procedural guidance to Federal agencies" subject to the Paperwork Reduction Act (44 U.S.C. chapter 35) requiring that they issue their own guidelines ensuring and maximizing the quality, objectivity, utility, and integrity of the information they disseminate; establish administrative mechanisms allowing affected persons to seek and obtain correction of information that does not comply with the guidelines; and report periodically to OMB on the number and nature of correction complaints received and how they were resolved.

This rider builds on an industry lobbying effort to put roadblocks in the regulatory process. As noted by the Center for Regulatory Effectiveness, a strong advocate for the rider, there was similar report language added to the FY 99 Omnibus Appropriations Act (PL 105-277), also added at the last second without any debate.

The report language (House Report 105-592, at 49-50) directed OMB to develop "rules providing policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies, and information disseminated by non-Federal entities with financial support from the Federal government." The report language, incorporated by reference in the conference report, also calls for administrative mechanisms for error correction to be established at each agency. Frustrated that OMB never issued guidelines based on the report language, CRE and other pro-industry representatives persuaded Emerson to write the requirement into law. Unlike the report language, however, the Emerson rider does not specifically apply to "non-Federal entities with financial support" from the government.

Some have expressed concern that the Data Quality Act (DQA) also builds on another appropriations rider offered on the FY 99 Omnibus Appropriations Act by Sen. Richard Shelby (R-AL). The Shelby amendment directed OMB to revise OMB Circular A-110 -- a circular dealing with grants to nonprofits -- to allow public access to federally funded research data through Freedom of Information Act (FOIA) requests. The rider applies only to research done by federal grantees, not contractors, and, like the DQA, was enacted without any public debate or scrutiny. Some speculated that industry would request underlying data from universities, for instance, potentially stifling ongoing research that might lead to regulation by federal agencies.

OMB's Implementing Guidelines

OMB published final implementing guidelines in the Federal Register on Sept. 28, 2001. In doing so, OMB requested additional comments on the "reproducibility" standard and the related definition of "influential" information (discussed below), which were issued on an interim final basis. Final guidance on these issues was published in the Federal Register on Jan. 3, 2002, corrected on Feb. 5, 2002, and republished on Feb. 22, 2002 (Vol. 67, No. 36, pg. 8452).

Specifically, in developing their own guidelines, which are to be finalized by Oct. 1, 2002, agencies are to adopt standards of quality appropriate to the nature of the information they disseminate; establish administrative mechanisms allowing affected persons to seek and obtain correction of information that does not comply with the guidelines; and report periodically to OMB on the number and nature of correction complaints received.

OMB notes that its guidelines are intended to allow agencies to incorporate their existing practices in a "common-sense and workable manner," rather than "create new and potentially duplicative or contradictory processes." For example, OMB acknowledges that under OMB Circular A-130, agencies already address data quality issues. At the same time, OMB prescribes a number of requirements that go beyond the statute, instructing that "agencies should not disseminate substantive information that does not meet a basic level of quality."

There is significant debate on whether the new law and OMB's guidelines create any new judicially reviewable process. Industry lobbyists suggest that the administrative mechanisms for error correction, including the appeals process, establish new legally reviewable responsibilities. In their draft guidelines, several agencies argue the opposite. OMB has taken no position on this.

Through all of the above requirements, the definition of "quality" information is crucial. As stated above, the statute directs guidelines that ensure "quality, objectivity, utility, and integrity" of information disseminated to the public. OMB treats "quality" as "an encompassing term comprising utility, objectivity, and integrity" and provides definitions for each of these constituent terms.

The definitions for "utility" and "integrity" appear relatively benign. "Utility" refers to information's usefulness to the public. "Integrity" refers to the "protection of the information from unauthorized access or revision." (Because OMB's guidelines encourage flexibility, agencies may have their own unique, and perhaps more specific, definitions for these terms.) The expansive definition of "objectivity" is where things get complicated, as OMB packs in a number of controversial requirements.

OMB instructs that "objectivity" contains two elements. The first involves presentation -- whether information "is being presented in an accurate, clear, complete, and unbiased manner. This involves whether the information is presented within a proper context." (emphasis added) How this is woven into administrative mechanisms for error correction remains unclear and will likely be a contentious issue for some agencies.

The second element involves the substance -- whether the information itself is "accurate, reliable, and unbiased" and uses "sound statistical and research methods." For dissemination of information that is "influential," OMB calls for even higher standards of data quality. Specifically:

"Influential" scientific, financial, or statistical information must be "reproducible." "Influential" means that the agency can "reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions." Such information must be reproducible by "qualified third parties," meaning the same result would be achieved following reanalysis. Accordingly, agencies must offer a "high degree of transparency about data and methods" to ensure reproducibility, which is applied differently for three types of "influential" information:

Peer review may be necessary to demonstrate objectivity. According to OMB, "'objectivity' involves a focus on ensuring accurate, reliable, and unbiased information," which can be achieved "using sound statistical and research methods." Independent, external peer-reviewed information "may generally be presumed to be of acceptable objectivity." However, OMB throws in a kink: peer review may not be adequate to demonstrate "objectivity" if a "persuasive" showing is made to the contrary. OMB's guidelines do not direct peer review panels to be balanced in terms of viewpoints. Rather, as recommended by OMB on Sept. 20, 2001, peer reviewers are to be selected "primarily on the basis of necessary technical expertise," and any financial conflicts of interest or related policy positions taken prior to peer review must be disclosed to the agency, but not necessarily the public.

Agencies must "adapt or adopt" requirements for risk assessment under the Safe Drinking Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)). "With regard to analysis of risks to human health, safety, and the environment maintained or disseminated by the agencies," agencies are to meet principles laid out in the Safe Drinking Water Act (SDWA), which are perhaps the most rigorous standards for risk assessment written into statute.

Previously, John Graham, administrator of OMB's Office of Information and Regulatory Affairs (OIRA), had issued an agency-wide memo on regulatory analysis that also pressed SDWA principles for risk assessment, saying that agency proposals employing these methods would be viewed more favorably by OIRA -- which must grant clearance to all health, safety, and environmental protections before they can take effect. Graham, who was in charge of formulating OMB's data quality guidelines, seized the opportunity to achieve formal adoption of these risk assessment principles across agencies.

The SDWA places particular emphasis on "peer-reviewed science and supporting studies" and asks for very detailed information about the risk being examined. For instance, the agency is to identify "each population" affected, the "expected risk" for each of these populations, and "each significant uncertainty" that emerges in the risk assessment. Graham has said such rigor, specifically the practice of agency peer review, should satisfy the "objectivity" requirement of the guidelines.

The U.S. Chamber of Commerce and other industry representatives argue that all information related to a rulemaking, regardless of whether it was generated by a third party, should be presumed to be "influential" and subject to the robust requirements suggested by OMB. Accordingly, industry would like agencies to label information as "influential" or "non-influential" and to define "influential" broadly. Industry does not want agencies to define "influential" as equivalent to an "economically significant" rule under Executive Order 12866, which it considers too narrow.

Background and Data Quality History