Statement of
William L. Kovacs
Vice President
U.S. Chamber of
Commerce
Before the
Committee on
Government Reform
Subcommittee on
Regulatory Reform
U.S. House of
Representatives
On the Subject
of Improving Information Quality in the Federal Government
July 20, 2005
Madam Chairman, members of the subcommittee, thank
you for inviting me to testify on “Improving Information Quality in the Federal
Government.” I am William Kovacs, Vice
President of the Environment, Technology, and Regulatory Affairs division at
the U.S. Chamber of Commerce. The
Chamber is the world’s largest business federation, representing more than
three million businesses of every size, sector, and region.
The quality of information that the public relies on
when making decisions is a matter of importance to all of us. For me to have confidence that my decisions
are sound, I must first have good information.
This is just plain common sense.
Similarly, members of Congress must be able to rely on their staffs, as
well as the Congressional Research Service, to provide good information. In the business sector, tens of billions of
dollars are spent to secure good quality information for decision making. Why then shouldn’t we expect U.S. government
agencies to do the same? That is, why
shouldn’t we expect government agencies to utilize good information when
developing regulations and disseminating information that impacts our lives,
businesses, and institutions? After
all, since the cost of regulation is estimated at approximately $850 billion
annually,[1]
the government must assume some responsibility for ensuring that its mandates are supported
by good quality data. Doesn’t that make
sense?
The Information Quality Act (IQA) seeks to ensure
that our government’s decisions are based on good quality data. The IQA requires federal agencies to ensure
and maximize the quality, objectivity, utility, and integrity of disseminated
information and establishes a system whereby interested parties can seek
correction of erroneous disseminated information. The Chamber has been a strong proponent of the IQA, because by
utilizing sound data, we can assure ourselves that, as a nation, we are
focusing our resources on the problems that need to be addressed, and that our
decisions are based on good quality information.
Before turning to the specifics of my testimony, let
me address a mischaracterization of the IQA raised by those who oppose its
implementation. The IQA has frequently
been derided as a tool of industry, which critics claim is being used to
conduct an “end-run” around environmental and employee safety regulations. One particularly vociferous critic has even
charged that agencies “can’t afford the time or expense of revamping [incorrect data]. Correcting the errors would take EPA away from other priorities.” Nothing could be further from the
truth. The IQA is designed to promote
integrity in the agency decision making process, and to enhance the accuracy of
the data underlying government regulatory decisions. As such, the IQA is a tool for everyone—from businesses to
environmentalists to citizens—providing all an equal opportunity to correct
faulty government data, and promoting confidence in government decision
making. Moreover, because of the
difficulties in mounting an IQA challenge, agencies have received very few
substantive petitions for correction.[2] Truth be told, it is hard work developing a
data quality petition. It requires
conducting complex factual and scientific research, obtaining expert opinions,
and understanding a myriad of federal regulations. Perhaps this is why so few data quality petitions have been
filed. Notwithstanding these number-counting exercises, in the end, the data used by federal regulators must be
correct; if it is not, then every activity that uses the flawed data will have
flawed results.
While the available facts establish that application
of the IQA is not overly burdensome on federal agencies, there remain questions
about the efficacy of the IQA. Federal
agencies have strongly resisted compliance with the IQA. They have taken the position that it is not
judicially reviewable and that determinations about the quality of
data
used by an agency are solely within the discretion of the agency.[3] Simply put, agencies want sole discretion
over what data to use, regardless of whether it is the best data, or even
correct data.
Because of the importance that the Chamber attaches
to the government’s use of good quality data, it has undertaken two significant
data quality challenges that aim to address agency resistance to the IQA. First, the Chamber has filed a challenge to
data disseminated by the United States Department of Health & Human
Services (HHS) concerning the relationship between salt and hypertension. This “salt litigation” seeks to establish
the judicial reviewability of the IQA.
Second, the Chamber has filed a data inconsistency correction request
with the United States Environmental Protection Agency (EPA) over numerous
chemicals listed in its various databases.
The problem is essentially this: depending on which database you look
in, you will find vastly different numerical values for the same chemical when
these values should be exactly the same.
These discrepancies among the databases disseminated by EPA create
significant, arbitrary differences in risk assessment outcomes and enforcement
activities.
I will briefly discuss each of these important IQA
challenges in turn.
Salt Litigation
On
April 15, 2005, the Chamber filed an Appellate Brief with the 4th
Circuit Court of Appeals as part of the Chamber’s litigation against HHS. The litigation stems from the agency’s
denial of the Chamber’s IQA petition, which included a request for disclosure
of information that the agency relied on in concluding that salt has
significant adverse health effects on the general population. HHS denied the petition, as well as a
subsequent administrative appeal, insisting that its recommendation on salt
intake was scientifically sound while steadfastly refusing to make the
requested information available, which would allow the public to test the
quality of HHS data against the conclusions drawn from it. For this
reason, the Chamber, together with the Salt Institute, sued the agency seeking,
among other things, to compel release of the information for use in determining
the reproducibility of the HHS findings.
The lawsuit also seeks a ruling on whether the IQA is judicially
reviewable.
The
district court dismissed the lawsuit for lack of standing and also held that an
agency's disposition of an IQA-based information and correction request is
solely within the discretion of the agency. The Chamber is appealing the
court’s decision, arguing that the IQA creates information rights that become
judicially enforceable under the Administrative Procedure Act after there has
been final agency action on an IQA petition and appeal. The National Association of Home Builders
and the Grocery Manufacturers of America have also filed amicus briefs with the
4th Circuit on this issue.
If
the district court’s decision is reversed on appeal—as the Chamber believes it
will be—the decision will enable parties to seek judicial review of an agency’s
final disposition of IQA petitions.
Conversely, if the Chamber does not prevail in its court challenge to
establish judicial reviewability of the IQA, Congress will then either have to
provide for judicial review, or accept the contention that federal agencies
have sole discretion over the quality of information disseminated to the public
and to Congress.
Data
Inconsistency
A
second initiative of the Chamber concerns data inconsistencies within databases
and models disseminated by EPA. This
information is used, for example, in understanding how chemicals are
distributed in the environment, in performing risk assessments, and in determining
remedial measures for contaminated sites and natural resource damages.
The Chamber, through a request for correction filed
with EPA, set forth comparisons of different databases showing that the data
disseminated by the agency are inconsistent and faulty. The Chamber also provided evidence
demonstrating how the use of such faulty data can cause the unnecessary
expenditure of tens of millions of dollars in cleanup costs at a contaminated
site. The Chamber suggests that such unwarranted costs, aggregated over all the uses to which such data are employed,
would amount to the unnecessary expenditure of billions of dollars without a
corresponding amount of protection for health and safety. In its request for correction, the Chamber
cited questionable databases that are used, for example, to assess the
environmental impacts of groundwater contamination, leaking underground storage
tanks, MTBE in ground water, Superfund hazardous waste cleanups, occupational
exposures, and natural resource damage claims.
To appreciate the extent of such activities, consider that there are
more than 12,000 active and inactive Superfund sites in the United States. There is little doubt that improving the
faulty data could lead to better regulatory decisions; reduce uncertainties;
mitigate the prospect of time-consuming litigation; and reduce instances in
which scarce resources (time and capital) are wasted addressing the wrong
problem, or the right problem in the wrong way.
In its request for correction, the Chamber asked that
the erroneous data be corrected. To
understand the complexity of the correction request, it is necessary to
recognize that there are two types of problems with the disseminated databases
and models: [1] there are data inconsistencies among them; and [2] even leaving
aside those inconsistencies, the databases and models contain erroneous data and data of uncertain quality, and assuring that all of the individual data associated with the databases and models are reliable is a challenging undertaking.
Data inconsistency is relatively easy to
understand. It occurs when the same
chemical has a different numerical value depending on which database you are
looking at. For example, in the
ChemFate database, one particular property parameter, Kow for total
PCBs,[4]
is assigned a value of 7,900, whereas in the Soil and Transport Fate database,
the same Kow for total PCBs is assigned a value of 169 million. Both values cannot be right, and the choice
of which value to use will ultimately result in vastly different assessments
and remediation costs when applied to real world cleanup decisions.
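To see how stark the discrepancy is, consider the following short Python sketch. It is illustrative only: the database names and the two Kow values are taken from the example above, while the tolerance and code structure are assumptions made for this illustration. The sketch flags a parameter whose reported values differ by more than a fraction of an order of magnitude across databases:

import math

# Reported Kow values for total PCBs, as cited in the example above.
reported_kow = {
    "ChemFate": 7_900,
    "Soil and Transport Fate": 169_000_000,
}

def log10_spread(values):
    """Return the spread, in orders of magnitude, among reported values."""
    logs = [math.log10(v) for v in values]
    return max(logs) - min(logs)

# Flag the parameter as inconsistent if its values differ by more than
# half an order of magnitude (an assumed tolerance for illustration).
spread = log10_spread(reported_kow.values())
print(f"Spread: {spread:.1f} orders of magnitude")
print("Inconsistent" if spread > 0.5 else "Consistent")

Run against the two values above, the sketch reports a spread of more than four orders of magnitude, a gap far larger than anything site-specific conditions could plausibly explain.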
Unfortunately, making the data in the databases
consistent is only the first step. The initial data selected must also be
reliable. Assuring this latter objective
is a more difficult undertaking. To
understand the problem in simple terms, imagine that in one database the price
of a quart of milk is listed as $10 million and in a second database the price
of a quart of milk is listed as $5.
Officials responsible for establishing consistency between the two
databases meet and subsequently revise the two databases, but now in each
database the price of a quart of milk is listed as $15,000. So there is certainly consistency—both
databases yield the same answer—but the answer happens to be wrong, as a quart
of milk certainly doesn’t cost $15,000.
Analogously, problems with the data entries in databases and models
disseminated by EPA need to be addressed, because many, if not most, of the
data entries in the databases are not well established. In fact, one request the Chamber made to its
consultant, Cambridge Environmental, was to check EPA’s original research to
determine if appropriate data values were properly reflected in the
databases. The conclusion, for the several values examined, was that information reported in the original research was not properly carried over, resulting in incorrect data entries in the disseminated databases and models.
How to Address This Problem
The Chamber believes that addressing this problem
requires developing and applying an agreed-upon standard methodology for critical review of data—something that, as required by Congress, the National Institute of Standards and Technology does so well and which has also been done by the U.S. Geological Survey. This is why assembling a federal interagency work group to look at the problem would be a desirable course of action, as the intellectual expertise of federal employees who understand this issue resides collectively among various government agencies. The Chamber contends that such an interagency group could establish an efficient process for making progress on this matter.
The Chamber Provided EPA With All of Its Information
This is not a game of “gotcha.” Getting the data right is a serious matter
with consequences potentially impacting every risk assessment developed by
government, every environmental cleanup, and every natural resource damage
claim. It will even impact what new
chemicals can go on the market.
Recognizing the seriousness of this issue, the Chamber provided EPA not
only with petitions, but also with the research it had commissioned from
Cambridge Environmental, including all attachments and a copy of a key study
performed by the U.S. Geological Survey.
The Chamber gave EPA all of its research, including simple, clear
examples of the data inconsistencies.
EPA’s Response – A Refusal to
Consider the Facts
EPA’s response to the Chamber’s correction request
literally ignored the issue raised. EPA
responded that:
1. The databases and models in question are individually in conformance with the EPA’s Information Quality Guidelines.
2. It temporarily removed one database from its web site, but did not acknowledge any problems.
3. Some databases were superseded by new databases (an action that is not guaranteed to fix the problem).
4. A valid reason for differing values among databases is site-specific conditions.
5. Ownership of databases and models resides with contractors or third parties, and the responsibility for correctly using them and determining the quality of the data therein rests with the user, not EPA.
6. Disclaimers have been attached to, or made in regard to, certain databases and models.
The Chamber Sent EPA’s
Response Back to Cambridge Environmental for Review
Cambridge
Environmental found that:
1. Database and model errors cannot be explained away by invoking site-specific conditions. Such conditions account for only a small portion of the variances in the data.
2. Peer review was poor; in some instances it did not occur at all, and in other cases the wrong information was used.
3. Databases that supersede older databases are not necessarily correct, because errors propagate from one information source to another.
4. EPA funded the development of databases and models whose reliability it failed to properly assess.
5. In various ways, EPA disclaimed responsibility for the quality of disseminated information. One such example of disclaimer language is: “This software and the accompanying files are provided as is and without warranties whether expressed or implied. The user assumes the entire risk of using the program.”[5]
In sum, EPA refused to examine inconsistencies among
disseminated models and databases; refused to accept responsibility for the
quality of the models and databases it disseminates, instead passing
accountability to contractors, third parties, or users of the databases and
models or issuing disclaimers; and failed to adequately peer review the
databases and models. This is both
arrogant and irresponsible.
* * * * *
Madam Chairman, the Chamber can provide Congress
with all of the written information developed on this issue that has been
communicated to federal government officials, including expert reports and
attachments. Moreover, for the record,
the Chamber was informed on July 12, 2005, by Igor Linkov of Cambridge
Environmental, that the Cambridge Environmental study was submitted to the
prestigious journal, Environmental
Science & Technology, and has been successfully peer reviewed and
accepted for publication.
Conclusion
In conclusion, the Chamber remains hopeful that the
courts will affirm the judicial reviewability of the Information Quality Act in
the near future. As to the problems
among databases and models that EPA disseminates, the Chamber suggests that the
administration or Congress establish an interagency panel that includes the
National Institute of Standards and Technology, the U.S. Geological Survey, and
other federal agencies that use the disseminated information. The purpose of the interagency panel will be
to examine how physical chemical property data associated with disseminated
databases and models can be critically reviewed to improve their reliability.
I thank this committee for the opportunity to
present the Chamber’s views and recommendations about the Information Quality
Act and its utility.
[1] W. Crain and T. Hopkins, The Impact of Regulatory Costs on Small Firms, Report RFP No. SBAHQ-00-R-0027, for the Office of Advocacy, U.S. Small Business Administration (July 2001).
[2] Some individuals have argued that the IQA is just another tool for regulatory obstruction. But is it? According to FY 2003 annual agency reports sent to OMB, 19 federal agencies and departments received 24,619 requests for correction. This may seem like a burdensome number; however, it isn’t, because, of these requests, 24,433 were submitted to the Federal Emergency Management Agency (FEMA) for minor revisions and amendments to flood insurance rate maps. FEMA typically receives thousands of such requests year in and year out. With the advent of the IQA, FEMA has processed such requests through its information quality process. As such, the IQA did not stimulate these requests; rather, it merely provided an alternative means to address them. Similarly, of the 89 correction requests received by the Department of Transportation, 87 concerned individual data items on motor carrier safety reports. The point of these statistics is that, excluding FEMA, 18 federal agencies and departments received just 186 requests for correction. OMB deemed 30 to 40 of these substantive in nature, and only eight influential. Of the eight influential requests for correction, four were denied outright, one was partially addressed through a process change, and three were still pending at the close of the FY 2003 reporting period. In other words, the regulatory process has not come to a grinding halt as a result of being swamped by correction requests submitted by business and industry stakeholders. This fact contradicts those who view the IQA as a tool for regulatory obstruction.
[3] A June 10, 2002, memorandum from John Graham, Administrator of the Office of Information and Regulatory Affairs, Office of Management and Budget, to the President’s Management Council discusses the “appeals mechanism” for IQA denials. In the memo, issued at the time most agencies were in the process of developing their IQA Guidelines, Graham states that an agency’s asserting in its IQA Guidelines that IQA denials are not judicially reviewable does not necessarily make it so. Specifically, he states that agencies should be aware that their statements regarding judicial enforceability might not be controlling in the event of litigation. Graham goes on to say: “We note, in this regard, that a number of agencies emphasize that their guidelines are not intended to provide any right to judicial review. A few agencies even stress that their guidelines may not be applicable based on unspecified circumstances and that the agency may be free to differ from the guidelines where the agency considers such action appropriate. Regardless of what kinds of litigation-oriented disclaimers the agencies may include, agency guidelines should not suggest that agencies are free to disregard their own guidelines. Therefore, if you believe it is important to make statements that your agency’s guidelines are not intended to provide rights of judicial review, we ask that you not include extraneous assertions that appear to suggest that the OMB and agency information quality standards are not statements of government-wide policy, i.e., government-wide quality standards which an agency is free to ignore based on unspecified circumstances.”
See also Brief for the Appellee at 30, Salt Institute v. Michael O. Leavitt, 345 F. Supp. 2d 589, No. 05-1097 (4th Cir., 2004), in which the U.S. Department of Justice states, “It is well established, however, that an agency’s reports and other statements lacking the force and effect of law do not constitute final agency action within the meaning of the APA.”
[4] Kow is a coefficient representing the ratio of a compound’s solubility in octanol (a non-polar solvent) to its solubility in water (a polar solvent). It is generally used, for example, as a relative indicator of the tendency of an organic compound to adsorb to soil.
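Expressed as a simple formula (a standard definition, given here for clarity), Kow is the equilibrium ratio of the compound’s concentration in the octanol phase to its concentration in the water phase:

\[ K_{ow} = \frac{C_{\text{octanol}}}{C_{\text{water}}} \]

Because these values span many orders of magnitude, they are often reported as log10 Kow; the two figures for total PCBs cited in the testimony, 7,900 and 169 million, correspond to log Kow values of roughly 3.9 and 8.2, respectively.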
[5] Refer to footnote 8 of the Chamber’s April 11, 2005, Request for Reconsideration of its Request for Correction.