TheCRE.com
Reg Week®: CRE Regulatory Action of the Week

CRE Submits Comments on "Reproducibility" Standard in OMB's Data Quality Guidelines
CRE submitted comments to OMB on the "capable of being substantially reproduced" standard included in the agency's final Data Quality guidelines. The reproducibility standard was issued on an interim final basis, and OMB accepted additional public comments on that key aspect of the guidelines through October 29, 2001. CRE strongly supports OMB's reproducibility requirement as a standard of care for governmental information. If information is not sufficiently robust to be reproduced by independent parties across testing environments, it should not be deemed adequately reliable for dissemination to the public. CRE urges OMB to retain this important aspect of the guidelines.




    October 26, 2001

     

    Ms. Brooke J. Dickson
    Office of Information and Regulatory Affairs
    Office of Management and Budget
    New Executive Office Building, Room 10236
    725 17th Street, N.W.
    Washington, DC 20503

    Re: Comments on OMB’s Interim Final Data Quality Guidelines

     

    Dear Ms. Dickson:

    The Center for Regulatory Effectiveness (CRE) has maintained a significant and ongoing interest in Data Quality, and we compliment OMB on the timely issuance of effective Data Quality guidelines that will help ensure high standards for information the government disseminates to the public. The Center is therefore pleased to offer the following comments in response to OMB’s request for additional public input on the term "capable of being substantially reproduced" in OMB’s "Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies," 66 Fed. Reg. 49718 (September 28, 2001).

    From the outset, I would like to express CRE’s strong support for OMB’s Data Quality guidelines, particularly the requirement that analytic results be "capable of being substantially reproduced." In the comments that follow, CRE provides its recommendations for further refinements and clarifications to the guidelines.

     

    Reproducibility/Replicability of Results Across Testing Environments

    (1) When an Agency Uses Data From Validated Tests

    • Tests (animal and/or human) that were developed and validated through an open consensus process can be presumed to have demonstrated that they generate the same data at any laboratory using the same standard protocol, as long as the tests are being used to detect the same end points for which they were developed. Thus, for data generated from such consensus tests:

    - The disseminating agency should be able to presume that the test data satisfy inter-laboratory consistency concerns for all the end points encompassed by the validation and demonstration.

    (2) When an Agency Uses Data From Non-Validated Tests

    • Tests (animal and/or human) that were not developed and validated through an open consensus process cannot be presumed to generate the same data at any laboratory. Thus, when data have been generated from such non-consensus tests:

    - The disseminating agency should be required to demonstrate and document that the tests will generate substantially the same data at any laboratory conducting them properly, for all the involved end points, before the agency disseminates information containing or based on the data.

    (3) When an Outside Party Submits Contradictory Information

    • When a party outside the agency submits information to the agency that appears to demonstrate that other laboratories generated significantly different data using the same tests and test protocols for end points involved in an information dissemination:

    - The disseminating agency should be required to consider these data and resolve the discrepancy in accordance with sound scientific principles before the agency disseminates the relevant information.

    - If subsequent attempts to replicate the original experiment or observations by other qualified persons demonstrate that the originally reported results are not, in fact, substantially reproducible, those original results must be regarded as unreliable.

    Additional Recommendations

    Need for Validated Models

    • Applying unvalidated models to data sets can produce "reproducible" but unreliable results.

    - CRE recommends that the definition of "reproducibility" also include tests of robustness, so as to exclude results that, because they are model-driven, appear reproducible but may not be reliable.

    Minimum Peer Review Requirements

    • On September 20, 2001, Dr. John Graham issued an important memorandum and attachment setting forth OIRA regulatory review principles and procedures.

    - CRE recommends that the Data Quality guidelines incorporate, either directly or by reference, the attachment's discussion of peer review. The standards set for peer review by this document would enhance the objectivity and transparency of the peer review process.

    - The presumption of objectivity based on peer review, which is currently stated in the guidelines, should be clarified to require that agencies follow the peer review recommendations.

     

    Data Quality Guidelines Linkage to the Paperwork Reduction Act

    • OMB’s Data Quality guidelines note that, "Many comments raised questions and concerns about how these guidelines interact with existing statutes and policies, including the Paperwork Reduction Act [PRA] .... We have attempted to draft these guidelines in a way that addresses the requirements of section 515, but does not impose a completely new and untried set of standards upon Federal agencies."

    CRE recommends that the linkage between the guidelines and the PRA be strengthened so as to clarify the role of the guidelines vis-à-vis the PRA and thus reduce any uncertainty regarding the standards to be set for federal agencies. Specifically, we recommend that the OMB guidance describe the PRA as containing the original and fundamental statutory authority and directives, with the statutory language in the Appropriations Act being supplemental to the PRA.

    Objectivity of Formal Peer Review

    • The OMB guidelines state that "... we regard technical information that has been subjected to formal, independent, external peer review as presumptively objective. An example of a formal independent external peer review is the review process used by scientific journals." CRE agrees with the presumption that this process is objective; however, we believe that the presumption should be rebuttable.

    - CRE recommends that, if outside parties are able to demonstrate that a peer review was not objective (i.e., that the process was not independent or external, or that it was driven by policy rather than scientific considerations), agencies must consider such evidence and, if warranted, decline to accept the objectivity of the technical information.

     

    Safe Drinking Water Act Language

    • The OMB guidelines use language from the Safe Drinking Water Act (SDWA) requiring the use of "the best available, peer-reviewed science and supporting studies conducted in accordance with sound and objective scientific practices."

    - The Safe Drinking Water Act provides a standard to which all government information dissemination should be held. CRE strongly supports continued use of the SDWA language.

    Influential Scientific or Statistical Information

    CRE supports OMB’s definition of "influential scientific or statistical information" as well as the high standards set for information in this category.

    CRE appreciates this opportunity to comment, and we would be pleased to further assist the agency regarding Data Quality Act implementation, as appropriate. Please feel free to contact me, should you have any questions or require further clarification.

     

    Sincerely,

    Jim J. Tozzi
    Member, CRE Board of Advisors