PUBLIC COMMENT ON THE CRE DRAFT DATA QUALITY RULE

Commenter: Roger Walk, Ph.D.

Affiliation: Director, Worldwide Scientific Affairs, Philip Morris USA

Comment:

June 26, 2000

Center for Regulatory Effectiveness
11 Dupont Circle, NW, Suite 700
Washington DC 20036

Re: Interactive Public Docket comment on the CRE Proposed Draft Data Quality Regulation

In response to your public encouragement to participate in the Interactive Public Docket on the CRE Proposed Draft Data Quality Regulation, please find Philip Morris USA's comments attached.

We are an enthusiastic and serious participant in such interactions and bring talented resources and expertise to this communication and collaboration. As stated on our home page, www.philipmorrisusa.com, Quality is one of our core values, and we are committed to quality in everything we do at Philip Morris USA.

In the attached document, Philip Morris USA presents comments and recommendations on the proposed regulation. We strongly encourage requirements for the use of accredited, recognized, and validated methods when available, including appropriate peer review, to form the basis for data quality standards and appropriate research practices.

We believe Data Quality forms the basis for high-quality research, collaboration, the exchange of information, and constructive dialogue.

Sincerely,

Dr. Roger Walk
Director, Worldwide Scientific Affairs
Philip Morris USA


PHILIP MORRIS U.S.A.
P.O. Box 26583
Richmond VA 23261-6583

Comments On:

Center for Regulatory Effectiveness (CRE) Proposed Draft OMB Data Quality Regulation: Standards for Agency Regulations Governing Data Quality, EXCERPT: 2. Identification of substantive standards for data quality.

Posted for public comment at http://www.thecre.com/quality/dqreg.html by the Center for Regulatory Effectiveness (CRE), 11 Dupont Circle, NW, Suite 700, Washington DC 20036

June 26, 2000

SECTION I: OVERVIEW AND BACKGROUND

In order to assist the Office of Management and Budget (OMB) in meeting its mandate from Congress, the Center for Regulatory Effectiveness (CRE) has developed a draft Data Quality regulation. This proposed regulation, Standards for Agency Regulations Governing Data Quality, defines four key data quality terms and lays out a proposed petition process and mechanism for correction of information. CRE posted the proposed regulation for public comment via the CRE website in March of this year.

Background: Pursuant to the FY 1999 Omnibus Appropriations Act (PL 105-277) and its conference report, Congress directed OMB to develop a regulation to ensure the "Quality, Objectivity, Utility, and Integrity" of information used or disseminated by the federal government. The rule, which was to be completed by September 30, 1999, was to include a petition process through which the public could petition the relevant agency for the correction of information that does not meet the standards laid out in the OMB rule. Individual agencies would then adopt conforming regulations to implement the OMB rule.

In this written comment, Philip Morris USA presents the following comments and recommendations on the proposed regulation to highlight the requirement that accredited, recognized, and validated methods be used when available, including appropriate peer review, to form the basis for data quality standards and appropriate research practices.

SECTION II: COMMENTS ON PROPOSED REGULATION {Note: Comments and recommendations will follow the paragraph numbering used by the CRE draft standard.}

CRE Proposed Standards for Agency Regulations Governing Data Quality. Section 2. Identification of substantive standards for data quality.

General Comment: Recommend that the proposed regulation include references to U.S. and worldwide data quality benchmarks.[1]

Section 2.a. Quality:

(i) agency information arises from a soundly-designed study using appropriate research practices including design, analysis and interpretation which produces results that are accurate, internally consistent, and valid; and
Comment: Recommend the inclusion of the above-noted terms, and also recommend the use of accredited, recognized, or validated methods that establish the context of appropriate research practices or established standards, e.g., Good Laboratory Practices[1] and ISO[2]. Data should also include estimates or measures of variability and uncertainty.
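
As an illustration of the kind of estimates and measures of variability and uncertainty recommended above, the following Python sketch (using hypothetical replicate measurements) reports a mean together with its standard error and a 95% confidence interval rather than the point estimate alone.

```python
import numpy as np
from scipy import stats

# Hypothetical replicate measurements of a single study endpoint.
measurements = np.array([4.1, 3.8, 4.4, 4.0, 3.9, 4.3, 4.2, 3.7])

n = measurements.size
mean = measurements.mean()
sem = stats.sem(measurements)  # standard error of the mean

# 95% confidence interval based on the t distribution with n - 1 degrees of freedom.
ci_low, ci_high = stats.t.interval(0.95, df=n - 1, loc=mean, scale=sem)

print(f"n = {n}, mean = {mean:.2f}, SEM = {sem:.2f}, "
      f"95% CI = ({ci_low:.2f}, {ci_high:.2f})")
```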

(ii)(a) Accuracy: Information is precise and can be verified by an objective, authoritative standard.

Comment: Recommend the use of external peer review in the absence of an authoritative standard through the use of appropriate expert panels.

(ii)(b) Comparability: Similar information should be structured so as to facilitate review and analysis across data sets.

Comment: The use of appropriate research practices as outlined in 2.a.(i) in this proposed regulation would meet the requirements of this section.

(ii)(c) Impartiality: In providing scientific information, there should be fair treatment in all aspects of study design and execution.

Comment: Recommend that "Impartiality" be defined as "the control of bias as part of research practices." Although "fairness" is an essential subjective element of effective discussion and conclusions, a clear definition of the criteria for judgment provides a quantitative measure.

(ii)(d) Representativeness: Data elements which are part of a study should provide a fair and accurate picture of the universe of potential subjects.

Comment: "Representativeness" requires that the data elements selected be representative of the body and volume of available information. Careful selection of data representative of the body of information will ensure a representative picture that is "fair and accurate." This includes the use of appropriate sampling techniques, which, when utilized will provide a "fair and accurate picture" representative of the total body of information. This analysis should include identification of sampling risks (alpha & beta) and having external peer review to identify any sampling bias.

(ii)(e) Validity: Agency data should represent a true value of those measures it is designed to measure.

Comment: "Validity" is determined through external peer-review and through the use of proper measures. External peer-review can "validate" that the appropriate measures or "quality criteria" were established beforehand, and that what was to be measured, was measured appropriately. Validity includes the appropriate selection and use of statistical methods for analysis and interpretation (e.g., correctly identifying probability distribution model (normal, exponential, Weibull, Poisson, binomial, etc.) and associated tests of hypothesis; using non-parametric methods of data analysis when data is not normally distributed; calculating and reporting the power of the analysis, and associated confidence limits. Any deviation from standard practice of using a certain method, model, form of statistical analysis, etc., should disclosed with documentation supporting the decision to deviate from standard practice.

Section 2.b. Objectivity.

(i) the data must exist as a fact independent of the expectations of researchers or their subjects.

Comment: Recommend changing section (ii) to read: "...is determined by how the researchers sought to prevent and account for the common causes of distortion or potential conflicts of interest." Data never exist "as a fact independent of expectations," as there is always the possibility of human intervention as part of the research practice (human error in design, execution, and analysis).

With respect to research results, it is also required that the results be substantially reproducible by persons with levels of expertise similar to the level of expertise of the persons who created the data, upon independent analysis of the underlying data.

The Proposed Rule also establishes guidelines for documentation. Specifically, creators of information should be meticulous in their documentation. All aspects of study design and execution should be clearly articulated, and all areas of uncertainty and contradiction should be carefully explained, so as to assist in validation of results by peer reviewers and other interested parties.

Comment: The preceding section is adequately described as written.

Section 2.c. Utility.

Comment: Recommend definitions for "timely," "relevant," and "useful." Data "utility" also requires that the data can be readily transferred and managed through appropriate data management processes. Note that "basic research" does not have an immediate "utility."

Section 2.d. Integrity.

In the scientific community, data is generally considered to be unbiased when it is based on the best available scientific, technical or other data which would produce the most accurate, "real world" results.

As regards data security, examples of data security problems include breaches of computer files by unauthorized parties who then corrupt existing data, and unethical falsification of data by researchers. Thus, appropriate "fire-walls" and other protective measures are often required to assure data integrity as well as proper data maintenance.

Comment: Recommend the inclusion of the above-noted terms. Note that all data contain bias, whether that bias is known or unknown; data are generally recognized as "unbiased" when steps have been instituted to control, minimize, or account for bias. Possible bias should be identified and disclosed.
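
As one example of the protective measures for data security mentioned above, the following Python sketch (the file name is hypothetical) records a cryptographic checksum when a data set is archived and verifies it before the data are used, so that file corruption or unauthorized alteration can be detected.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

data_file = Path("study_results.csv")   # hypothetical archived data set
recorded_digest = sha256_of(data_file)  # store this value with the archive record

# Later, before analysis or dissemination, confirm the file is unchanged.
if sha256_of(data_file) != recorded_digest:
    raise RuntimeError(f"{data_file} fails its integrity check; the data may have been altered")
```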

SECTION III: SUMMARY OF COMMENTS AND RECOMMENDATIONS

The above comments and recommendations on the proposed regulation highlight the requirement to use accredited, recognized, and validated methods when available, including appropriate peer review, to form the basis for data quality standards and appropriate research practices.

--------------------------------------------

[1] e.g., Food and Drug Administration GLPs (21 CFR Part 58)
FDA Preamble to the GLPs (Sept. 4, 1987)
FDA Preamble to the GLPs (Sept. 4, 1987)
EPA GLPs (40 CFR Part 160)
Preamble to EPA's 1989 FIFRA GLPs with index to sections
Good Automated Laboratory Practices, EPA 2185; Aug. 10, 1995
OECD's Principles of Good Laboratory Practice, Monographs 1, 11, and 49. IOMC, 1992-98

[2] International Organization for Standardization homepage