THE NATIONAL ACADEMIES

Science, Technology and Law Program

Ensuring the Quality of Information
Disseminated by the Federal Government

Workshop #3

Agency-Specific Guidelines

May 30, 2002

The National Academies
Main Auditorium
Washington, D.C.

MS. PAUL:

Good morning. I am Ellen Paul. I am a public policy representative for the American Institute of Biological Sciences, and I am very glad that this workshop is being held. I sat in this audience probably a year ago and listened to Jim Tozzi discussing the Shelby Amendment. Toward the end of his talk, he mentioned the "daughter of Shelby." My ears perked up; at the end of a long discussion, that one woke me up, and I thought, "Well, what is that?" I ran home, I Googled it, I found out what it was, and I have been concerned ever since.

 

Nothing that I have seen develop out of OMB, or out of the two agencies that I am going to cover today, has assuaged my concerns at all. I would like to make a couple of disclaimers. First of all, these are my views, not those of the American Institute of Biological Sciences, which, like FASEB, has not discussed this in any great detail at this point. We did, however, file substantial comments on the OMB guidelines, and what I have to say today is consistent with those comments.

 

Secondly, I do not brook challenges. If you don't like what I say, remember that I am not the government, so you cannot file a challenge. But what I say is of the highest quality, so you need not worry.

 

The U.S. Department of Agriculture has both intramural and extramural research programs. They have, of course, the National Research Initiative and a number of different extramural funding programs through CSREES, the Cooperative State Research, Education, and Extension Service. They also have quite a bit of intramural research, primarily in the Forest Service and in the Agricultural Research Service.

They do, in fact, distinguish between the two types in their guidelines by excluding research that is published by cooperators, grantees, and awardees, so long as that information is published in a manner consistent with the way others would publish that kind of information, i.e., in the peer-reviewed literature. They don't state that explicitly, but that is apparently the intent.

 

There is no requirement of a disclaimer, unlike in some of the other agency guidelines. USDA, in my view, really put some thought into this. I am not going to suggest that you look at this chart for the detail, but rather for the extent of the thought that went into it. As you can see here, and as you will see in the subsequent overheads, they actually broke down the kinds of information and then addressed each of the four standards in the context of each kind of information.

 


A great deal of what they did is a restatement of generally sound research principles: use the appropriate statistical analysis; make sure your data are clean; design your study properly. While that might seem obvious, it is probably worth restating. It is not a bad thing to remind folks that these are the standards to which the agency adheres. So, you will see that those are the kinds of things they have talked about: clearly identify your objectives; clearly identify how you decided that this is the appropriate sample, for example.

 

The reproducibility issue is a little bit murkier, in the sense that the guidelines don't address -- and I am really not sure how they could, so I don't mean this as a criticism -- how you are going to know in advance whether a given piece of information is going to end up being highly influential. In some cases, you can probably make that assessment based on past uses of that kind of information.

But I think at least half the time researchers will not have the ability to know whether their information falls into that category. So, I would suggest that a researcher would be well advised to follow the kinds of processes they are requiring for reproducibility whether or not he or she has reason to know that the work is, in fact, going to end up in some kind of NEPA statement or regulatory document, or otherwise be highly influential.

 


Utility is a bit of a problem. As always happens when someone is writing a document, language sometimes creeps in whose implications folks have not thought through. So, for example, one of the utility standards says to "consult with the users as to whether the information will be useful in advance of doing the work."

 

Well, one can easily envision a situation where -- because the population of users is not monolithic -- some set of potential users would, in fact, not want the study done and would not want those data available. They wouldn't want someone to spend the money to generate data that might ultimately result in a regulatory decision that is antithetical to their interests.

 

So, there is no way to resolve that problem. I don't think USDA, in putting that language in there, perceived that this could occur. But I think it is possible, and there is nothing in the USDA guidelines to suggest how you would resolve a conflict among users, where one group does not want the research done and the other group says, "No, this is important to us."

 

It also doesn't take into account USDA's own internal needs. One other thing about the USDA guidelines is that in some cases they address all four standards as a group, or three of the four as a group. It isn't clear to me that that wasn't simply a function of not formatting the document properly to say: this is the reproducibility standard; this is the integrity standard; this is the utility standard.

 


Again, I am just trying to give you an idea of the different kinds of research and information that they have used as categories for this particular analysis. One category of note here is regulatory information. You will note that they include risk assessments, which is where you would expect a discussion of the Safe Drinking Water Act standards in particular.

 

In fact, there really is no discussion of it, and that is why the asterisk is there. That is my asterisk, not theirs; this is my chart, not theirs -- it is a summary that I prepared. There is no real discussion of risk assessment beyond the four categories of standards that are here.

 

So, the extent to which the use of models, for example, or of risk assessment will be affected by these guidelines is unclear. While it is not my intent to summarize all of the guidelines for you, I did want to touch briefly on the procedures to request a correction. It is interesting that in this particular case, they put the burden of proof on the complainant. I don't know if that was something OMB envisioned happening; I know it was raised in a number of the comments filed with OMB, including ours.

 


OMB hasn't addressed it. I think it is an appropriate thing to do, because these guidelines have the potential to become very burdensome. Secondly, the procedural requirements themselves are not legally binding. So, someone could miss deadlines or not file in the appropriate manner -- not that the requirements are difficult to meet -- and still be able to file a challenge. A challenge could come at any time and in any manner, and there is really no penalty for not meeting the procedural requirements.

 

Then, finally, with regard to USDA -- and really with regard to all the agency standards that I have looked at -- what I think is problematic is that there is no anticipation that complaints will be filed seriatim: that a given individual or group of individuals will repeatedly challenge -- wait the 45 days, get the response, file for reconsideration, get the reconsideration, and then have another individual file a substantially similar challenge, or modify the complaint slightly -- so that this kind of thing can potentially go on for months and years.

 

I think the agencies need to anticipate that kind of thing happening and they haven't done that here.

 

Now, by contrast, the Department of the Interior has very, very little extramural research. Its primary research agency is, of course, the U.S. Geological Survey, and there is little extramural research funding out of that agency. It is primarily intramural research.

 


Of course, the other agencies, the mission agencies -- which Interior calls bureaus; you will see the word "bureaus" here frequently, and it is the equivalent of agency -- also publish a great deal of information, and they don't really have the capacity for the kind of review that is contemplated by these guidelines. I think that is going to be problematic.

  

These guidelines were only published last Friday. They are not at all confusing; I summarized them easily in two pages. They essentially do two things: they say, we are going to do what OMB said, and we are going to have our bureaus implement it. So, presumably, we will see some kind of guidelines coming out from the Fish and Wildlife Service, the Minerals Management Service, the U.S. Geological Survey, the Bureau of Indian Affairs -- that will be an interesting one -- and so on. You should see a multitude of implementing procedures and guidelines coming from the Department of the Interior if this is, in fact, followed through to its conclusion.

 

They have not really addressed the issue of different kinds of research. They haven't made any exclusion for extramural research or for funded, contractor, or grantee research. Even though there isn't much of it, they really should have.

 


I wanted to point out that they don't address the four standards individually, except, again, to literally incorporate the OMB definitions.  They spent a fair amount of time on procedures, but, again, I don't think they spent much time anticipating the kinds of problems that will come up with these challenges.

 

Neither agency addressed what I consider to be a real issue, and that is the right of the researcher -- the publisher of these data -- to respond. There is nothing in here addressing that. To my mind, the biggest problem with these guidelines at any agency, and especially at the research agencies, is not so much the data quality assurance procedures, because most research agencies have them and use them, and they are quite rigorous; at least at the agencies I have worked with, I can say that is the case. The real problem is the challenge procedures, and I have to wonder how an investigator coming out of graduate school will feel about going to work for an organization knowing that his or her data and research can be challenged at any time, without limit, literally for an entire career.

 

It has got to be a bit of a discouragement for researchers to go to work for a federal agency. Furthermore, it is going to take their time, even if they don't mind the idea of a challenge coming from someone who can literally walk in off the street, who brings no scientific information, and whose motivation really isn't to challenge the science but, instead, to slow down the process. Even if they don't mind that and think it is all right, it is still going to take their time. And this is all going to cost money. There have been no appropriations for the implementation of these systems, and I think that is going to erode research, because the research agencies will have to allocate funding to do this.

 

But as you can see, the Department of the Interior has spent a great deal of time -- probably one-fifth of its effort -- on the correction procedures. I suspect the Department will have to put a little more time and energy into this, considering that the kind of information it generates is so often the subject of intense debate because it involves natural resource management.

 

Finally, the last comment I would like to make is that the contrast between these two agencies is interesting to me: how much thought and effort went into one set of guidelines, how much detail one agency has, and how little the other has. I suspect that over time we will see these standards change and morph as the costs become a burden and as the agencies become adept at handling these kinds of challenges. So, I don't expect that this is going to be the last version.

 

Thank you.