Dr. Wayne Jonas Oral History 1996
Dr. Wayne B. Jonas
Director of the Office of Alternative Medicine
October 8, 1996
Building 31, National Institutes of Health
The interviewer is Dr. James Harvey Young
Young: One of the recent publications indicated that under consideration was the changing of the name of the office. Has that--
Jonas: It's not under consideration; it actually, I think, is on paper and has actually occurred, although we haven't publicly announced it. We don't plan on making it a big deal, but we are going to change our letterhead and it has been approved by the Office of the Director, and the Council approved it, and so I think we have to actually incorporate it into our Charter before it becomes official.
Young: So it would be?
Jonas: It would be the Office of Complementary and Alternative Medicine.
Young: Explain the value of the change to this.
Jonas: First of all, the term "complementary" medicine is more accurate in describing how people use this. Very few people actually use many of these practices by themselves as alternatives, although there are some. It's the vast minority that do that. Most people use them as complementary systems, that is as practices that they use alongside of conventional, established medicine that they get inside their hospital or their average doctors' offices, and then they supplement that care with this kind of care.
Young: So, accuracy of public comprehension lay behind the desire to get the name changed?
Jonas: Yes. This is the main thing, and for some consistency with other countries. For example, in England and in Europe, where they've, I think, been more seriously studying these things for longer than we have, they use the term complementary medicine, and so my feeling is that we ought to at least--and there are some good reasons that they do that too--so that is the second reason why I felt it was useful.
Young: Speaking of Europe, have the efforts that you've made to get international standards in the field made progress?
Jonas: Well, we just started that. I mean, again, I've come in about a year ago and basically reorganized the office to try to approach a variety of--this area--in a systematic way, and one of the things is developing international standards, and we're just starting on that and we have a very small office, very few people, and so we chip away at it a little bit at a time.
I came back from Germany about a week ago and while there had some conversations with the BMBF, which is the Government health funding organization, and they have a section that funds complementary medicine also. And I met with them and I met with a number of their contractors and a number of their hospitals that they are, in fact, funding to organize this area, and we specifically talked about issues of standards in these areas. We have interacted before on an informal level with a number of those groups for quite a while and I think we're going to, hopefully, have a more official kind of interaction to develop a cooperative effort in the area of developing standards. One of the obvious areas is in database development; just how do you classify the literature, how do you quality-evaluate the literature. And this is one reason why I've pushed our involvement in the Cochrane Collaboration, and we have supported the development of a field group in the Cochrane Collaboration. One of our centers is the Coordinating Center for that field group in the Cochrane Collaboration, and I've worked on it with them for years, and they are, in fact, working a lot on the areas of standards and getting international standards for the evaluation of quality, at least in controlled clinical trials, and so my feeling is that this is where we ought to also have an integral part to make sure that those same standards are applied here. So, there is some progress along those areas.
Young: Well, let's step back then to your beginning. The literature you sent did give more detail about your biography. I was interested in what caused your curiosity to be stimulated when that was really quite rare, I would think, among those training to become a primary care physician, and why was it that you took this on eagerly and began then--maybe here, but particularly in Germany.
Jonas: I don't know. I mean, my rationalization is that I was looking for solutions to help patient problems that I didn't have tools necessarily to help, nor did anybody else that I perceived, and I guess that's my rationalization for taking it on. I got exposed to it in a number of different ways but I think most poignantly in Germany, when I saw that a number of the physicians that I perceived as being just like me, Western-trained conventional physicians, in fact, were incorporating many of the practices that were either not talked about or given a negative spin in this country, and they would incorporate them into their practice and say, "Oh, well, I use this for that, and this homeopathic remedy works well for sciatica, and this herb works well for dementia and tinnitus, and this type of thing." I became intrigued by that because it appeared that they had a broader range of tools to use to help their patients and, as a family practitioner that deals with a variety of types of patients from the very young to the very old, the breadth of valuable tools, of useful tools, of effective tools, is extremely important.
Young: So, in a kind of an observational pragmatic way you came to feel that these tools that were neglected in the United States were proving out? At least you wanted to see if that were true?
Jonas: I wanted to explore it. I mean, the individuals that were using these were clearly not irrational quacks or charlatans or frauds. I mean, they were not making big money off this. They seemed to have the same motivation that I did in that they were trying to help the patients that came to them every day. And so I felt that it was irresponsible for me not to explore it, and so I did.
Young: Okay. Then, moving to your elevation to the directorship of the office that had been established brings up this matter of testing to evaluate in scientific ways that this perception might, indeed, be scientifically so. And I asked the question, and it still is of interest to me, how are the ways to do that scientifically, so that you feel assured at the end, how are these in comparison with, or different from, the ways that, say, the Food and Drug Administration employs when it is going to approve a new medicine as safe and effective?
Jonas: In other words, are the methods that are used in this area any different than what are used in conventional medicine?
Young: Yes, sir.
Jonas: And there is a very clear answer to that. The answer is no. The methodology that has been developed still continues to evolve and, in fact, especially in the clinical area and in other areas in science, adequate methods are always evolving for the questions being asked, but the methodologies themselves are no different in these areas than outside of them. In fact, we had a conference in April that I chaired before I came on--April two years ago, before I came on in this position--specifically to look at the methodology issues. And one of the questions that was given to the 8 panels that we collected, all of which were individuals involved in research, many of whom knew nothing about these areas but were conventional researchers that were interested in these areas--the question that was asked of all of them, in different types of research, because they were asked to look at different kinds of research, was this very question: what kind of guidelines, what kind of standards, need to be used in these areas, and should they be different, and how can they be developed. And the answer from every single one of those was that the methods that we have in conventional science for exploring the variety of kinds of information that we need to know in medical science are the same kind of methods that we need to apply to these. There is no question about that. And we're going to publish the proceedings of that conference in the spring, hopefully, and this will, in more detail, make that very plain and actually give people examples and some--I won't say "guidelines"--but give some directions as to how they can apply conventional current methodology in the exploration of this area.
I've asked this to many people in the alternative medicine community who I thought would not accept conventional scientific methods--and there are a lot of people that don't accept conventional scientific methods, and for the most part these are people who don't accept science period and they're not interested in the exploration of, in a rigorous way and an objective way, their own activities--and that group, that's an assumption that is a given before you even come into this place, okay, and so that group obviously is not going to participate with us.
There are, however, a number of individuals who I thought at the beginning would be in that group, okay, who would say, "No, I do spiritual healing, for example, and there is no way you can evaluate this using scientific methods and there is no need to evaluate it using scientific methods." And there are a number of individuals in those and other areas that then, once they understand what we're trying to do in science--and sometimes that takes a little while to communicate to them because usually they're not trained in this area and they don't really know what we're trying to get at in science--that are very interested in it, and there are many individuals who say, "Yes, I would love to have this evaluated in a controlled manner, in a blinded manner, with independent evaluations, with quality assurance aspects built in, with statistical evaluation built in," and those are the groups that we work with.
Young: Before you came aboard, one of the obvious real tensions lay in this area between Dr. Jacobs and what he called "Harkin's Cronies" on the initial committee, and one of the primary cases in which this dispute was most heated was in the area of the bee pollen capsules where Senator Harkin, himself, had become persuaded there was value here, yet Jacobs sensed that there was some trouble about doing field trials and getting a scientific answer, at least the way that they were designed. When you came on board, you closed out what was at least underway, but not moving very far, and started over on field trials. Where do they stand now?
Jonas: I changed completely the whole concept of the field trials. The way they were trying to do it was not viable; it was not going to give you the kind of information they wanted out of it. Okay? So it wouldn't have even worked for what they wanted. I don't think they understood that. I hope they… I think most of them understand that now. I won't say all of them do, but certainly most of them do. And so basically we changed the program in order to be able to get out of that kind of assessment the kind of information you can get out of that kind of assessment and no more.
They were interested in individuals actually going and seeing what was going on. Okay? So, in other words, this is where the whole idea of field investigations came from. Well, this is done all the time. I mean, epidemiological research involves going out into the field and assessing what's going on. In these particular areas what you can get out of that is, you can get a description--get somebody who observes it, looks at the information, writes down what the practice is all about--you can get some information about the quality of the data. Are they collecting data, number one? If they are collecting data, what are they collecting? Is it something that looks like it would be useful for assessment? This type of thing.
Jonas: Yes. Or assessment of even what is there in terms of data that might be then put into some kind of an evaluation. And you can get a beginning feel for: is this a group that would be cooperative with a research endeavor? Is this a group that understands what research is about? If you communicate to them how you would go about the next step in evaluating their program, is this something they would be willing to buy into, for example, and that type of thing. And you can, in most cases, understand what kind of evaluation would be the next step--okay?--to get some idea of what's going on in that area. And then you can say, "Well, it looks like this is a practice where you could do a best case series, for example." In many of them that's not possible; the data isn't even there and you couldn't do a best case series. In some cases there is lots of data already and the nature of the condition that they're looking at is such that a best case series would not give you very useful information. An example is breast cancer. One of the groups that we looked at was using special immunological types of therapy for breast cancer patients. Well, most of the patients they were dealing with were not end-stage type of metastatic breast cancer patients, they were Stage III and IV who, just by the natural course of the disease, are going to have quite a variable response in terms of how long they live, what happens to their tumors, this type of thing. A best case series in that circumstance is not very valuable because you've selected out ones that are probably just observing the natural course of the disease. That kind of situation requires a controlled trial so you can make that kind of disposition. You may not want to do a controlled trial, for other reasons, or you may not be able to, or whatever, but at least you can then make that kind of a disposition.
So this really is the purpose of what we now call practice assessments, which I think more accurately describes what we're actually doing.
Young: Is the Brown bee pollen capsule still being worked on?
Jonas: No. It's not.
Young: It's fallen by the wayside?
Jonas: The second thing that is a little bit perhaps related to your question, I mean I don't know if Brown and the folks down at the University of Texas who originally were working on this--Kline, Dr. Kline [Allen] down at U. Texas--I don't know if they're continuing to try to develop a protocol in there. They may be. And if they would like to and submit for a grant application, that's certainly up to them. Anybody can do that. We're not involved in working with that group directly anymore, and that's for a variety of reasons, part of which is it was enrolled in this informal program that was not getting the kind of information that it thought it was going to get, as I just described. Now, the other reason is because they couldn't agree, the bee pollen company couldn't agree, with the investigator in terms of methodology, so they couldn't even come up with a good protocol. If they did come up with a protocol, we have established methods for getting that reviewed and evaluated by an outside group of scientists to make sure that the science is good and they would, in any case, have to apply using that process and get it properly reviewed.
Young: Now, those are the standards that the Institute of Medicine helped you develop with regard to prioritization?
Jonas: That's a little different. Let me just--
Young: Well, other things that were listed in that field trials category early on were Burzynski [Stanislaw W.] and, am I right that some kind of disputation between the National Cancer Institute people and Dr. Burzynski over picking patients has caused that to halt?
Jonas: Yes. Well, that was the argument, right, over the modifications of the selection criteria for patients that would be entered into that prospective trial that the NCI was running, and there was disagreement over that and they could not come to an agreement over that, even after they had already set up and begun the trial. They didn't have any provisions built in for what they would do if, in fact, they were unable to get adequate recruitment. When that came up, they had no--I think--effective way of handling that kind of a contingency in the protocol, and those kinds of things, of course, need to be built in and often are built in, and so that hadn't been done and that was part of the reason.
Young: And Dr. Ravichy [spelled phonetically], or whatever, he also was listed in the field trial, in the initial phases.
Jonas: Let me give you an example, just to try to illustrate how we're handling these areas now with this practice assessment approach. Some of my staff had gone up and they had made previous visits with Dr. Ravichy [sp]. One of my staff, just recently, using our new approach went up and assessed the practice and discussed these items that I just mentioned in terms of what is the quality of the data, what is the description of the practice itself, what is their willingness and interest and ability to cooperate with a research approach, and they had collected a lot of information Ravichy [sp] had and put it into a format for best cases. And I have an oncologist who works with me part-time here from the NCI, and he went up and looked at that information and found that, while there was some interesting information, most of it was inadequate, even to do a best case series. Okay? In other words, the charts weren't complete, there was insufficient documentation of the original diagnosis, they did not capture information about concomitant therapies that were used, this type of thing. However, he found, I think, in the neighborhood of 15 or 20 cases in which he said, "If there was this information, this information and that information, if you could get that information, then you would have a situation where you could probably do a best case series assessment." Then we could set up a best case series in cooperation with the NCI to bring the team together that they do for that and look at those cases. So, we've provided him with that information and it's up to them to decide whether they can actually get it or not. If they can, then we'll look at that, and if it looks like it's useful, then we'll do a best case series. So that's an example.
Young: Cancer seems to be, by all odds, the primary interest of the thousand or so inquiries that you get every month. Let's say that these people ask about Burzynski or ask about Ravichy [spelled phonetically].
Young: He treated-- It's not Weewall but--
Jonas: Frank Wiewel. Right.
Young: Is Wiewel how he pronounced it? Do you know?
Jonas: Wiewel. Frank Wiewel.
Young: Frank Wiewel. Let's say that they ask about these. What is the answer from your database that you would give the inquirer?
Jonas: I think Anita Haas can give examples of the patient information packets that we send out of the Office and on specific aspects, and we're still just developing those. We have one on cancer, for example, and she can give that to you to see that.
Young: But you might be cautious about giving answers if a specific therapy or a purported therapy was asked about?
Jonas: Absolutely. First of all, we are not a medical office. We do not do medical practice, nor do we give medical advice. Okay? This is standard. We're a research organization. We collect information about research and we support and help coordinate the conduct of good research. It's irresponsible to give medical advice without a complete evaluation and without being primarily responsible for the care of an individual condition and so, under no circumstances, do we give out specific advice or recommendations to go see particular practitioners and this type of thing.
Young: But do you give recommendations not to go see somebody on the grounds that the evidence shows there isn't sufficient evidence yet to warrant this being tried?
Jonas: We don't give advice for people on their medical conditions, period.
Young: I see. So you wouldn't say, "Burzynski…"
Jonas: What we'll say is that there isn't any research information on whether these particular therapies are effective that we have available and you have to work with your physician or your oncologist in order to explore that information and to see whatever is appropriate for you.
Young: You won't be as blunt as the American Cancer Society?
Jonas: Well, I think that there are mechanisms that are available for making recommendations and they're built into the NIH, and I think they should be used. I can't tell the American Cancer Society to use them--they're not part of the NIH--but there are mechanisms built into the NIH that deal with information in a certain area adequate to make recommendations or not make recommendations. An example of that is the consensus conference procedures. We recently had one of those--
Young: I read the one about--
Jonas: Relaxation, about pain?
Young: About pain and insomnia and the different behavioral techniques. I had seen those before in other areas and I thought that was an impressive document. Now, you would cite that, the conclusions from that, if somebody raised a question?
Young: And are others of those on the drawing board?
Jonas: Yes. We have one currently in the planning stages on acupuncture and we're looking at other areas to see if there is information in other areas. There may be in botanicals, for example, in some of the herbal therapies, but we haven't decided on it yet.
Young: And you've had some kinds of meetings on botanicals with the FDA joint sponsorship? Is that right?
Young: Do you have documents on those meetings like the text of the consensus conference on pain?
Jonas: Yes. Yes, we do. The botanical one is-- Well, both the acupuncture and the botanicals one were not consensus conferences, but they were joint conferences with the FDA.
Young: And they did cause FDA to change its definition of the needles?
Jonas: Yes. That's right. That was very interesting.
Young: That was interesting, wasn't it?
Jonas: Yes, it was. Yes.
Young: I ask a question in here, because it does worry me, and the quotations from time to time have seemed to vary: when there is not sufficient evidence and when it seems apparent that this is a non-viable thing being promoted either crookedly or by some self-deceived individual or group, then after all your labors trying to find something valuable, when you find something palpably not valuable and even maybe suspect the motives of the promoters, shouldn't it be a task of yours to give that information, too, to the public?
Jonas: I think the information that we give out should be what comes out of the research, regardless of whether it looks positive and favorable or looks negative for that, and that really should be our role. If you get into the next step of saying, "Well, you should promote something," or "You should actively demote something or discourage something," then that gets a little more complicated than simply providing the information. I think it's very clear that our role here is to provide the information. The validation of that information is part of the peer review process. It's a much larger process. And we engage with those groups, such as the Office of Medical Applications of Research in the consensus conferences and with other organizations that do those types of things, in order to facilitate the validation process of whether this is a bad thing, or validation of whether this is a good thing. However, it's not our role to be the judge and jury on that. However, we can facilitate that process. This is something that I've argued with my Council since day one that I came on: certain members of my Council feel like the Office's role should be to validate and, unfortunately, it's written into our mission statement--and I say "unfortunately" because it is clearly not our role. Our role is to facilitate the validation, whether it's positive or negative. However, our role is not to be the end-all body of that.
Young: So that others might take the judgement, the scientific judgement that you render, which might be negative, and bring it into a kind of crusade against cancer quackery? That would be helpful data to have, but you don't see that as your mission?
Jonas: Absolutely. That's exactly right. I mean, now, we try, as best as possible, to give some indications as to where people need to look. For example, I wrote a paper that will be published by Ciba-Geigy in a monograph on safety in complementary medicine, and one of the things that-- I mean there are a lot of issues that go into safety--direct and indirect effects and neglect--and this type of thing. I mean, beside the toxicity issues, there are also other issues.
Young: Yes. I see.
Jonas: Even if you have a therapy that is highly unlikely to have any direct adverse effects--biofeedback, for example, or some kind of meditation practice, or probably homeopathy, although we don't know for sure because there hasn't been enough research on it--even if it looks like it would be, at least from the direct effects, safe, that it's not going to produce direct toxic effects, there is still the whole issue of indirect effects, of neglect effects. That is, if someone feels like, or believes, that a therapy is effective and continues that therapy and it's not effective, then that produces adverse effects, because they assume that they're getting treatment for something that they're not getting treatment for and, if there is an effective therapy for that particular condition, then their disease is progressing in a way that it does not need to progress, and that's even worse than the fact that it's just progressing, because it could have been prevented. So, those are indirect effects, and those are effects of application issues--okay?--and those occur no matter what the direct issues are.
Young: This professor at Harvard that almost thinks that all therapy is placebo therapy?
Jonas: Yes. Herbert Benson. Is that who you're talking about?
Young: I think so. Yes.
Jonas: Oh, yes. He believes belief can do almost anything, and that's why he promotes-- He was the inventor of the term "the relaxation response," because he thinks if you relax and learn how to believe it, then it probably will have an effect. And we know that expectation and belief do have major effects, and those are lumped into an area called placebo. And placebo is a lot of different things. It's not a very useful term actually, and we're going to have, in December actually, a conference on placebo.
Young: Oh, great.
Jonas: And specifically look at exploration of non-specific and placebo effects and hopefully delineate what are the elements of these non-specific placebo effects and how can they be properly investigated, how can they be researched, and are there models for doing that, because I think--
Young: Beyond hormones, or whatever they are?
Jonas: Yes. Exactly right. Because I think it's very clear that much of the effect in both conventional and unconventional medicine is due to non-specific effects. There is no question about that. And, you know, one of the goals of one arm of biomedical research is to try to isolate specific effects so that we can, in fact, know what's part of the process of the doctor-patient relationship and what's part of the specific aspects and the content, or the procedure, and this type of thing. Now, if it's a drug it's very easy to do. You can-- Well, in most cases you can blind it and we can look at the content. If it's a skill problem, if it's like a surgical problem, it's much more difficult to do. And even in surgery they have not solved this problem. They have not come up with a methodological approach for separating the expectation effects from the surgery from the actual technique. And so it depends on what you're looking at as to how well you can apply expectation control, which is probably only one aspect of the placebo.
Young: Some of the commentary, in effect, says it's early days and maybe more should have been done by now, but it's early days, and you've been setting up, you've been planning proper methodologies, beginning to educate people to wield them, but from the point of view of this scientific proof which you are seeking and believe to be fundamental, not much, so far, has been achieved. Can you answer that? Can you sum up, generalize, and maybe point to illustrations of things that have been achieved? As I say, you did a little bit of this in that address you gave before the science writers a year or so ago but now, with another year, what would be an answer there?
Jonas: The position and the role of the office--and people need to remember this--is a coordinating role, largely a coordinating role. We are not an institute, we're not a center; we don't directly fund any research. So we don't have the mechanisms, nor the position in the NIH, to actually say, "Do that research, we will fund it," like other institutes do. So, in terms of looking for that kind of results, we don't have the mechanism, nor is that our particular role.
Now, we do help coordinate that information. We do work to collect the information from what's going on around the NIH and other organizations and we do help to give some guidance as to where we think the direction of the research should go. This area is so diffuse and so wide that much of that involves defining what is valuable and what's not valuable; what is higher priority versus lower priority. And this is where the Institute of Medicine stuff comes in.
Young: Now, in effect, I would feel you are combating quackery if I knew what you looked at and discarded because your priority system tries to help those who are dealing with things that have the most promise.
Jonas: In terms of prioritizing it at the beginning is what you're interested in? Sure.
Young: And there are some that are discarded that you might, as a private individual, think that really ought not to be on the market at all?
Jonas: Right, exactly right. And this prioritization process, I think, gets really at the heart, perhaps, of your question of what is it that we, a priori, will put at lower priority or discard, and this is not an easy question and it's getting more and more complicated as we collect more and more research about these areas because, in the past, we used to say, "That whole field has got to be quackery just because of its rationale," and it made it easy to do that. It's becoming more and more difficult, and I'll give you an example of that, and I think a prime example is an area that I know quite a bit about and that's homeopathy.
Young: I look forward to reading your book.
Jonas: I mean, homeopathy, you know, 5-10 years ago, would be very easy to say, "This is an area that we shouldn't even consider. It's impossible for it to work. I mean, there is no possible way it could work. It's got to all be placebo effects and therefore people are getting duped into a system and getting charged, and there are all these indirect adverse effects that are risky."
Young: I'm about to ship off one article about Dr. William Coak [spelled phonetically], in which I more or less come to that conclusion. Quackery is also a matter of how a thing is practiced, as well as what the thing is that is used.
Jonas: Right. It's mostly how it's practiced but I agree.
Young: So something that had a little value might still be quackery because of the way, the exaggerated way, in which it was practiced. But his was essentially homeopathic. That's why I said that. And pardon me for interrupting.
Jonas: Well, that's an example, okay: here is something that, you know, 5-10 years ago, I think, most everybody would agree is something you can just right off the top say, "This isn't worth investigating," and many people still say that. However, as we look into the actual research that's been done in those areas--and again, where they've done this mostly is in Europe, because they've been more open about doing this in a serious manner in Europe for a longer period of time than we have in this country--they're finding some very surprising and unexpected results, which is that, in many cases, it does appear to have effects over placebo effects that we can't explain away using the standard methodologies, even rigorous ones, that we have. An example of this was the British Medical Journal article in 1991 in which they reviewed 109 placebo-controlled trials in homeopathy. Some people from the University of Amsterdam, epidemiologists, who were not homeopaths, looked at this and they said, "Gosh, there are an awful lot of positive trials here. We wouldn't expect that." And they did a very detailed quality assessment and even looked at only the very rigorously done ones and said, "There are still a lot of positive trials, so what's going on?" Okay? So this makes the stance of saying "Let's write the whole area off" a little bit more difficult to do. If we had written it off at the beginning we never would have gotten to that particular point, and so it makes it a little bit difficult in terms of applying this across the board. Now, there are other areas where I still think probably that's the case, but it makes it more difficult. So this is the exact reason why I'm going to the Institute of Medicine and saying, "We need to have some prioritization criteria."
We need them from the scientific community to say, at this particular time, what is a reasonable way to prioritize--to throw out things that shouldn't even be looked at, to emphasize things that should be looked at--so that we can place our research resources. Give us a guide for seeing where our research resources should go." And this can only be done by groups that know how to do that, and there is no better place than the Institute of Medicine, which has done this for many other fields, and this is exactly what I'm asking for. I think that without that guide, you know, we can do the best we can, but we'll all still be kind of groping a bit in the dark in this area and mucking through a Pandora's box, some of which used to be clearly bad and is now perhaps less clear.
Young: You've given a lot of grants and you set up the centers. Is there anything, major or minor, that you consider determinative research that has yet appeared, or are we still at too early a stage to say?
Jonas: All the grants that have been funded by this office, and also the ones that have been funded in cooperation with some of the institutes, have all been what I consider to be developmental. The initial pilot grants that were given out a couple of years ago were $30,000 grants--for $30,000 you're lucky if you can even think of a developmental idea--so certainly it's all developmental. Despite that, some of that research was very intriguing and very interesting.
Young: Have you got a list of publications that may have appeared from that research?
Jonas: We have a list of the completed grants and also a set of structured abstracts from those. One of the things that I'm very opinionated about is the importance of structured abstracts.
Young: So the word gets around faster?
Jonas: So people can understand it. You know, so much of what comes out is written in a way that people can't tell: What was the design? What was the number of people? I can't find them in there. Methodological issues are not illustrated. What were the outcomes that you decided to measure, and this type of stuff. And you can't find that in most research reports, and so I follow the structured abstract format that the Annals of Internal Medicine follows, and we have worked on putting those into that format. And a number of them are very interesting, both on the positive and the negative side, and I'll give you an example. This is both in basic science and clinical research. Now, remember, none of them are definitive in my opinion. They were not set up to be definitive. Even if they appear to be definitive, they're not, again because of the size and the preliminary nature. But let me give you an example of a basic science study that gave us some useful information.
There is a cancer therapy that's used very extensively in Europe, and somewhat in this country--increasingly so--that uses pancreatic enzymes as an integral part of its therapy, and they have this whole theoretical reason why they think it works, which is probably all gobbledy-gook but, in any case, they use this and they claim that it has some effects. And there is some old research, and there is some research in Germany, that looks like there might be something. So, one of the grants was to look at the use of pancreatic enzymes in an animal model to see if it had effects on cancer, because that's what they claim, with and without supplementation with a mineral that they often supplement with it, which is magnesium. And so one of the people who got one of these $30,000 grants did a series of studies with implanted sarcomas in mice and fed them different levels of pancreatic enzymes, with and without magnesium, and they showed some very interesting things: in fact, the high levels of enzyme seemed to accelerate the cancer growth, and the low levels, if there was magnesium added, seemed to inhibit the rate of metastasis. Now, all of them progressed, but it actually gave some indication that perhaps these high levels they were using might, in fact, be harmful. They might actually be accelerating the growth. And those that are reporting effects might be reporting effects because they're not complying with it--they're only taking a few. In other words, they get a small amount, and this then serves as perhaps some kind of an immunological stimulant. And they looked at a few of those parameters. It has nothing to do with the enzymes themselves; it just serves as a protein epitope stimulant for some nonspecific immunological process that has some minor effect on the cancer. So, this is the kind of very useful, interesting information that you can get from that kind of study.
Another trial looked at acupuncture in the treatment of depression. And this was a clinical trial, again with small numbers of patients--this kind of a grant can't do more than small numbers of patients--looking at real acupuncture, sham acupuncture--that is, acupuncture in sham points--and, I think, versus a wait list that then eventually got real or sham acupuncture, and it showed that the real acupuncture was effective in about 64 percent of the patients, that it was more effective than the sham, and it was more effective than the wait list. Now, the numbers are so small, because of the size of the grant, that it's not statistically evaluable. But they developed the methods for doing it. They got that down. So they learned, in fact, that you can do, in this particular situation, that kind of a trial with sham acupuncture and a wait list, provided it's not too long, and that type of thing.
Young: If you can determine what sham acupuncture is.
Jonas: Right. Exactly.
Young: That was one of the things in that acupuncture thing that impressed me, the difficulty of finding non-points.
Jonas: Right. Well, in most of the studies in acupuncture too, this is a side issue, the sham acupuncture is better than no therapy at all, so it works as a pretty good placebo, even by the Chinese classification. That would be a Chinese placebo, if you will.
Young: Well, I went to--it wasn't quite as formalized as the conferences that you're having--at the CDC to an acupuncture thing 20 or 30 years ago and I was persuaded by the weight of testimony of the pain value by what they said. I had a specific here. I did a paper on quackery with regard to AIDS and so I was particularly interested in this Bastyr Center [Seattle, WA] and its working with alternative things in respect to AIDS. Has there been a report from them yet?
Jonas: This is part of your other question about what's actually happening, what kind of products are being produced.
Young: Yes, it is.
Jonas: We had, actually just before I went over to Germany for our last meeting, a one year update on the Center's activities. It wasn't quite a year. They hadn't been funded quite a year because of the budget issues and all that kind of stuff. When the Center started, basically I decided that I thought it was a good idea to begin to build some infrastructure in these areas and, in Germany for example, one of the things that they found in trying to develop a program in unconventional medicine or complementary medicine is that they needed to develop groups that could effectively carry out research in these areas. That is, they needed to be able to develop collaborative relationships with those who delivered the practice with experienced competent researchers. There needed to be an educational process that went on so that they understood what each other was about and what they were doing. Then they needed to be able to develop specific protocols that were viable and would come up with some kind of answer. This is a developmental thing that they found they needed in Germany and it was clear that that kind of thing was needed here, and so my task to the centers was to begin this process--okay--to begin to sort out, and we gave them the Institute of Medicine criteria, we gave them some other guideline criteria, to use in the interim in order to help prioritize and sort those things out. And also it depends upon their research opportunities at the particular center and university that they have. They may have a top priority for a particular area they think is important, but they may not have an opportunity at their center, at their university, to actually do that. There may not be an expert in that area who is involved in that who is interested in doing that. They may not have the funds to do it, or this type of thing. So ultimately what they do depends largely on the opportunities for what we call research readiness, okay, something is actually ready to do.
So, the first year reports, they described the bibliographic information that they were collecting, because we require all of them to collect information and evaluate the literature and use that as an integral part of deciding where their research dollars should go.
Young: This is all ten of them?
Jonas: This is all ten of them. Right, a priority-setting process. And then, you know, to identify within their resource capacity, which is very small for a center, areas that they thought that were viable that they could actually begin to initiate research. So they gave us what they were doing, and they are in various stages of development, as you would expect with ten of them. Some of them are gangbusters, ready to go, and some of them have completed some research already and some of them are still kind of floundering around, and they'll all be recompeted and we'll see who comes out in the wash. So, you know, those kinds of things we're now collating as to what they're actually doing and we don't have that in a succinct format yet, but we'll have that available fairly soon.
Young: So, are they privileged? Is the one that deals with the AIDS and alternative medicine from Bastyr, is that one that I might be permitted to see?
Jonas: Well, we have the summaries. They gave us-- We asked them, for our Council meeting, to provide us with one or two page summaries, and we can provide that for you and, in fact, we could give you one of the Council books from our last Council meeting.
Young: Yes. A printed record.
Jonas: Right, which describes some of the management process that we're developing here for integrating our review and approval process with the institutes and with the IOM aspects, so you might be interested in that, although that is more just management issues; and it also has in it a summary from each of the centers, so you can see what they're looking at. Now, the Bastyr Center was one that was funded before I got here.
Young: Yes. One of the two. Yes.
Jonas: And they are rather unique among the centers in that most of the centers are at established institutions that have an infrastructure and a track record for doing research. Not all of the individuals necessarily do, but the organizations do, and some of them have a very good track record. And Bastyr has not really done research in the past. They're a naturopathic college that basically does therapies. And, you know, I consider that an experiment. Most of the other organizations have to reach down--I shouldn't say that--reach across to the practitioner side in order to develop collaborative relationships to conduct research. Bastyr is the other way around. They have the practitioners, if you will--and they have to reach over to the scientific community to make good connections.
Young: It's more like a full-time proprietary medical school, isn't it, 19th Century, late 19th Century, with the practitioners forming together?
Jonas: It's like an eclectic, I guess what 100 years ago was the eclectic schools. Yes. It's right out of that tradition.
Young: Yes. I didn't realize quite; I hadn't heard of it.
Jonas: And so they have a different type of bridging that they have to do and, you know, we'll see how they do in that. I mean, they're taking perhaps a bit of a risky approach, I think, in terms of collecting outcomes data because, you know, the value of what you'll be able to get out of that with a small amount of resources…I mean, if you have large datasets where you can do good medical epidemiology with that, then outcomes data can be perfectly useful for certain purposes, although there are some people that say you can never use that for any kind of falsifiable or causative exploration. You know, they're a bit risky because they do not have that kind of large dataset, and so the value of the dataset they will have is going to be…
Young: It resembles a little bit what the San Francisco AIDS organization itself did, with some liaison with FDA.
Jonas: Yes. Well, in fact, there is the Multicenter AIDS Cohort Study, the MACS study, which is going on right now, and I think is run out of Johns Hopkins, which is doing the same thing. They're collecting prospective information on a group of AIDS patients on a regular basis and this type of thing. The Bastyr Center is actually working with them in order to get another sample of patients in which they then query about their use of complementary and alternative medicine, compare it with the group that they're collecting data with, so that they have another set and they can increase their numbers and also get a little bit different angle on that plus, hopefully, they'll be able to use some of the methodological expertise that's built into the MACS study to help with theirs. So, you know, we'll see how it works. I think, you know, it's an experiment.
Young: Well, that seems like hitting the main points, and I appreciate your time and candor. And, well, I might ask about your relationship with FDA, finally. I'm going to have a visit with Freddy Hoffman, who is one of the primary liaison people between the organizations, whom I have talked with before when I was working on the AIDS project, and could you describe the primary elements of your liaison with them?
Jonas: Yes. You know, we have a good liaison with them, and Freddy is one of the integral connection points. We had someone who was detailed over here from the FDA part-time, who is now finished, specifically to try to develop some of the FDA connections and look at some of the FDA issues. Our role here in the office is to coordinate and facilitate the conduct of good research. So, our primary interest with the FDA revolves around issues of INDs.
Young: You help some people get the data organized in order to present.
Jonas: What we primarily do is that we primarily try to simplify some of the FDA regulations around INDs in a way that can communicate to these folks so that they can understand this and then they at least will know how to approach the FDA and ask the proper questions so that they can get information about do you need an IND and, if you do, what information is required for that, and this type of thing.
Young: Is this mostly in the herbal field?
Jonas: Well, the herbal field is one major one, and the FDA--and I think that Freddy will probably agree with this--does not have clear guidelines in many of these areas, because they really haven't been researched much in the past. A lot of people haven't been applying for INDs.
Young: And their own rules are tougher than they used to be, especially in the whole supplement field because of that law.
Jonas: Exactly right. And also, when it comes to drug types of therapies, they're used to dealing with pharmaceutical companies, and they have a very specific set of procedures, the goals of which are drug development and drug marketing. Well, for a lot of these things those are not particularly the goals, and so the question is: what are reasonable ways to assure safety and efficacy in areas that don't quite fit into that model? These have not been clarified, and we are working with them. We work with the working group that they have to look at these issues. Herbal is one area. What do you do if you have an herbal product that is characterized and standardized, but you don't know, in fact, what the active ingredient is? In other words, you can check for the safety and the content issues, but you don't know what the active ingredient is.
Young: Or it may be a multi-ingredient?
Jonas: Yes, and it may be a multi-ingredient type of thing, which raises the same issues, just a little bit more complicated to deal with. The same thing applies with a homeopathic product. You know, how do you characterize a product in which you can't measure any particular ingredient? Well, you have to fall back on some type of good manufacturing processes, similar to what they do with biologics, for example, where you can't do a milligram dose of a particular epitope of a vaccine. It's that type of thing. And so these are things that, you know, have never really been worked out in detail--what are the exact procedures, and what are reasonable procedures to do that? So this is largely where we work with the FDA, trying to develop some of these procedures so that researchers who are now becoming interested in these areas will know how to go through those types of procedures.
Now, our role is not to get into regulatory issues. That's not our role here and we try very hard not to do that, and I constantly am trying to make sure that certain groups that advise me understand that. Our role here is to forward the research activities and not the practice and the regulatory aspects which is, as you know, another thing that they have a responsibility for.
Young: Is the Advisory Council more harmonious than it seemed to be at the initial ad hoc advisory council?
Jonas: Oh, I think clearly it is. I think it's kind of like regression to the mean. It couldn't have gotten any worse.
Young: Yes. That's one of the things that impressed me.
Jonas: Anybody could have stepped in here and it had to get better, so it would have looked better.
Young: I felt sorry for Dr. Jacobs as I read.
Jonas: You know, I think this is evolving and I think that it's certainly getting better. As people rotate off, we are bringing onto the board individuals that are interested and involved in these areas but also are scientists that understand the procedures.
Young: So the scientific level of the total, one could say, has increased from the beginning?
Jonas: Oh, absolutely. Yes. No question about it. One of the things that I'm trying to do with the Council is to elevate that level so that we can discuss the specific research issues themselves. And this, I think, is working. Certainly, at the last meeting it was clear that we got into more of that type of discussion, and I think we'll continue to do that.
Young: And that will be reported in the next newsletter?
Jonas: Yes. We also, just along the FDA issues, Anita reminded me, we have asked Bob Temple, over at FDA, to serve as an ad hoc board member, in which he actually serves on the Council, and I think this will greatly facilitate discussions with the FDA and communication with the FDA.
Young: I knew him in the old days.
Jonas: Yes. I like him very much. He's, you know, a real advocate for clinical trials which, I think, in this area is an important thing--at least one element--that needs to be paid attention to.
Young: I guess I dealt with him when I was writing about laetrile.
Jonas: Yes. He has a lot of knowledge about these areas too.
Young: Oh sure. Well, I've taken a good deal of your time. You've been gracious with it, and I appreciate it, and I'll do the best I can. Of course, I've not been told I can give the paper yet, but I will find that out.
Jonas: You are welcome.
End of transcript