The Food and Drug Administration has gotten faster at approving new prescription drugs over the past four decades, but the evidence it relies on in making those decisions is getting weaker, according to new research published Tuesday.
As a result, there are more cures and treatments on the market but less proof that they are safe and effective.
"There has been a gradual erosion of the evidence that's required for FDA approval," says lead study author Jonathan Darrow, a lawyer with Harvard Medical School's Program on Regulation, Therapeutics and Law. As a result, patients and physicians "should not expect that new drugs will be dramatically better than older ones."
Almost half of recent new drug approvals were based on only one pivotal clinical trial instead of the two or more that used to be the norm, according to the study published Tuesday in JAMA, the journal of the American Medical Association. And the reliance on surrogate measures — stand-ins for presumed patient benefits — has increased. In the case of cancer drugs, a surrogate measure could be shrinkage of tumors instead of improvements in survival after treatment.
Darrow and his colleagues analyzed FDA drug approvals, changes in laws and regulations, and the growth of industry funding of the agency's reviews from 1983 through 2018.
The researchers found that the average number of new drug approvals per year increased from 34 in the 1990s to 41 in the 2010s, after a dip to 25 a year in the 2000s.
New drugs are getting through the FDA process faster. In 2018, the median review time for standard drug applications was 10.1 months, compared with 2.8 years for standard and priority applications during the period from 1986 to 1992.
Faster approval was the goal of many legislative and regulatory changes starting in the 1980s. For instance, in 1983, Congress passed the Orphan Drug Act to encourage drugmakers to develop rare disease treatments that otherwise might not be profitable. The FDA allowed approval of these drugs based on smaller trials with more flexible evidence standards, and it offered more time on the market for the medicines without generic competition.
Drugmakers also began paying the FDA fees to fund the review process after AIDS activists protested the agency's sluggishness in the 1980s.
The FDA collected $29 million in fees in 1993, the year after Congress passed the Prescription Drug User Fee Act for the first time. The fees rose to $908 million in 2018. That year, industry fees amounted to about 80% of the money spent on FDA employee salaries for drug reviews, according to the study.
"There is some concern about the incentives that this is created within the FDA," Darrow says. "And whether it has created a culture in the FDA where the primary client is no longer viewed as the patient, but as the industry."
Former FDA Principal Deputy Commissioner Dr. Joshua Sharfstein says the FDA has succeeded in approving more drugs, but he also says that more changes are needed to make sure the medicines are worthwhile for patients.
"Some of them are really great, and some of them [are] not so great," says Sharfstein, now a professor at the Johns Hopkins Bloomberg School of Public Health. "And a lot of them are very expensive."
In an accompanying editorial in JAMA, Sharfstein offers several suggestions for change, including a reevaluation of the FDA's expedited approval programs to see which ones are working and which ones are mainly driving up health care costs.
"We've kind of reached a point where it makes sense to pause and see whether we can do things better," Sharfstein says. "And I think we can."