The consequences of Novartis’s regulatory data manipulation go beyond a single product approval

Source: © Patrick Straub/EPA-EFE/Shutterstock

Novartis chief executive Vas Narasimhan has faced some tough questions about the company’s handling of the incident

Regulators rely on data. Before they’ll even consider allowing a new drug, crop protection agent or other chemical product onto the market, they need to see a truckload of it (quite literally – before electronic submissions, the boxes full of paperwork supporting a new drug application could easily fill a small lorry). But what if it turns out that the data is compromised?

This is the problem facing Novartis’s newly approved gene therapy, Zolgensma (onasemnogene abeparvovec). Having discovered in mid-March that some of the preclinical data from mouse experiments had been manipulated, the company chose to launch an internal investigation rather than immediately tell the regulators – only notifying the US Food and Drug Administration in late June, after the agency had granted approval for Zolgensma.

Now, Novartis and the FDA agree that the manipulated data do not compromise the safety or effectiveness of the treatment, and its US approval still stands. But that decision to delay disclosure has strained relations between the two. The FDA has suggested it could bring civil or criminal charges against Novartis, and the company’s leaders have been harangued by members of the US Congress accusing them of obfuscation.

The European Medicines Agency has not yet made its final decision on Zolgensma, and has asked Novartis to provide more analysis on the manipulation before it does so. While the timing of the revelation was less critical in the European approval process, it’s clear that regulators on both sides of the Atlantic take data manipulation extremely seriously.

Regulators can check data for signs of potential manipulation, but doing so is difficult, time-consuming and carries no guarantee of success. In reality, regulators rely to a significant degree on companies providing a true and accurate representation of the testing that they have done.

Naturally, there is an inherent tension between companies’ desire to present their data in the best possible light, and the regulators’ duty to look beneath the surface polish and make an objective decision. Data integrity is sacrosanct in those dealings, though, so anything that casts doubt over the veracity of a company’s submissions is a fundamental problem.

How, then, can a company redeem itself from such transgressions in the eyes of regulators, lawmakers and public observers? Identifying and firing those responsible is the kind of response that appeals to politicians looking for positive action, but is unlikely to address any underlying cultural problem that allowed the manipulation to occur in the first place. Open, frank and transparent dialogue is an important first step towards rebuilding trust, as is demonstrating that lessons have been learned and policies put in place to prevent repeat offending.