Looking at the science versus the hype, I’ve been predicting for at least a couple of months now that hydroxychloroquine will probably turn out to be ineffective against COVID-19, all the hype notwithstanding. (Hydroxychloroquine is a drug commonly used to treat malaria; it also has immunomodulatory properties that make it useful for treating autoimmune diseases such as rheumatoid arthritis.) I’ve often pointed out that the prior probability of hydroxychloroquine being effective is quite low based on what we’ve known, given that its antiviral activity in cell culture has never successfully translated to antiviral activity in humans and that the claims of “miracle cures” attributed to the drug in COVID-19 patients such as Rio Giardinieri, Jim Santilli, and Karen Whitsett just don’t stand up to scrutiny. In any case, my position has always been that the hype over hydroxychloroquine far outweighs any promise it might have and that its effectiveness is likely to be very low or zero (more likely zero). Nonetheless, a “brave maverick” French scientist named Didier Raoult, grifting doctors (including Dr. Mehmet Oz), and Donald Trump with his sycophants, toadies, and lackeys have hyped the drug relentlessly and, as observational study after observational study has been published failing to find even a hint of a signal of benefit, have attacked those studies.
So it was with a study published in The Lancet two weeks ago, a study I wrote about at the time and that has now been retracted. The study used data from a company called Surgisphere, and, after questions were raised about the study, problems with data transparency led to its retraction. Interestingly, as the Lancet paper was being retracted, the New England Journal of Medicine published the first randomized controlled clinical trial of hydroxychloroquine for prophylaxis against COVID-19, which turned out to be, unsurprisingly, entirely negative.
Surgisphere: A company with no history of huge projects
The Lancet study using Surgisphere data caused a big splash two weeks ago (which, as with all things during this pandemic, seems like ancient history now) because it was the largest observational study yet published and found not only no benefit from hydroxychloroquine and azithromycin (Didier Raoult’s favored combination) in COVID-19 patients but higher mortality and a lot more dangerous cardiac arrhythmias. At this point, I have to express a mea culpa. At the time, I thought the study was pretty solid. It might still be pretty solid, or it might be fraudulent. We can’t tell, for reasons I’ll get into in a moment. (An addendum with a link to this post will be added to my original post.) I wasn’t alone, either. A clinical trialist far more experienced and eminent than I, Dr. Eric Topol, also thought the Surgisphere study was well done, to the point where he suggested that doing randomized clinical trials of hydroxychloroquine for COVID-19 might now be unethical because there was no longer clinical equipoise (genuine uncertainty over whether the treatment is better than placebo).
As a result of concerns raised by this study, the World Health Organization suspended its hydroxychloroquine study to analyze safety data, although the suspension was brief and the study was resumed earlier this week. To say that the study caused quite a stir, with hydroxychloroquine believers attacking it nonstop, is an understatement. It also turns out that the Lancet paper wasn’t the only paper using Surgisphere data that was retracted. The New England Journal of Medicine also retracted a Surgisphere paper that had reported no correlation between the use of angiotensin-converting–enzyme (ACE) inhibitors or angiotensin-receptor blockers (ARBs) and increased risk of death.
So where did Dr. Topol, others, and I go wrong? Basically, we took the description of Surgisphere’s dataset, which is massive and reported observations on over 96,000 patients, at face value. As questions were raised about the dataset, Surgisphere was, shall we say, less than transparent about its data, which ultimately led the editor of The Lancet to issue an expression of concern (more on that below).
The main criticisms of the study are summarized in this article by Dr. James Todaro. Before I continue, I feel obligated to mention that, based on his Twitter feed and other writings, Dr. Todaro has a definite quacky vibe about him and is definitely a hydroxychloroquine believer. (He’s also a managing partner of a cryptocurrency company.) Indeed, this New York Times article notes that he has a connection to Didier Raoult, who allowed him to post one of Raoult’s original studies on Twitter two days before it went live on a preprint server. Todaro’s Twitter feed also mentions that he co-authored a pro-hydroxychloroquine piece entitled An Effective Treatment for Coronavirus.
So why am I citing him, given that he’s clearly very invested in proving that hydroxychloroquine is effective against COVID-19? Because on this one issue, his was the most detailed deconstruction of what’s wrong with Surgisphere and the Lancet study. That’s the difference between the cultists and me: I’ll change my mind if they present new information that checks out when I dig into it. It’s also a lesson that a believer’s skepticism when examining something he disagrees with will always be far more rigorous than his skepticism when examining a study that supports what he currently believes. Think of it as a somewhat embarrassing reminder to myself (coupled, perhaps, with a bit of self-flagellation) to remain humble in the future and not to be too quick to dismiss criticisms coming from even the cultists. If Dr. Todaro sees this, my message to him would be to urge him to apply the same level of skepticism to hydroxychloroquine that he applied to Surgisphere and The Lancet paper.
First, Dr. Todaro notes that Surgisphere is a dodgy organization:
Based on the Lancet study, it [Surgisphere] must be a very large, sophisticated network indeed to have partnered with hundreds of hospitals worldwide with the capability of retrieving detailed patient data in real-time.
One would expect a multinational database such as this to be a treasure trove coveted by researchers. Strangely, this is not so. Surgisphere has a razor thin folder of contributions to past publications. Besides the Lancet publication, Surgisphere’s only other peer-reviewed publication is one entitled Cardiovascular, Drug Therapy, and Mortality in Covid-19 that was published on May 1, 2020 in The New England Journal of Medicine.
The Research section of Surgisphere’s website features twenty-three “Case Studies from Around the World” as evidence of their prior work and product features. The vast majority of these “case studies” lack scientific substance and actually consist of short letters, press releases or potential use-cases for its database.
Dr. Todaro also noted that Surgisphere has only five employees, only one of whom, Dr. Sapan Desai, has a medical degree, while the remaining four are mainly business and marketing people, three of whom had joined the company only two months prior to publication of the Lancet paper. Also highly suspicious is the fact that Surgisphere blocked its website from the Wayback Machine at Archive.org, so that no one could check what the website looked like in the past. As Dr. Todaro points out:
There are primarily two ways for companies to hide internet histories. First, they can insert special codes into their websites to hide from the Wayback Machine’s automated crawlers. Secondly, companies can request the removal of their historical snapshots, but there’s no guarantee the Internet Archive will honor these requests. Both of these practices are highly unusual and almost exclusively used for obscuring nefarious activities.
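For context, the “special codes” Dr. Todaro mentions are typically robots.txt exclusion rules: the Wayback Machine’s crawler has historically identified itself as `ia_archiver` and generally honored such rules. A minimal sketch of how such an exclusion works, using Python’s standard-library robots.txt parser (the robots.txt content here is hypothetical, not Surgisphere’s actual file):

```python
# Sketch: how a robots.txt exclusion hides a site from the Internet Archive's
# crawler ("ia_archiver") while leaving it visible to ordinary crawlers.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content (illustrative only)
robots_txt = """
User-agent: ia_archiver
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The archive crawler is blocked from the entire site...
print(parser.can_fetch("ia_archiver", "https://example.com/about"))  # False
# ...while other crawlers are still permitted.
print(parser.can_fetch("Mozilla/5.0", "https://example.com/about"))  # True
```

This is why a site excluded this way simply stops accumulating snapshots, which is exactly the pattern Dr. Todaro describes.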
Another story states that Surgisphere has eleven employees. The discrepancy likely derives from Dr. Todaro’s reliance mainly on LinkedIn profiles to find Surgisphere employees. Whichever number is correct, though, Surgisphere is clearly a small company, way too small and lacking in the necessary expertise to have built a database like the one it describes. It also started out primarily as a medical textbook company and has a history of submitting fake reviews to Amazon.com:
Reviews of the company’s products on Amazon are polarized, and a handful of positive reviews that appeared to impersonate actual physicians were removed when those doctors complained to Amazon. Kimberli S. Cox, a breast surgical oncologist based in Arizona, tells The Scientist that she was one of several practicing physicians who in 2008 discovered five-star reviews next to names that were identical or very similar to their own, that they had not written. She and her colleagues successfully persuaded Amazon to take the reviews down.
In the same story, Dr. Desai promises that there will be an independent third party audit of Surgisphere’s data. (Spoiler: There hasn’t been and won’t be.)
Todaro also notes that there are a number of Surgisphere subsidiaries that appear to have little or no substance:
A deeper dive into Surgisphere reveals three subsidiary companies: Surgical Outcomes Collaborative, Vascular Outcomes and Quartz Clinical. On each of the homepages of these three websites, the Surgisphere copyright is publicly visible near the bottom of the page.
Surgical Outcomes Collaborative has almost no internet history and the page does not appear in the Internet Archive until 2019, in which it just redirects to the webpage for Vascular Outcomes.
A search of https://vascularoutcomes.com in the Internet Archive returns one snapshot from December 2019. The snapshot shows a webpage that is largely similar to that of Surgical Outcomes Collaborative and does not include any details about a team or published research.
Similarly, Quartz Clinical, another healthcare data analytics branch of Surgisphere, also appears to be devoid of published research and without a publicly visible team.
Each of the company webpages above provide a LinkedIn link. Instead of showing company profiles with track records, however, the links all direct to the profile of just one person, Dr. Sapan Desai.
As Dr. Todaro points out, forming partnerships with dozens of hospitals and setting up a system to format, extract, and analyze data from electronic medical records spanning many different EMR platforms and many different languages would be an incredibly difficult, if not insurmountable, task even for a large multidisciplinary team of statisticians, computer programmers, and the like working over many months, never mind Surgisphere’s claim that it does all this using machine learning and artificial intelligence. Even more oddly, the “get in touch with us” link on its website redirected to a strange WordPress template for cryptocurrency, at least before the company changed the link. Meanwhile, Dr. Desai has 39 publications over the last five years (which is quite good, almost eight publications a year), but none of them other than the COVID-19 papers used Surgisphere data, and Surgisphere itself won’t even specify which hospitals or countries contribute to its database; only continents are specified.
Data inconsistencies and implausibility
There are also oddities and inconsistencies in the dataset reported. The first of these was discovered in Australia, because Australia is unique in that it is both a continent and a country:
Australia is unique because it is both a country and continent, which makes data obfuscation more challenging. Thus, it is no surprise that false data was first discovered in Australia. The Guardian reported yesterday that the number of COVID-19 deaths included in the Lancet study for Australia exceeded the total nationally recorded number of COVID-19 deaths. The Lancet study reported 73 deaths from the continent of Australia, but records show that Australia had only a total of 67 COVID-19 deaths by April 21. When confronted with this inconsistency, the lead author of the study, Dr. Mandeep Mehra, admitted the error but dismissed it as simply a single hospital that was accidentally designated to the wrong continent.
The data from North America are also suspect. Dr. Todaro notes that Surgisphere reported on 63,315 COVID-19 cases out of the estimated 66,000 cases that had been recorded in North America up to the concluding date of the study, a proportion (roughly 96%) that defies imagination and would require the company to have relationships with nearly every hospital on the continent. He asks:
Are we to believe that Surgisphere truly had relationships and data exchange agreements with 559 hospitals in the USA, Canada and Mexico that captured detailed patient records for 63,315 COVID-19 patients out of a total of 66,000 patients? These figures do not even include the 2,230 patients with COVID-19 who did not meet the inclusion criteria, meaning that Surgisphere is claiming they have patient data on even a greater number than 63,315 patients.
Another story on the controversy notes:
The Scientist has reached out to some of the largest health systems in the states hit hardest by the coronavirus pandemic to inquire whether they participated, but could not find any that did.
Instead, a number of hospitals confirmed that they did not contribute data, namely, New Jersey health systems RJWBarnabas Health, Cooper Health, and Atlantic Health System; NYC Health + Hospitals and NYU Langone in New York; and Illinois-based health systems Rush and Advocate Health Care.
When I read that, I started thinking that Surgisphere sounds scammier and scammier with every bit of new information I learn about it. I should have sensed it when I read the paper (yet another mea culpa). Indeed, in this story, someone actually comes right out and says it:
Peter Ellis, the chief data scientist of Nous Group, an international management consultancy that does data integration projects for government departments, expressed concern that the Surgisphere database was “almost certainly a scam”.
“It is not something that any hospital could realistically do,” he said. “De-identifying is not just a matter of knocking off the patients’ names, it is a big and difficult process. I doubt hospitals even have capability to do it appropriately. It is the sort of thing national statistics agencies have whole teams working on, for years.”
Indeed, Surgisphere’s response to the criticisms of its studies seems fantastical for a company with, at most, 11 employees:
Making disparate EHR systems talk to one another is a well-known industry challenge. Surgisphere’s QuartzClinical data analytics platform serves, for us, as a template for aggregating and consolidating disparate data into our queryable registry. Our customers export deidentified data from their EHRs in a format Surgisphere defines. This becomes our data dictionary, and it ensures our registry information can be compared apples to apples across our 1,200 customers. Surgisphere does not reconcile languages or coding systems.
We take data security very seriously. Surgisphere is ISO 9001:2015 and ISO 27001:2013 certified. ISO 27001:2013 is a strict data security and data integrity validation. Mandatory audits happen at least four times a year, and everything from data acquisition to data reporting is independently reviewed by an external third-party auditor. Surgisphere has passed all of its prior audits with no major or minor nonconformities.
We also take data privacy very seriously. Our registry is an aggregation of customers who use our QuartzClinical data platform. Our strong privacy standards are a major reason that hospitals trust Surgisphere and we have been able to collect data from over 1,200 institutions across 46 countries. While our data use agreements with these institutions prevents us from sharing patient level data or customer names, we are able to complete appropriate analyses and share aggregate findings to the wider scientific community.
We’re apparently supposed to believe that a tiny company with little evidence of having employees with the expertise to undertake such a massive project can do all this, and we’re supposed to trust it because, conveniently, its claimed data use agreements prevent any third party from seeing the raw data in the database, all while no one can seem to find a single hospital that admits to contributing to Surgisphere’s database.
Finally, Surgisphere reported data from Africa that would have required sophisticated patient monitoring technology and electronic medical records. Specifically, obtaining data on the incidence of ventricular tachycardia and ventricular fibrillation on that scale is highly implausible, as the scientists who signed an open letter criticizing the study pointed out. (As an aside, I must admit that I was too dismissive of that letter at first, another mea culpa that I must mention.)
From “expressions of concern” to retractions
The issues and questions described above ultimately led The Lancet and NEJM to publish “expressions of concern” over the Surgisphere papers earlier this week.
The NEJM wrote, for instance:
This retrospective study used data drawn from an international database that included electronic health records from 169 hospitals on three continents. Recently, substantive concerns have been raised about the quality of the information in that database. We have asked the authors to provide evidence that the data are reliable. In the interim and for the benefit of our readers, we are publishing this Expression of Concern about the reliability of their conclusions.
The Lancet’s expression of concern was very similar:
Important scientific questions have been raised about data reported in the paper by Mandeep Mehra et al—Hydroxychloroquine or chloroquine with or without a macrolide for treatment of COVID-19: a multinational registry analysis1—published in The Lancet on May 22, 2020. Although an independent audit of the provenance and validity of the data has been commissioned by the authors not affiliated with Surgisphere and is ongoing, with results expected very shortly, we are issuing an Expression of Concern to alert readers to the fact that serious scientific questions have been brought to our attention. We will update this notice as soon as we have further information.
Two days later, both papers were retracted at the request of the majority of authors. The NEJM explanation read:
Because all the authors were not granted access to the raw data and the raw data could not be made available to a third-party auditor, we are unable to validate the primary data sources underlying our article, “Cardiovascular Disease, Drug Therapy, and Mortality in Covid-19.”1 We therefore request that the article be retracted.
And The Lancet provides more detail:
After publication of our Lancet Article,1 several concerns were raised with respect to the veracity of the data and analyses conducted by Surgisphere Corporation and its founder and our co-author, Sapan Desai, in our publication. We launched an independent third-party peer review of Surgisphere with the consent of Sapan Desai to evaluate the origination of the database elements, to confirm the completeness of the database, and to replicate the analyses presented in the paper.
Our independent peer reviewers informed us that Surgisphere would not transfer the full dataset, client contracts, and the full ISO audit report to their servers for analysis as such transfer would violate client agreements and confidentiality requirements. As such, our reviewers were not able to conduct an independent and private peer review and therefore notified us of their withdrawal from the peer-review process.
We always aspire to perform our research in accordance with the highest ethical and professional guidelines. We can never forget the responsibility we have as researchers to scrupulously ensure that we rely on data sources that adhere to our high standards. Based on this development, we can no longer vouch for the veracity of the primary data sources. Due to this unfortunate development, the authors request that the paper be retracted.
Interestingly, only three of the four authors signed this statement. The author who didn’t? Surprise! Surprise! It was Dr. Desai, founder of Surgisphere. Oddly enough, though, Dr. Desai did sign the statement asking the NEJM to retract.
Lessons from the Surgisphere debacle
I’m furious over this debacle. First, I’m furious at myself (and more than a bit ashamed) for not having sniffed out how dubious Surgisphere was right from the start. I even recall having nagging misgivings as I perused the Surgisphere website, thinking that the website didn’t really provide much information or evidence for how its database was used and that it all seemed a bit…off. (I ignored them.)
Equally, I’m furious at the authors of these papers, the academics who collaborated with Dr. Desai and Surgisphere. To put a paper like this together with such a collaborator would require one of two things. Either they were so hands-off with the data analysis as to have been totally irresponsible, or, if they weren’t, they should have gleaned from interacting with Dr. Desai that Surgisphere’s database was too good to be true and that Dr. Desai had no background suggesting he could do such a sophisticated analysis.
Here’s how they describe the relative contributions of each author in the retracted manuscript:
The study was conceived and designed by MRM and ANP. Acquisition of data and statistical analysis of the data were supervised and performed by SSD. MRM drafted the manuscript and all authors participated in critical revision of the manuscript for important intellectual content. MRM and ANP supervised the study. All authors approved the final manuscript and were responsible for the decision to submit for publication.
[MRM = Mandeep R. Mehra, Brigham and Women’s Hospital Heart and Vascular Center and Harvard Medical School; ANP = Amit N. Patel, Department of Biomedical Engineering, University of Utah and HCA Research Institute, Nashville; SSD = Sapan S. Desai.]
Here’s another place where I went wrong. I should have sussed out that the number of authors was too small for an undertaking of this magnitude. Where are the statisticians, for instance? Drs. Mehra and Frank Ruschitzka (the latter of whom isn’t even mentioned above, making me wonder why he was included as an author of the manuscript) are cardiologists. Dr. Amit Patel is a cardiothoracic surgeon who studies stem cell therapy for congestive heart failure. (Damn, why didn’t I look into their backgrounds more?) In any event, another red flag is the claim that the statistical analysis of the data was “supervised” by Dr. Desai. There really should have been a named statistician as a co-author of this paper. If a statistician (or team of statisticians) did such a huge statistical analysis and remained unnamed as authors, that’s just academic malpractice. Everything about this paper smells like fraud now. Unfortunately, because Surgisphere refuses to let an independent third party take even a private look at the raw data, we’ll probably never know for sure whether it’s fraudulent.
Finally, I’m furious at the authors of these papers and Surgisphere, not to mention the editors of The Lancet and NEJM, because they just handed an incident to the COVID-19 deniers and hydroxychloroquine cultists that they are using (and will continue to use) to cast doubt on all the other studies that failed to find any benefit from using the drug to treat COVID-19, and they handed it to them on a silver platter. This whole kerfuffle also overshadowed the publication Wednesday night of the first randomized, double-blind, placebo-controlled clinical study of hydroxychloroquine for prophylaxis among adults who had household or occupational exposure to someone with confirmed COVID-19. Guess what? It was negative. There was no difference between the hydroxychloroquine and placebo control groups in the incidence of new illness compatible with COVID-19. The study had significant weaknesses, most notably that not all cases of COVID-19 were confirmed by PCR, but it’s still better evidence that hydroxychloroquine doesn’t prevent COVID-19 after exposure than anything else yet published, and better evidence than anything Didier Raoult has ever published.
An accompanying editorial notes that there are 203 COVID-19 trials of hydroxychloroquine, 60 of which are focused on prophylaxis. This is utter madness, far more research attention than is warranted given the absence of any decent science suggesting that hydroxychloroquine is an effective treatment for COVID-19. As I’ve pointed out multiple times before, there were already a number of observational studies before the Surgisphere study that failed to find a benefit from hydroxychloroquine, leading me to predict again and again that (most likely) hydroxychloroquine doesn’t work or that whatever effect it might have will be very modest at best. Either way, there was no scientific or ethical justification for making the drug the de facto standard of care for COVID-19 patients for so long, nor is there any scientific justification for so many open clinical trials of the drug. Those are resources that would be better used to study other therapies or put into vaccine development.
Whether the authors who collaborated with Surgisphere and Dr. Desai were dupes or complicit, they’ve done enormous damage to public health through their negligence and Surgisphere’s likely fraud. Already the hydroxychloroquine cultists are rejoicing and using the retraction to cast doubt on all negative studies of the drug. (It’s also painful to note that, on this matter, Didier Raoult was correct, in the way that even a stopped clock is right twice a day.) If there is any justice in the world, Surgisphere will go out of business, and all the authors of these papers will suffer a blight on their reputations that will be very difficult to erase. (One hopes that they learn something from this.) Perhaps this scandal will also teach journal editors that too-rapid peer review can mean sloppier peer review than normal and that, when the results can have such an impact, journals should slow down and require the same information for observational studies that they require for clinical trials.
As for myself, I will in the future try very hard not to ignore that little nagging voice in my head when it makes itself known, particularly if I’m analyzing a study whose conclusions agree with what I already believe. I should have looked at the background of the authors more carefully. I should have investigated Surgisphere more carefully. I should have taken the open letter more seriously when it was released. I can’t guarantee that something like this will never happen again, but hopefully it will be a long time before I let my guard down that much again.