When I first read the manuscript that became the lead article of this month’s issue, I could see that, for several reasons, I would have to comment on its findings in an editorial. Throughout the fall of 2021, I thought about this article almost every day, trying to find the best way to convey my points about its meaning while also convincing readers of its importance, encouraging them to reflect on what it says, and not alienating the very readers whom I would most like to consider these findings with an open mind. First, a disclaimer: I chaired the American Academy of Orthopaedic Surgeons’ (AAOS’) Clinical Practice Guideline (CPG) on the Diagnosis and Treatment of Carpal Tunnel Syndrome, released in 2016, which is central to the findings of the paper by Billig and Sears in this month’s issue of the Journal of Hand Surgery. I would not blame anyone for wondering whether my personal biases had anything to do with the recommendations of the latest guidelines. However, before assuming that, it is important to know how the guidelines are developed, because that has substantial bearing on how we should interpret the results of the paper by Billig and Sears.
These guidelines were developed under the aegis of the Evidence-Based Quality and Value Committee of the AAOS. This is a highly structured process that involves methodologists from AAOS and a workgroup of clinical experts. The workgroup is meant to encompass the full breadth of clinical expertise, which is a very important consideration for a condition like carpal tunnel syndrome (CTS) that is managed by a broad spectrum of clinicians from different backgrounds. The 2016 guideline had 11 members, in addition to the chair and vice chair. These individuals included plastic surgeons, hand surgeons, a physiatrist, a neurologist, a hand health physician, and a hand therapist. The methodologists from the Clinical Quality and Value Department of AAOS work with medical librarians to identify all the literature pertinent to the guideline topic. They then evaluate the quality of studies meeting the inclusion criteria and abstract, analyze, interpret, and summarize the relevant data for the workgroup. Those summaries are presented to the workgroup for discussion and voting to ensure that there is a consensus on the recommendations. Recommendations that are adopted are given a strength based on GRADE’s Evidence to Decision framework (Grading of Recommendations Assessment, Development and Evaluation; https://www.gradeworkinggroup.org/). In other words, there is no latitude for workgroup members to inject their own biases into the nature of the recommendations because they are developed according to this highly formulaic methodology that is used for all AAOS CPGs. Once the guideline has been established, there is a period of public review before it is published. There is an extensive document describing the process in detail (https://www.aaos.org/globalassets/quality-and-practice-resources/methodology/cpg-methodology.pdf), which I would strongly recommend to anyone who has some doubt regarding the objectivity or integrity of this effort.
In summary, these guidelines represent an objective synthesis of the best evidence available on a topic: in this instance, the diagnosis and treatment of CTS. One of the recommendations with potentially the greatest impact on the way CTS is managed was that either clinical tests or electrodiagnostic studies (EDS) could be used to make the diagnosis. This recommendation was based on “moderate evidence,” defined as “evidence from two or more ‘moderate’ strength studies with consistent findings, or evidence from a single ‘high’ quality study recommending for or against the intervention.”
So now, 5 years after their publication, what effect are the guidelines having on the way members of the American Society for Surgery of the Hand approach the diagnosis of CTS? That is the subject of the article by Billig and Sears. The study found that among the 770 respondents, 26% require EDS before they evaluate the patient. Only 38% thought that the guidelines were appropriate, and 89% stated that the CPG did not change their use of EDS. Surprisingly, just over 42% could not say whether the guidelines were appropriate, either because they were uncertain or because they did not know what the CPG stated.
The reasons the respondents continue to obtain EDS, in a substantial proportion of instances even before they themselves have evaluated the patient, despite evidence strongly suggesting that EDS are not required to diagnose CTS, are the ones that have been discussed for years: confirmation of the diagnosis, prediction of the response to treatment, a baseline in case treatment is unsuccessful, malpractice concerns, insurer requirements, and so forth. This is not the place to refute each of these reasons, but I will simply note that there is very little, if any, evidence to support most of them as justification for obtaining tests that are painful, are costly, and have clearly been shown to delay care.1
To me, the bigger question is what this apparent disregard for evidence, collected and summarized in an objective and unbiased way, says about us as hand surgeons. We clearly live in an era of skepticism about facts, and sadly, that skepticism seems to have permeated even our world. But if we are going to disregard facts and evidence, on what basis do we practice? To be fair, guidelines are just that: suggestions on the best way to approach a problem. Clearly, there is plenty of latitude to use judgment, but is it really justified to exercise that judgment outside evidence-based principles of care, even when those principles conflict with an individual surgeon’s personal beliefs? On what basis would roughly 60% of respondents to this survey not consider the guidelines appropriate? Are they responding to a general resistance to being guided in the way they practice, even if that guidance reflects the best available evidence, or do they simply not believe in evidence? I do not necessarily endorse the idea that hand surgeons should be slaves to everything they read in the Journal of Hand Surgery, but if they are not going to be influenced by evidence, such as that encapsulated in the AAOS CPG, I am not sure what that means for our field. As for the 42% who did not even seem to know what was in the guideline 5 years after its release, I am frankly at a loss for an explanation. Although the dissemination of evidence is known to be an arduous and lengthy process, in an age when information is so readily available, I do not see a viable reason for being uninformed about one of the most commonly encountered conditions in hand surgery practice.2,3
Surgeons may be justified in complaining about the ongoing erosion of their autonomy, because there are many facets of practice in which that independence is being limited; this, however, is not one of them. In the end, our responsibility is to our patients, who deserve to be cared for with the benefit of our aggregated experience, which is best represented by the careful curation of evidence, not by anecdote. We should invest our collective energy in doing the best we can for our patients, and that requires being informed about the evidence that bears on our care and our decisions, while carefully using our judgment in applying that evidence to the patient in front of us. The paper by Billig and Sears shows that, in the care of CTS, we have a long way to go.
References
1. The impact of pre-referral advanced diagnostic testing on wait time to see a hand surgeon for common upper-extremity conditions. J Hand Surg Am. 2019;44:1013-1020.
2. Evidence-based implementation of evidence-based medicine. Jt Comm J Qual Improv. 1999;25:503-513.
3. Getting the word out: new approaches for disseminating public health science. J Public Health Manag Pract. 2018;24:102-111.
© 2022 by the American Society for Surgery of the Hand. All rights reserved.