As I write these words, I wonder whether anyone will read them. Perhaps the authors of research reports published in this journal wonder the same thing. The short, sobering answer is, probably not.
Last year, in a commentary titled "Does anyone read medical journals anymore?", Dr Milton Packer questioned whether authors are simply wasting their time.1 He was moved to write the article after asking nearly 200 young physicians at a meeting: Had any of them ever read an issue of a scientific journal delivered to them? No. Had any of them read the title page of the prestigious New England Journal of Medicine? No. Any journal in their field of interest? No. Any single paper on any topic from start to finish? No. As a medical researcher and author himself, he was understandably discouraged. The tremendous amount of work needed to produce top-tier research would seem worthwhile if the work had an impact on thinking or clinical practice, but if not, he asked, what is the point?
The point and worthy purpose of publication in professional scientific journals is knowledge translation, to move new evidence out into the field where it will be used in clinical decision-making and contribute to evidence-based practice (EBP). Toward this aim, researchers still write for publication in much the same way they have for decades. Clinicians, on the other hand, don't read journal articles the same way they used to, despite consistently reporting that they value research and EBP.2 If research and evidence are indeed important, why not read?
Mounting evidence indicates that, relatively speaking, professional journals and conferences are ineffective ways to rapidly and directly impact individual patient care practice.3 Although professional journals and conferences deliver highly valuable information, their direct clinical impact is limited because this delivery occurs distantly in time and space from the point of care.
Clinicians seek evidence to solve problems for the patient who is immediately in front of them. They want quick and easy access to evidence-based answers to their currently pressing clinical questions. Interactive, searchable, evidence-based "clinical decision support systems" have been developed to meet this demand.3 Readers are likely familiar with the Cochrane Library, which offers such a resource: Cochrane Clinical Answers.4 Several other systems are also available, such as ACP Journal Club (Annals of Internal Medicine), BMJ Best Practice, DynaMed Updates (EBSCO Host), Essential Evidence Plus, TRIP (Turning Research Into Practice), and UpToDate (Wolters Kluwer).5-10 These commercial subscription services provide research summaries written by capable experts with the time and expertise to select the highest quality studies, synthesize their results, and distill the most clinically relevant information into easy-to-read synopses. Examples of these research summaries include systematic reviews, meta-analyses, and clinical guidelines. These clinical question-and-answer-oriented methods of knowledge translation reduce many of the barriers to EBP reported by clinicians: insufficient time to read, feeling overwhelmed by the sheer number of studies, perceived lack of skill in understanding statistics and appraising scientific research, low tolerance for academic and scientific jargon, and frustration reading single studies with conflicting results or irrelevant outcome measures.11,12 Use of these systems by physical therapists may be limited, however. Existing commercial clinical decision support services primarily target physicians; as yet, no such system dedicated to supporting rehabilitation professionals has been developed. Moreover, access to these services requires a paid subscription, which may be a barrier for clinicians without institutional health information support.
Where do these changes in how clinicians prefer to seek evidence leave professional journals and the authors who publish in them? An equation from Shaughnessy et al13 helps to answer this question. They posit that

U = (R × V) / W

where U is the clinical usefulness of any research report, R is the clinical relevance, V is the validity (scientific rigor) of the study, and W is the amount of work required for the reader to access and understand the study. Studies are considered highly clinically relevant when the measured outcomes are patient-oriented (versus disease- or impairment-oriented), the studied diagnostic, therapeutic, or preventive procedures are feasible and benefit patients, and, assuming the study was valid, the findings will require you to change the way you practice.
Point-of-care clinical decision support systems significantly reduce the work needed to obtain evidence. Yet the actual evidence-original research and its publication-must still be produced. Systematic research summaries cannot exist without high quality studies as their source. Researchers can contribute to EBP by increasing the scientific rigor and clinical relevance of their work, and including in their manuscripts explicit discussion of the practical implications and potential clinical applications of their study findings. Journals can support EBP by prioritizing the publication of systematic reviews, meta-analyses, clinical guidelines, and high-quality research reports sufficiently rigorous to earn their way into these research review summaries.
Encouragingly, Fell and colleagues14 reported that, among physical therapists who are APTA members and who participate in specialty section listservs and electronic newsletters, journal articles were the primary source of evidence for 66% of those surveyed. Thus, by continuing to increase the relevance and rigor of the articles published in the Journal of Geriatric Physical Therapy, we aim to expand our value to readers and our contributions to EBP.
REFERENCES