Big Data and the Art of Medicine
Posted March 20, 2015 in Insights & Research
Tags: DynaMed
Increasingly, “big data” is emerging in our tech vocabulary. The expression means different things to different market sectors. In healthcare, it usually boils down to patient data – de-identified and not – run through analytics tools to gain clinical insights. The question big data raises: What is the clinical relevance of the insights these automation tools give us as our physicians practice medicine, which is and shall always be an art?
Analytics systems at pharmaceutical companies, public health organizations or even IBM’s Watson seek patterns in massive data sets for insights about prescribing, treatment outcomes and disease research. These patterns can reveal longitudinal trends that would be too labor-intensive, or even impossible, to discover through other means.
Electronic medical reference publishers aren’t necessarily consumers of big data. In fact, in products like DynaMed, as we keep up with the daily flow of research published in journals and other relevant sources such as guideline collections and drug information repositories, it’s the “little data,” or day-to-day releases, that concerns us as we add incremental gains in knowledge to what is already known. Putting it together, we can use our analytical tools and human intuition to determine how a new study might clarify or inform previously established evidence to guide treatment paths.
Mining data for clinical evidence
Today, we’re just starting to use big data in healthcare. Over the coming years, it will play a bigger role in clinical studies as researchers learn to use analytics tools to identify tighter cohorts and to compare multiple trials against one another. We’re on the cusp of understanding how big data can be used as a research tool to find bigger, broader trends than are possible with traditional methods. But we’re not quite there yet.
That being said, the tools for conducting clinical research – and communicating it – are getting faster. Studies are building on each other faster than ever before, and it’s getting harder for physicians to keep up with the best available evidence to assist them in making the best treatment decisions. Emerging technologies help publishers like us stay current, as we examine data sets and constantly compare them against the established guidelines clinicians trust.
Evidence-based guidelines, which in combination with a clinician’s experience and education guide patient-care decisions, present a healthcare paradox. As patients demand more personalized and precise medicine, they’re driving a need for more granular information and specific guidelines for different patient populations. After all, one size does not fit all patients. Yet guidelines are developed from data that has been purposefully abstracted so that they are one-size-fits-all.
Human factors make medicine personalized
That’s where the physician comes in. Science can give us the best possible information, distilled into guidelines through careful examination of the evidence, which is constantly re-examined as new research is published. In the end, however, the most important data set for the patient isn’t contained in a cloud server farm. It’s in the exam room, wearing a white coat and stethoscope, making the best decisions on an individual basis.
Together, evidence-based guidelines – viewed through the lens of human intuition – create medicine. And that is why medicine will remain an art, no matter the assistance high-tech big data and analytics systems can offer.
Betsy Jones is Vice President of Medical Product Management and Chief Content Officer for DynaMed. Prior to that, she served as Senior Vice President and Publisher for the American Medical Association and JAMA Network, and was Executive Vice President and General Manager at Wolters Kluwer Health/LWW.