Health | June 13, 2024

Labs and Large Language Models: Transforming healthcare informatics

In today’s healthcare ecosystem, the efficacy of medical decisions hinges heavily on laboratory test results, which shape nearly 70% of clinical decisions. Despite this, lab data is notoriously hard to work with. Failure to map or accurately interpret the data poses multifaceted risks, including compromised patient safety, gaps in care, and unreliable analytics.

Challenges in normalizing lab data

According to the CDC, 14 billion laboratory tests are ordered annually in the US. How do we convert this vast amount of lab data into a valuable resource? Our endeavor begins by understanding the complexities involved in tasks like mapping lab results to LOINC, which are more challenging than they might appear.

Lab data is notoriously messy and often not standardized to a terminology. The standard for lab data is LOINC (Logical Observation Identifiers Names and Codes), but normalizing lab data to LOINC is more complicated than it appears at first glance, for several reasons.

  1. LOINC contains more than just laboratory codes, including clinical observations, HIPAA documents, and standardized survey instruments.
  2. LOINC is constantly updated, with several releases per year.
  3. Lab results are defined along six axes (spanning what’s measured, where, and how), all of which must be correct to achieve an accurate map; see the sketch after this list.
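To make those six axes concrete, here is a minimal Python sketch of how a single lab observation can be represented as the six LOINC parts. The class and the example values are illustrative only, not an official LOINC data model.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class LoincAxes:
    """The six LOINC axes that together define a single lab observation."""
    component: str                # what is measured (e.g., "Glucose")
    property: str                 # kind of quantity (e.g., "MCnc" = mass concentration)
    time_aspect: str              # point in time vs. interval (e.g., "Pt")
    system: str                   # specimen / where it is measured (e.g., "Ser/Plas")
    scale: str                    # quantitative, ordinal, nominal, etc. (e.g., "Qn")
    method: Optional[str] = None  # how it is measured, when it matters

# Illustrative example: a serum/plasma glucose test. All six axes must match
# the intended test for a map to be considered accurate.
glucose = LoincAxes(
    component="Glucose",
    property="MCnc",
    time_aspect="Pt",
    system="Ser/Plas",
    scale="Qn",
)
```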

Approaches to data mapping through the years

Over the years, clinical informaticists have used the best technology and techniques available to normalize this data and unlock its value for informed decision making, quality improvement, and enhanced patient care. The approaches for mapping lab data have evolved from manual, rule-based approaches to more automated machine learning models.

Early 2010s - Clinical Rules 

Some of the first efforts in mapping complex data like labs were curated sets of rules built by studying how humans approach the problem. Rules could be built using logic implicit to LOINC, including synonyms, word replacements, and unit conversions. This approach achieved high accuracy and made the logic easy for mappers to understand, but it isn’t scalable and doesn’t learn from past results.
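As a rough illustration, the sketch below shows the flavor of such rules: a tiny, hypothetical synonym table and unit map applied to a raw local test name so it can be matched against LOINC terms. Production rule sets were far larger and encoded much more LOINC-specific logic.

```python
import re

# Hypothetical, hand-curated rule tables; real rule sets contained thousands
# of entries derived from how human mappers normalize local lab names.
SYNONYMS = {
    "gluc": "glucose",
    "hgb": "hemoglobin",
    "bld": "blood",
}

UNIT_REPLACEMENTS = {
    "mg/dl": "mg/dL",
    "mmol/l": "mmol/L",
}

def normalize_local_name(raw: str) -> str:
    """Lowercase, expand curated synonyms, and canonicalize units."""
    tokens = re.split(r"[\s,]+", raw.strip().lower())
    tokens = [SYNONYMS.get(t, t) for t in tokens]
    tokens = [UNIT_REPLACEMENTS.get(t, t) for t in tokens]
    return " ".join(tokens)

print(normalize_local_name("GLUC, BLD mg/dl"))  # -> "glucose blood mg/dL"
```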

Late 2010s - Deep learning models

As AI and machine learning technology matured, the maps built under the clinical rules approach could be used to train a deep learning model to predict each of the six LOINC axes, which helped automate the effort. This improved accuracy and coverage across datasets. However, such a model cannot generalize to unseen data, which presented challenges during the COVID-19 pandemic, when brand-new LOINC codes and data were being introduced at a rapid pace and human intervention was still required to map the data and retrain the model.
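One plausible, simplified shape for such a model is sketched below: a text classifier per axis, trained on previously curated maps. This is a toy sketch in which a lightweight scikit-learn classifier stands in for the deep network, and the training examples are hypothetical; real systems used neural networks trained on far more data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: local lab names labeled with one LOINC axis
# (here, the specimen/"system" axis). A full solution trains one model per axis.
texts = ["glucose serum", "glucose urine", "hemoglobin blood", "sodium urine"]
system_labels = ["Ser/Plas", "Urine", "Bld", "Urine"]

# Character n-grams tolerate the abbreviations and misspellings common in lab feeds.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, system_labels)

print(model.predict(["serum glucose level"]))  # -> likely ["Ser/Plas"]
```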

This brings us to the current day. The latest development in technology for mapping lab data is Large Language Models (LLMs).

Unveiling the Power of Large Language Models

What are Large Language Models (LLMs)?

Large Language Models (LLMs) are a type of AI that can recognize, analyze, and generate text. They are based on transformers, a neural network architecture that scales well and allows an LLM to be trained on vast amounts of data.

Integrating Large Language Models into the lab mapping workflow

Although it's a great step forward in terms of efficiency, a Large Language Model by itself is not enough to accurately map a complex clinical dataset to a terminology like LOINC. LLMs offer a higher launching point than previous technologies, but they still need to be fine-tuned and enriched with clinical information before they become a truly usable and effective mapping algorithm. It’s when an LLM is combined with a foundation of terminology expertise, clinically accurate maps, and curated LOINC synonyms that we see improvements in both the speed and accuracy of mapping.
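A hedged sketch of what that combination might look like in practice is shown below: curated synonyms and prior maps narrow the search to a handful of candidate codes, and the LLM is asked only to choose among them, with low-confidence answers routed to a human reviewer. The function names, candidate values, and prompt here are illustrative assumptions, not a description of a specific product API.

```python
# retrieve_candidates and llm_complete are hypothetical placeholders standing
# in for a curated-terminology lookup and a call to a fine-tuned LLM.

def retrieve_candidates(local_name: str) -> list[dict]:
    """Look up likely LOINC codes from curated synonyms and prior maps (stub)."""
    # Illustrative candidates only.
    return [
        {"loinc": "2345-7", "name": "Glucose [Mass/volume] in Serum or Plasma"},
        {"loinc": "2339-0", "name": "Glucose [Mass/volume] in Blood"},
    ]

def llm_complete(prompt: str) -> str:
    """Stand-in for a call to a fine-tuned LLM; not implemented here."""
    raise NotImplementedError

def map_with_llm(local_name: str) -> str:
    """Constrain the LLM to curated candidates instead of asking it to map freely."""
    candidates = retrieve_candidates(local_name)
    listing = "\n".join(f"- {c['loinc']}: {c['name']}" for c in candidates)
    prompt = (
        f"Local lab test: {local_name}\n"
        f"Candidate LOINC codes:\n{listing}\n"
        "Answer with the single best LOINC code, or NONE if no candidate fits."
    )
    return llm_complete(prompt)
```

Constraining the model to curated candidates, rather than letting it generate a code freely, is one way to keep the terminology expertise and clinically accurate maps in the loop.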

The call to action for healthcare organizations

For healthcare organizations, the call to action is not just to invest in technology but in transformation. This means putting resources into both new tools and the team. For informatics professionals and data scientists, the future looks promising. We have access to more powerful tools, but moving forward, we need a mix of respect for clinical accuracy, a willingness to explore new ways of doing things, and adaptability to navigate the changing healthcare data landscape.

In a landscape marked by data complexity, the strategic adoption of AI-driven lab mapping solutions provides a path to the forefront of data-driven healthcare transformation. Our terminology experts are here to help - reach out today to learn more!

Health Language Data Interoperability 
Chris Funk, Ph.D.
Senior Medical Informaticist, Health Language, Wolters Kluwer Health

As a Senior Medical Informaticist, Christopher supports the company’s Health Language solutions, supporting physician documentation within the electronic medical record and integrating advanced technology such as clinical natural language processing.
