Healthcare Embeddings API

by Ricky Sahu
2023-08-22

We recently implemented an embedding API for GenHealth’s V1 model. To demonstrate its use, we took every input feature in our vocabulary and created an embedding for it. Then we mapped those embeddings to the most similar other input features. Finally, we displayed those relationships as a first-of-its-kind interactive 3D graph. Take a look at the medical embeddings graph here.
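To make that pipeline concrete, here is a minimal sketch of the nearest-neighbor step in Python. The `get_embedding` helper is a hypothetical placeholder for a call to the embedding API (the real endpoint and request format are not shown), and the three-code vocabulary and random vectors are purely illustrative so the sketch runs end to end:

```python
import numpy as np

def get_embedding(feature: str) -> np.ndarray:
    # Hypothetical placeholder for a call to the embeddings API;
    # deterministic random vectors stand in so the sketch runs.
    rng = np.random.default_rng(abs(hash(feature)) % (2**32))
    return rng.normal(size=800)

# Illustrative vocabulary of input features (e.g. ICD-10 and CPT codes).
vocabulary = ["I10", "E11.9", "99213"]

# Embed every feature, then L2-normalize so dot products are cosine similarities.
E = np.stack([get_embedding(f) for f in vocabulary])
E = E / np.linalg.norm(E, axis=1, keepdims=True)

# Pairwise cosine similarity; blank the diagonal so a feature
# cannot be its own nearest neighbor.
sims = E @ E.T
np.fill_diagonal(sims, -np.inf)

# Map each feature to its most similar other feature.
nearest = {vocabulary[i]: vocabulary[int(sims[i].argmax())]
           for i in range(len(vocabulary))}
```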

As healthcare organizations continue to invest in artificial intelligence (AI) technologies, one understudied area is generative AI transformer embeddings. These embeddings are exceptionally powerful: they can encode relationships between medical concepts, enabling healthcare providers and organizations to better understand patient needs, develop more effective treatment plans, and power population health analytics.

At its core, an embedding is a mathematical representation of a concept or object. In transformer-based generative AI, an embedding is typically taken from the last hidden layer of the neural network. In the case of GenHealth, that is a vector of 800 floating point values: the output of the layer right before our V1 model decides which token comes next in a sequence.
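To show where such a vector comes from, here is a rough sketch using the Hugging Face transformers library, with GPT-2 standing in for our V1 model (which cannot be loaded this way). The mechanics are the same: take the final hidden layer, right before next-token prediction. Note that GPT-2's hidden size is 768 rather than 800.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2", output_hidden_states=True)

inputs = tokenizer("patient history goes here", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# hidden_states[-1] is the final transformer layer: one vector per token.
# The last token's vector is the state used to predict the next token,
# and can serve as an embedding for the whole sequence.
last_layer = outputs.hidden_states[-1]   # shape: (batch, seq_len, hidden_dim)
embedding = last_layer[0, -1]            # 768 dims for GPT-2
```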

History of Embeddings

Embeddings are widely used outside of healthcare, particularly in the field of natural language processing (NLP) and large language models. One of the early breakthroughs in this area was the development of Word2Vec, a tool that uses embeddings to represent words as vectors in a high-dimensional space. By doing so, Word2Vec can identify semantic relationships between words based on their proximity in the vector space. For example, the vectors for "king" and "queen" would be close together, while the vectors for "king" and "banana" would be far apart.
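This proximity is easy to check with off-the-shelf vectors. Here is a quick sketch using gensim's downloader and pretrained GloVe vectors (a close cousin of Word2Vec; exact similarity scores will vary by model):

```python
import gensim.downloader as api

# Small pretrained GloVe word vectors (downloaded on first use).
wv = api.load("glove-wiki-gigaword-50")

print(wv.similarity("king", "queen"))    # relatively high
print(wv.similarity("king", "banana"))   # much lower
print(wv.most_similar("king", topn=3))   # semantically related words
```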

These embeddings are used for a variety of tasks, including search and similarity. For example, search engines like Google use embeddings to match search queries with relevant web pages. Similarly, embeddings can be used to identify related documents or sentences based on the proximity of their vectors in the vector space. This proximity can be measured as simply as the Euclidean distance between two vectors, though cosine similarity and approximate nearest-neighbor search are more common in practice.
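For a pair of embedding vectors, both measures are a line of numpy; the toy 3-dimensional vectors below stand in for real embeddings:

```python
import numpy as np

a = np.array([0.2, 0.9, 0.1])    # toy 3-dim "embeddings"
b = np.array([0.25, 0.8, 0.15])

# Euclidean distance: smaller means more similar.
euclidean = np.linalg.norm(a - b)

# Cosine similarity: closer to 1 means more similar; insensitive to magnitude.
cosine = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
```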

More recently, large language models like GPT-3 have used embeddings to capture the meaning of human-like language. These models are trained on massive amounts of text data and use embeddings to represent words and phrases in a way that captures their semantic meaning. As a result, models built to generate text can also serve as embedding models.

Embeddings in Healthcare

In the context of healthcare, embeddings can be used to represent medical conditions, treatments, and other relevant data points. When combined with AI algorithms, these embeddings can be used to identify patterns and relationships that would otherwise be difficult to detect.
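One simple way to surface such patterns is to cluster the embedding vectors. Here is a sketch using scikit-learn's KMeans, with random vectors standing in for real concept embeddings:

```python
import numpy as np
from sklearn.cluster import KMeans

# Random vectors stand in for one 800-dim embedding per medical concept.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 800))

# Concepts that land in the same cluster are candidates for a shared
# pattern, e.g. conditions that appear in similar clinical contexts.
labels = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(embeddings)
```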

One of the key benefits of generative AI transformer embeddings is their ability to create relationships between medical concepts. For example, an embedding might be used to create a relationship between a particular medical condition and a specific treatment. This can help healthcare providers to more accurately diagnose and treat patients, leading to better health outcomes.

But the power of embeddings goes beyond individual medical concepts. By creating relationships between higher-level concepts, embeddings can be used to identify almost any kind of relationship. For example, embeddings can be used to identify similarities between patients given their entire medical histories. This can help healthcare providers develop more targeted treatment plans or deduplicate patients in mountains of healthcare data.
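Here is a sketch of the deduplication idea, assuming each patient's full history has already been reduced to a single embedding vector; the 0.95 threshold is an illustrative placeholder, not a tuned value:

```python
import numpy as np

def find_likely_duplicates(patient_ids, embeddings, threshold=0.95):
    """Flag patient pairs whose history embeddings are nearly identical."""
    E = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = E @ E.T
    pairs = []
    for i in range(len(patient_ids)):
        for j in range(i + 1, len(patient_ids)):
            if sims[i, j] >= threshold:  # illustrative threshold, not tuned
                pairs.append((patient_ids[i], patient_ids[j], float(sims[i, j])))
    return pairs
```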

In addition to patient care, generative AI transformer embeddings can also be used to improve healthcare operations. For example, embeddings might be used to create relationships between providers and patients, enabling healthcare organizations to more effectively match patients with providers based on their needs and preferences. Embeddings can also be used to improve search results within electronic health records (EHRs), making it easier for healthcare providers to find the information they need.
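As a sketch, an embedding-backed EHR search could rank notes by cosine similarity to the query, assuming the query and the notes are embedded with the same model:

```python
import numpy as np

def search_notes(query_embedding, note_embeddings, note_ids, top_k=5):
    """Rank EHR notes by cosine similarity to an embedded search query."""
    q = query_embedding / np.linalg.norm(query_embedding)
    N = note_embeddings / np.linalg.norm(note_embeddings, axis=1, keepdims=True)
    scores = N @ q
    top = np.argsort(scores)[::-1][:top_k]
    return [(note_ids[i], float(scores[i])) for i in top]
```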

The embedding layer is a very powerful tool for creating relationships between medical concepts. This can be expanded to include relationships among higher-order concepts like episodes of care, patient similarities, provider search, and more. As healthcare organizations continue to invest in AI technologies, it is likely that generative AI transformer embeddings will become an increasingly important tool for improving patient outcomes and reducing healthcare costs.