LabradorTransformer committed on
Commit e14fefc · verified · 1 Parent(s): 9cb83b9

Update README.md

Files changed (1):
  1. README.md +2 -2
README.md CHANGED
@@ -19,10 +19,10 @@ Laboratory data are a rich source of information about a patient's health. They
 
 Labrador is pre-trained on a large corpus of 100 million lab tests from over 260,000 patients. We rigorously evaluate Labrador on intrinsic and extrinsic tasks, including four real-world problems: cancer diagnosis, COVID-19 diagnosis, predicting elevated alcohol consumption and ICU mortality due to sepsis. We find that Labrador is superior to BERT across all evaluations but both are outperformed by XGBoost indicating that transfer learning from continuous EHR data is still an open problem.
 
-We discuss the limitations of our approach and suggest future directions for research in the corresponding paper, [Labrador: Exploring the Limits of Masked Language Modeling for Laboratory Data]().
+We discuss the limitations of our approach and suggest future directions for research in the corresponding paper, Labrador: Exploring the Limits of Masked Language Modeling for Laboratory Data.
 
 
-- **Developed by:** David Bellamy
+- **Developed by:** Anonymous Submitters
 - **Model type:** BERT-style transformer
 - **License:** MIT
 