This is the fine-tuned BioBART model for summarizing findings in PET reports.
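Below is a minimal usage sketch with the 🤗 Transformers summarization pipeline. The model id `xtie/BioBART-PET-impression` comes from the list of checkpoints below; the example findings text and the generation parameters (`max_length`, `num_beams`) are illustrative assumptions, not the settings used for the reported results.

```python
# Minimal sketch: load the fine-tuned BioBART checkpoint from the Hub and
# generate an impression from the findings section of a PET report.
from transformers import pipeline

summarizer = pipeline("summarization", model="xtie/BioBART-PET-impression")

# Illustrative findings text only, not a real patient report.
findings = (
    "FDG PET/CT skull base to mid-thigh: Hypermetabolic right hilar lymph node "
    "measuring 1.8 cm with SUVmax 6.2. No other suspicious hypermetabolic foci."
)

impression = summarizer(
    findings,
    max_length=128,  # illustrative cap on impression length
    num_beams=4,     # illustrative beam-search setting
)[0]["summary_text"]

print(impression)
```

The model takes the findings section of the report as input and generates a short impression; very long reports may need to be truncated to the model's maximum input length.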
To browse all of our fine-tuned large language models (LLMs) for PET report summarization, see:

- [BERT2BERT-PET](https://huggingface.co/xtie/Clinicallongformer2roberta-PET-impression)
- [BART-PET](https://huggingface.co/xtie/BART-PET-impression)
- [BioBART-PET](https://huggingface.co/xtie/BioBART-PET-impression)
- [PEGASUS-PET](https://huggingface.co/xtie/PEGASUS-PET-impression)
- [T5v1.1-PET](https://huggingface.co/xtie/T5v1.1-PET-impression)
- [Clinical-T5-PET](https://huggingface.co/xtie/ClinicalT5-PET-impression)
- [Flan-T5-PET](https://huggingface.co/xtie/Flan-T5-PET-impression)
- [GPT2-XL-PET](https://huggingface.co/xtie/GPT2-PET-impression)
- [OPT-1.3B-PET](https://huggingface.co/xtie/OPT-PET-impression)
- [LLaMA-LoRA-PET](https://huggingface.co/xtie/LLaMA-LoRA-PET-impression)
- [Alpaca-LoRA-PET](https://huggingface.co/xtie/Alpaca-LoRA-PET-impression)

## 📑 Abstract
Purpose: To determine if fine-tuned large language models (LLMs) can generate accurate, personalized impressions for whole-body PET reports.