The smallest model has 85 million parameters, while the largest has 1.2 billion. SARITA generates new S1 sequences taking as input the 14-amino-acid sequence that precedes them. The results of SARITA are reported in the following preprint: https://www.biorxiv.org/content/10.1101/2024.12.10.627777v1.
The code to train and evaluate the model is available on [GitHub](https://github.com/simoRancati/SARITA).
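As described above, SARITA takes the 14 amino acids preceding the region to be generated as its prompt. The sketch below shows one way to extract that 14-residue context from an S1 sequence; the commented generation call is an assumption about how the models would be used through the standard Hugging Face `transformers` pipeline — see each model's page for the exact, supported usage.

```python
# Minimal sketch: build the 14-residue prompt SARITA expects.
# The generation call at the bottom is a hypothetical illustration,
# not the documented SARITA API.

def make_prompt(s1_sequence: str, start: int) -> str:
    """Return the 14 amino acids preceding position `start` in an S1 sequence."""
    if start < 14:
        raise ValueError("need at least 14 preceding residues")
    return s1_sequence[start - 14:start]

# Toy example (not a real S1 sequence):
seq = "MFVFLVLLPLVSSQCVNLTTRTQLPPAYTNSFTRGVYY"
prompt = make_prompt(seq, 20)
print(prompt)  # the 14-residue context ending at position 20

# Hypothetical generation call (downloads the model; settings are assumptions):
# from transformers import pipeline
# generator = pipeline("text-generation", model="SimoRancati/SARITA_XL")
# print(generator(prompt, max_new_tokens=50))
```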
SARITA models were trained on high-quality SARS-CoV-2 S1 sequences collected between December 2019 and March 2021. **Click on any model name (e.g. Small, Medium, Large, or XLarge) to go to its dedicated page, where you'll find detailed access instructions and example code snippets to help you reproduce our results.**
Model | #Params | d_model | layers
--- | --- | --- | ---
[XLarge](https://huggingface.co/SimoRancati/SARITA_XL) | 1.2B | 2048 | 24
SARITA models were trained on high-quality SARS-CoV-2 S1 sequences collected between December 2019 and August 2024. **Click on any model name (e.g. Small, Medium, Large, or XLarge) to go to its dedicated page, where you'll find detailed access instructions and example code snippets to help you reproduce our results.**
Model | #Params | d_model | layers
--- | --- | --- | ---