Update README.md
README.md CHANGED

@@ -1,27 +1,16 @@
 ---
 library_name: transformers
 pipeline_tag: summarization
+tags:
+- politics,
+- summarization,
+- climate
+- political
+- party,
+- press
+- european
 ---
 
-tags:
-- politics
-- summarization
-- climate change
-- political party
-- press release
-- political communication
-- European Union
-- Speech
-license: afl-3.0
-language:
-- en
-- es
-- da
-- de
-- it
-- fr
-- nl
-- pl
 # Text Summarization
 
 The model used in this summarization task is a T5 transformer-based language model fine-tuned for abstractive summarization.
@@ -30,6 +19,8 @@ This model is intended to summarize political texts regarding generates summarie
 
 The model was fine-tuned on 10k political party press releases from 66 parties in 12 different countries, each paired with an abstractive summary.
 
+True class labels were generated via GPT-4o summarization of the 10k political party press releases.
+
 ## Model Details
 
 Pretrained Model: The model uses a pretrained tokenizer and model from the Hugging Face transformers library (e.g., T5ForConditionalGeneration).
@@ -97,4 +88,4 @@ This training process allowed the model to learn not only the specific language
 journal={Comparative Political Studies},
 year={2024},
 publisher={SAGE Publications Sage CA: Los Angeles, CA}
-}
+}
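The line added in the second hunk states that the reference summaries ("true class labels") for the 10k press releases were produced by GPT-4o. The card gives no prompt or parameters, so purely as an illustration, a labeling call of that kind might look like the sketch below against the OpenAI chat completions API; the prompt wording, the `temperature` value, and the helper name `reference_summary` are all assumptions:

```python
# Hypothetical sketch of generating GPT-4o reference summaries ("true class
# labels"); prompt and parameters are assumptions, not the authors' setup.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def reference_summary(press_release: str) -> str:
    """Ask GPT-4o for a short abstractive summary of one press release."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Summarize this political party press release "
                        "in two to three sentences."},
            {"role": "user", "content": press_release},
        ],
        temperature=0.0,  # keep label generation as deterministic as possible
    )
    return response.choices[0].message.content
```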
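The Model Details line says the tokenizer and model come from the Hugging Face transformers library (e.g., T5ForConditionalGeneration). A minimal usage sketch under that assumption follows; the checkpoint ID `your-org/t5-press-release-summarizer`, the `summarize:` task prefix, and the generation settings are placeholders rather than details taken from this card:

```python
# Minimal sketch of loading and running a fine-tuned T5 summarizer.
# The checkpoint ID below is a placeholder, not the actual repo name.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_id = "your-org/t5-press-release-summarizer"  # placeholder
tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# T5 checkpoints conventionally expect a task prefix; the card does not say
# which one was used here, so "summarize: " is an assumption.
text = "summarize: " + "Full text of a party press release goes here."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

summary_ids = model.generate(
    **inputs,
    max_length=150,      # cap on generated summary length, in tokens
    num_beams=4,         # beam search, the usual decoding choice for T5
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

Since the card sets `pipeline_tag: summarization`, the same checkpoint should also load through transformers' high-level `pipeline("summarization", model=model_id)` interface.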