nielsr HF Staff committed on
Commit 0fa64f0 · verified · 1 Parent(s): ce9f5b0

Add pipeline tag and link to paper


This PR adds the `pipeline_tag` to the model card, ensuring the model can be found at https://huggingface.co/models?pipeline_tag=text-classification. It also links to the paper [MATE: LLM-Powered Multi-Agent Translation Environment for Accessibility Applications](https://huggingface.co/papers/2506.19502).
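For context, the `pipeline_tag` in the README's YAML frontmatter is what makes the model appear in task-filtered Hub searches. A minimal sketch of querying that filter programmatically, assuming a recent `huggingface_hub` client (the `search` string is only illustrative, not a confirmed repo name):

```python
from huggingface_hub import list_models

# Once the tag is merged, the model is returned by task-filtered queries.
# "ModCon" is an illustrative search string, not a confirmed repo name.
for m in list_models(pipeline_tag="text-classification", search="ModCon", limit=5):
    print(m.id, m.pipeline_tag)
```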

Files changed (1)
  1. README.md +5 -1
README.md CHANGED
@@ -1,7 +1,9 @@
 ---
 library_name: transformers
 license: mit
+pipeline_tag: text-classification
 ---
+
 ## Model Description
 
 We introduce ModCon-Task-Identifier, a fine-tuned BERT model that is capable of identifying the modality conversion task type based on the user’s prompt.
@@ -10,4 +12,6 @@ The model was developed as a part of the Multi-Agent MATE project, the goal of w
 conversion framework. Based on the user’s query, the system will convert the input file to the desired format by changing the modality
 (for instance, a text can be converted to an image, or a video can be converted to an audio)
 
-**The official project repository and the full project code are available at https://github.com/AlgazinovAleksandr/Multi-Agent-MATE**
+**The official project repository and the full project code are available at https://github.com/AlgazinovAleksandr/Multi-Agent-MATE**
+
+This model is based on the paper [MATE: LLM-Powered Multi-Agent Translation Environment for Accessibility Applications](https://huggingface.co/papers/2506.19502)
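For completeness, a minimal usage sketch of the classifier described in the card above. The repo id below is a placeholder (this commit does not state the exact repository name), and the predicted label names depend on the model's own `id2label` mapping:

```python
from transformers import pipeline

# Placeholder repo id -- replace with the actual ModCon-Task-Identifier repository.
task_identifier = pipeline(
    "text-classification",  # matches the pipeline_tag added in this commit
    model="your-org/ModCon-Task-Identifier",
)

# The model maps a user prompt to a modality conversion task type,
# e.g. text-to-image or video-to-audio, as described in the model card.
print(task_identifier("Please narrate this document as an audio file."))
```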