Update README.md
README.md CHANGED

@@ -22,62 +22,8 @@ The `t5-small` model, developed by Google and fine-tuned by the Frontida team, s
 - **Paper:** "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" (Raffel et al., 2019)
 - **Demo:** Frontida Chatbot Interface (link to demo if available)
 
-
-
-
-
-
-
-### Downstream Use
-
-While primarily designed for direct interaction within Frontida, the model's applications can extend to other mental health support systems, offering a foundation for empathetic, AI-driven conversation.
-
-### Out-of-Scope Use
-
-The model is not intended for clinical diagnosis or as a substitute for professional healthcare advice.
-
-## Bias, Risks, and Limitations
-
-We acknowledge the potential for biases in AI models and have taken steps to mitigate such risks in `t5-small`. However, users should be aware of the model's limitations, particularly in understanding the full scope of an individual's emotional state.
-
-### Recommendations
-
-Users are encouraged to use Frontida as a supplementary support tool alongside traditional mental health resources. Ongoing model training and refinement are priorities to ensure the most empathetic and accurate responses.
-
-## How to Get Started with the Model
-
-To interact with Frontida's `t5-small` model, users can access our chatbot via the Frontida web application. Developers interested in exploring the model's architecture and training can visit our repository on Hugging Face.
-
-## Training Details
-
-### Training Data
-
-The model was fine-tuned on a curated dataset comprising diverse conversations and texts related to mental health, specifically postpartum depression, ensuring a wide range of scenarios and emotions are covered.
-
-### Training Procedure
-
-#### Preprocessing
-
-Text data was normalized and tokenized using standard NLP preprocessing techniques to ensure consistency and improve model understanding.
-
-#### Training Hyperparameters
-
-- Training regime details are provided in the repository, focusing on optimizing performance while maintaining the model's efficiency.
-
-## Evaluation
-
-### Testing Data, Factors & Metrics
-
-Evaluation was conducted using a separate test set, focusing on accuracy, empathy in responses, and relevance of video recommendations.
-
-## Environmental Impact
-
-Efforts were made to minimize the carbon footprint during training, with details on compute usage and emissions available upon request.
-
-## Technical Specifications
-
-Further details on the model's architecture, objective, and compute infrastructure are available in the Frontida repository.
-
-## More Information
-
-For additional details, including how to contribute to the model's development or integrate it into other applications, please visit the Frontida project page on Hugging Face.
+### Team
+
+- **Danroy Mwangi** - Team Lead and NLP Lead
+- **Maria Muthiore** - Backend Lead
+- **Nelson Kamau** - Frontend Lead
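The "Preprocessing" subsection states only that text was normalized and tokenized with standard NLP techniques. Below is a plausible sketch of such a step, pairing Unicode and whitespace normalization with the stock `t5-small` tokenizer; the specific choices are assumptions for illustration, not the Frontida team's documented pipeline.

```python
# Illustrative preprocessing sketch (assumed, not the documented Frontida pipeline):
# light text normalization followed by T5 tokenization.
import re
import unicodedata

from transformers import AutoTokenizer

# Assumes the fine-tuned model keeps the base t5-small tokenizer.
tokenizer = AutoTokenizer.from_pretrained("t5-small")

def normalize(text: str) -> str:
    """Canonicalize Unicode forms and collapse runs of whitespace."""
    text = unicodedata.normalize("NFKC", text)
    return re.sub(r"\s+", " ", text).strip()

example = "I  can't   sleep and I feel anxious  all the time."
encoded = tokenizer(normalize(example), truncation=True, max_length=512, return_tensors="pt")
print(encoded["input_ids"].shape)  # e.g. torch.Size([1, N])
```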