# TADBot

## Overview

TADBot is a small language model trained on the dataset. It is a fine-tuned version of Gemma 2 2B, a small language model with 2 billion parameters. TADBot is designed to help people deal with mental health problems and offer them advice based on the context of the conversation. It is not intended to replace professional mental health care, but rather to provide a supportive and empathetic resource for those who may be struggling with mental health issues. TADBot is still in development and is not yet available for public use.

## Technology Used

- Gemma 2 2B: A small language model with 2 billion parameters that TADBot is fine-tuned from.
- : The dataset used to train TADBot on mental health and advice-giving tasks.
- Hugging Face Transformers: A library used to fine-tune the Gemma 2 2B model on the dataset.
- PyTorch: A library used for training and fine-tuning the language model.
- Flask: A library used to create a server for TADBot.
- Raspberry Pi: A small, low-cost computer used to host the Text-to-Speech and Speech-to-Text models and the TADBot server.
- FER: A deep learning model used to detect emotions from faces in real time using a webcam.

# Features

# How It Works

## Model

# Implementation

## Deployment Instructions

To deploy TADBot locally, follow these steps:

- Create a virtual environment (preferably Python 3.11.10) with pip-tools or uv installed, then install the required dependencies:

```
pip-sync requirements.txt         # if you are using pip-tools
pip install -r requirements.txt   # if you are using pip
uv sync                           # if you are using uv
```
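
Once the dependencies are installed, the fine-tuned model can be loaded for inference with Hugging Face Transformers and PyTorch. The snippet below is a minimal sketch, assuming the fine-tuned weights live in a local directory named `tadbot-gemma-2-2b` (a hypothetical path; substitute the actual checkpoint location or model id).

```python
# Minimal inference sketch for the fine-tuned Gemma 2 2B model.
# "tadbot-gemma-2-2b" is a hypothetical checkpoint path; replace it with
# the actual fine-tuned weights directory or Hugging Face model id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "tadbot-gemma-2-2b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "I've been feeling overwhelmed at work lately."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```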
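
The Flask server that exposes TADBot to clients (for example, the Raspberry Pi hosting the speech models) could look roughly like the sketch below. The `/chat` route, JSON schema, and `generate_reply` helper are illustrative assumptions, not the project's actual API.

```python
# Minimal Flask server sketch. The /chat route, JSON schema, and
# generate_reply() stub are illustrative assumptions only.
from flask import Flask, jsonify, request

app = Flask(__name__)

def generate_reply(message: str) -> str:
    # Stub so the sketch runs on its own; swap in the fine-tuned model call.
    return f"TADBot received: {message}"

@app.route("/chat", methods=["POST"])
def chat():
    message = request.get_json().get("message", "")
    return jsonify({"reply": generate_reply(message)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```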
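
For the emotion-detection component, the `fer` package provides a detector that can score webcam frames captured with OpenCV. This is a small sketch of scoring a single frame; how TADBot feeds the detected emotion into the conversation is not shown here.

```python
# Sketch of emotion detection on one webcam frame with the FER package.
import cv2
from fer import FER

detector = FER(mtcnn=True)      # MTCNN gives more accurate face detection
capture = cv2.VideoCapture(0)   # default webcam

ok, frame = capture.read()
if ok:
    emotion, score = detector.top_emotion(frame)
    print(f"Detected emotion: {emotion} ({score})")
capture.release()
```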