---
license: llama2
language:
- en
library_name: transformers
tags:
- chatgpt
- tutorbot
- physics
- code
- math
- mathematics
- llama
---
# Higgs Model Card
## Github details
Please check out the repo: https://github.com/luffycodes/Tutorbot-Spock-Phys.
## Model details
**Model type:**
Higgs is an open-source educational tutoring chatbot trained by fine-tuning the LLaMA-2-70B-chat model on synthetic student-tutorbot conversations generated using [specialized prompts](https://github.com/luffycodes/Tutorbot-Spock-Phys/tree/main/prompts/conversation_gen).
Higgs performs a code soliloquy (an inner monologue) in which it prompts itself through a series of prompts to decide whether its next response to the student requires any math calculation.
If Higgs determines that the response might require such a calculation, for example to verify a student's calculation, it outputs Python code.
It then uses the output of that Python code to frame a mathematically accurate reply to the student.
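The sketch below illustrates this code-soliloquy loop in a minimal form. It is an assumption-laden illustration, not the repository's implementation: the Hub id `luffycodes/higgs-llama-2-70b` is a placeholder, and the decision and calculation prompts are simplified stand-ins for the specialized prompts linked above.

```python
# Minimal sketch of the code-soliloquy loop described above.
# Placeholder model id and simplified prompts; see the Tutorbot-Spock-Phys repo
# for the actual prompts and pipeline.
import re
import io
import contextlib
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="luffycodes/higgs-llama-2-70b",  # placeholder Hub id
    device_map="auto",
)

def generate(prompt: str, max_new_tokens: int = 512) -> str:
    out = generator(prompt, max_new_tokens=max_new_tokens, do_sample=False)
    return out[0]["generated_text"][len(prompt):]

def run_python(code: str) -> str:
    """Execute generated Python and capture stdout (sandboxing omitted for brevity)."""
    buffer = io.StringIO()
    with contextlib.redirect_stdout(buffer):
        exec(code, {})
    return buffer.getvalue().strip()

def reply_to_student(student_message: str) -> str:
    # Step 1: inner monologue -- does the next reply need a calculation?
    decision = generate(
        f"Student: {student_message}\n"
        "Decide whether answering requires a math calculation. Answer 'yes' or 'no'."
    )
    if "yes" not in decision.lower():
        return generate(f"Student: {student_message}\nTutorbot:")

    # Step 2: ask the model to emit Python that performs (or verifies) the calculation.
    code_response = generate(
        f"Student: {student_message}\n"
        "Write Python code that performs the required calculation and prints the result."
    )
    match = re.search(r"```(?:python)?\n(.*?)```", code_response, re.DOTALL)
    code = match.group(1) if match else code_response

    # Step 3: run the code and use its printed output to frame the final reply.
    result = run_python(code)
    return generate(
        f"Student: {student_message}\n"
        f"The calculation result is: {result}\n"
        "Tutorbot:"
    )

print(reply_to_student("A ball is dropped from 20 m. How long does it take to hit the ground?"))
```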
**Model date:**
Higgs was trained between May 2023 and Sept 2023.
**Organizations developing the model:**
The Higgs (Spock) team with members from Rice University and OpenStax.
**Where to send questions or comments about the model:**
Shashank Sonkar ([email protected])
If you use this work, please cite:
Code Soliloquies for Accurate Calculations in Large Language Models
https://arxiv.org/abs/2309.12161
```
@misc{sonkar2023code,
  title={Code Soliloquies for Accurate Calculations in Large Language Models},
  author={Shashank Sonkar and MyCo Le and Xinghe Chen and Lucy Liu and Debshila Basu Mallick and Richard G. Baraniuk},
  year={2023},
  eprint={2309.12161},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```