---
license: llama2
language:
- en
library_name: transformers
tags:
- chatgpt
- tutorbot
- physics
- code
- math
- mathematics
- llama
---
# Higgs Model Card
## GitHub details
Please check out the repository: https://github.com/luffycodes/Tutorbot-Spock-Phys.
## Model details
**Model type:**
Higgs is an open-source educational tutoring chatbot trained by fine-tuning LLaMA-2-70B-chat model on synthetic student-tutorbot conversations generated using [specialized prompts](https://github.com/luffycodes/Tutorbot-Spock-Phys/tree/main/prompts/conversation_gen).
Higgs performs a code soliloquy (an inner monologue) in which it prompts itself through a series of prompts to decide whether its next response to the student requires any math calculations.
If Higgs determines that the response might require such a calculation, for example to verify a student's calculation, it outputs Python code.
It then uses the output of that Python code to frame a mathematically accurate reply to the student.
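The soliloquy loop above can be sketched as follows. This is a minimal illustration, not the actual Tutorbot-Spock pipeline: the `generate` function is a canned stand-in for the fine-tuned model, and the prompt strings and decision heuristic are assumptions made for the example.

```python
# Minimal sketch of a code soliloquy. The generate() stub stands in for the
# fine-tuned LLaMA-2 model; its prompts and canned replies are illustrative.
import io
import contextlib

def generate(prompt: str) -> str:
    """Stand-in for the fine-tuned model; returns canned responses."""
    if "need a calculation?" in prompt:
        return "Yes"
    if "Write Python code" in prompt:
        return "v = 9.8 * 3\nprint(f'{v:.1f}')"  # e.g. v = g * t
    # Final reply step: echo the computed result back to the student.
    return prompt.rsplit("RESULT:", 1)[-1].strip()

def run_code(code: str) -> str:
    """Execute the model-generated Python and capture its stdout."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})
    return buf.getvalue().strip()

def soliloquy(student_msg: str) -> str:
    """Inner monologue: decide, generate code, run it, then reply."""
    decision = generate(f"Student: {student_msg}\n"
                        "Does the next response need a calculation?")
    if decision.startswith("Yes"):
        code = generate(f"Write Python code for: {student_msg}")
        result = run_code(code)
        return generate(f"Student: {student_msg}\nRESULT: {result}")
    return generate(f"Student: {student_msg}")

print(soliloquy("What speed does a ball reach after falling for 3 s?"))
```

The key design point, as described above, is that the numeric answer in the final reply comes from executed Python rather than from the language model's own arithmetic.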
**Model date:**
Higgs was trained between May 2023 and Sept 2023.
**Organizations developing the model:**
The Higgs (Spock) team with members from Rice University and OpenStax.
**Where to send questions or comments about the model:**
Shashank Sonkar ([email protected])
If you use this work, please cite:
Code Soliloquies for Accurate Calculations in Large Language Models
https://arxiv.org/abs/2309.12161
```
@misc{sonkar2023code,
  title={Code Soliloquies for Accurate Calculations in Large Language Models},
  author={Shashank Sonkar and MyCo Le and Xinghe Chen and Lucy Liu and Debshila Basu Mallick and Richard G. Baraniuk},
  year={2023},
  eprint={2309.12161},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```