---
library_name: peft
license: gpl-3.0
---
## MATHWELL
MATHWELL is the model released in the paper [MATHWELL: Generating Educational Math Word Problems at Scale](https://arxiv.org/abs/2402.15861).
MATHWELL is a finetuned Llama-2 (70B) model that generates customized educational grade-school math word problems together with Python function solutions to those problems. Generated problems are intended to be 1) solvable, 2) accurate, and 3) appropriate, criteria that are essential for successfully supplementing grade-school students' math education. On average, 74% of MATHWELL's problems with executable solutions meet all three criteria.
For more details on how MATHWELL was trained and evaluated, please see our paper. Our [repo](https://github.com/bryanchrist/MATHWELL) contains a sample script for loading and interacting with MATHWELL.
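As a quick orientation only (the repo's sample script is the authoritative reference), below is a minimal sketch of loading the adapter with `peft` and `transformers` in 8-bit and generating a problem. The adapter id `bryanchrist/MATHWELL` and the prompt wording are assumptions, not taken from this card.

```python
# Minimal loading/inference sketch, assuming the adapter id "bryanchrist/MATHWELL"
# and an illustrative prompt; see the repo's sample script for the exact usage.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_id = "meta-llama/Llama-2-70b-hf"  # base model MATHWELL was finetuned from
adapter_id = "bryanchrist/MATHWELL"    # assumed Hugging Face Hub id for this adapter

# Load the Llama-2 base in 8-bit, matching the quantization used during training.
tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
)

# Attach the MATHWELL PEFT adapter on top of the quantized base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

# Illustrative prompt (not necessarily the exact format used in the paper).
prompt = (
    "Write a grade school math word problem about soccer and a Python function "
    "with a commented out step-by-step solution to solve the word problem."
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```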
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
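
For reference, the settings above map directly onto a `transformers.BitsAndBytesConfig`. A sketch of the equivalent object is below; the `bnb_4bit_*` fields are inactive here because 8-bit loading is enabled, and `quant_method` is set internally rather than passed as an argument.

```python
# Sketch of the quantization config above as a transformers.BitsAndBytesConfig.
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,
    load_in_4bit=False,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="fp4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float32,
)
```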
### Framework versions
- PEFT 0.6.0.dev0
## Citation
```bibtex
@misc{christ2024mathwell,
title={MATHWELL: Generating Educational Math Word Problems at Scale},
author={Bryan R Christ and Jonathan Kropko and Thomas Hartvigsen},
year={2024},
eprint={2402.15861},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```