---
library_name: peft
license: gpl-3.0
---
## MATHWELL
MATHWELL is the model released in the paper [MATHWELL: Generating Educational Math Word Problems Using Teacher Annotations](https://arxiv.org/abs/2402.15861). 
MATHWELL is a finetuned Llama-2 (70B) model that generates customized, educational grade-school math word problems along with Python function solutions to those problems. Generated problems are 1) solvable, 2) accurate, and 3) appropriate. These criteria are essential for successfully supplementing grade-school students' math education. On average, 74% of MATHWELL's problems with executable solutions are solvable, accurate, and appropriate.

For more details on how MATHWELL was trained and evaluated, please see our [paper](https://arxiv.org/abs/2402.15861). Our [repo](https://github.com/bryanchrist/MATHWELL) contains a sample script for loading and interacting with MATHWELL.
## Training procedure


The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
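
As a rough sketch, the 8-bit settings above map onto a `BitsAndBytesConfig` that can be passed when loading the base model before attaching the PEFT adapter. The repo ids below are assumptions (check this model card's repo id and the gated Llama-2 base weights), and actually running this requires a GPU large enough for a 70B model in 8-bit:

```python
# Sketch only: recreate the quantization config listed above and attach the
# MATHWELL PEFT adapter to its Llama-2 (70B) base. Repo ids are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,                      # load_in_8bit: True
    llm_int8_threshold=6.0,                 # llm_int8_threshold: 6.0
    llm_int8_skip_modules=None,             # llm_int8_skip_modules: None
    llm_int8_enable_fp32_cpu_offload=False, # llm_int8_enable_fp32_cpu_offload: False
    llm_int8_has_fp16_weight=False,         # llm_int8_has_fp16_weight: False
)

base_id = "meta-llama/Llama-2-70b-hf"  # assumed base model id (gated)
adapter_id = "bryanchrist/MATHWELL"    # assumed adapter repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, adapter_id)
```

The 4-bit fields in the config (`bnb_4bit_*`) are inactive here since `load_in_4bit` was `False` during training; they are the library defaults recorded alongside the 8-bit settings.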
### Framework versions


- PEFT 0.6.0.dev0

## Citation
```bibtex
@inproceedings{christ_mathwell_2024,
	title = {{MATHWELL}: {Generating} {Educational} {Math} {Word} {Problems} {Using} {Teacher} {Annotations}},
	url = {https://openreview.net/forum?id=jNsjlRfpk0},
	booktitle = {The 2024 {Conference} on {Empirical} {Methods} in {Natural} {Language} {Processing}},
	author = {Christ, Bryan R. and Kropko, Jonathan and Hartvigsen, Thomas},
	year = {2024},
}
```