---
library_name: peft
license: gpl-3.0
---

# MATHWELL

MATHWELL is the model released in the paper *MATHWELL: Generating Educational Math Word Problems Using Teacher Annotations*. MATHWELL is a fine-tuned Llama-2 (70B) model that generates customized educational grade school math word problems together with Python function solutions to those problems. Generated problems must be (1) solvable, (2) accurate, and (3) appropriate, criteria that are essential to successfully supplement grade-school students’ math education. On average, 74% of MATHWELL's problems with executable solutions meet all three criteria.

For more details on how MATHWELL was trained and evaluated, please see our paper. Our repo contains a sample script for loading and interacting with MATHWELL.
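
As a quick start, the sketch below shows one way to load the adapter on top of the Llama-2 (70B) base model with 8-bit quantization and generate a problem. The repository IDs and the prompt are illustrative assumptions; see the sample script in our repo for the exact loading code and prompt format.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

# Assumed repository IDs -- substitute the actual base model and adapter paths you use.
base_model_id = "meta-llama/Llama-2-70b-hf"
adapter_id = "bryanchrist/MATHWELL"

# 8-bit loading, matching the training-time bitsandbytes config listed below.
bnb_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, adapter_id)

# Illustrative prompt; the repo's sample script shows the exact prompt MATHWELL expects.
prompt = (
    "Write a grade school math word problem about soccer and a Python function "
    "with a commented out step-by-step solution to solve the word problem."
)
inputs = tokenizer(prompt, return_tensors="pt").to(base_model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```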

## Training procedure

The following bitsandbytes quantization config was used during training:

- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
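
For reference, the settings listed above correspond to the following transformers `BitsAndBytesConfig` (a sketch; the 4-bit fields are shown at their defaults because 4-bit loading was disabled):

```python
import torch
from transformers import BitsAndBytesConfig

# Reconstruction of the quantization settings listed above.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,
    load_in_4bit=False,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="fp4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float32,
)
```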

### Framework versions

- PEFT 0.6.0.dev0

## Citation

@inproceedings{christ_mathwell_2024,
    title = {{MATHWELL}: {Generating} {Educational} {Math} {Word} {Problems} {Using} {Teacher} {Annotations}},
    url = {https://openreview.net/forum?id=jNsjlRfpk0},
    booktitle = {The 2024 {Conference} on {Empirical} {Methods} in {Natural} {Language} {Processing}},
    author = {Christ, Bryan R. and Kropko, Jonathan and Hartvigsen, Thomas},
    year = {2024},
}