|
---
license: openrail
library_name: transformers
datasets:
- nuprl/MultiPL-T
---
|
|
|
# MultiPL-T StarCoder2-15b
|
|
|
|
|
This repository holds several fine-tunes of [StarCoder2-15b](https://huggingface.co/bigcode/starcoder2-15b), all trained on MultiPL-T data.
|
Examine the commit message to determine the language and checkpoint. We have a checkpoint for each epoch.
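
One way to browse the checkpoints is to list the repository's commits with `huggingface_hub`. This is a minimal sketch; the repository id below is a placeholder for this repository's actual id:

```
# Sketch: list commits so you can pick a language/epoch checkpoint from its
# commit message. REPO_ID is a placeholder for this repository's actual id.
from huggingface_hub import list_repo_commits

REPO_ID = "nuprl/MultiPL-T-StarCoder2-15b"  # placeholder

for commit in list_repo_commits(REPO_ID):
    # The commit title names the language and epoch of that checkpoint.
    print(commit.commit_id, commit.title)
```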
|
|
|
|
|
For more information about the training process, see the MultiPL-T paper:
|
|
|
```
@misc{cassano:multipl-t,
  title={Knowledge Transfer from High-Resource to Low-Resource Programming Languages for Code LLMs},
  author={Federico Cassano and John Gouwar and Francesca Lucchetti and Claire Schlesinger and Anders Freeman and Carolyn Jane Anderson and Molly Q Feldman and Michael Greenberg and Abhinav Jangda and Arjun Guha},
  year={2024},
  eprint={2308.09895},
  archivePrefix={arXiv},
  primaryClass={cs.PL}
}
```
|
|
|
For usage instructions, see the model card for the original model. Replace the model name with the name of this repository, and set `revision=COMMIT_HASH`.
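
As a concrete sketch using `transformers` (the repository id and commit hash below are placeholders):

```
# Sketch: load a specific fine-tune checkpoint by pinning its commit hash.
# REPO_ID and COMMIT_HASH are placeholders; fill in this repository's id and
# the hash of the checkpoint you selected from the commit history.
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO_ID = "nuprl/MultiPL-T-StarCoder2-15b"  # placeholder
COMMIT_HASH = "..."  # commit hash of the chosen language/epoch checkpoint

tokenizer = AutoTokenizer.from_pretrained(REPO_ID, revision=COMMIT_HASH)
model = AutoModelForCausalLM.from_pretrained(REPO_ID, revision=COMMIT_HASH)
```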
|
|