ReGPT-125M-200G

This model is based on GPT-Neo-125M and was trained with Mengzi Retrieval LM.

For more details, please refer to this document.

How to use

You must use a forked version of transformers: https://github.com/Langboat/transformers

from transformers import Re_gptForCausalLM

# Load the retrieval-augmented model from the Hugging Face Hub
# (requires the forked transformers linked above)
model = Re_gptForCausalLM.from_pretrained('Langboat/ReGPT-125M-200G')