arXiv:2204.06745

GPT-NeoX-20B: An Open-Source Autoregressive Language Model

Published on Apr 14, 2022

Abstract

We introduce GPT-NeoX-20B, a 20 billion parameter autoregressive language model trained on the Pile, whose weights will be made freely and openly available to the public through a permissive license. It is, to the best of our knowledge, the largest dense autoregressive model that has publicly available weights at the time of submission. In this work, we describe GPT-NeoX-20B's architecture and training and evaluate its performance on a range of language-understanding, mathematics, and knowledge-based tasks. We find that GPT-NeoX-20B is a particularly powerful few-shot reasoner and gains far more in performance when evaluated five-shot than similarly sized GPT-3 and FairSeq models. We open-source the training and evaluation code, as well as the model weights, at https://github.com/EleutherAI/gpt-neox.
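Since the weights are openly released, they can be loaded with standard tooling. The sketch below shows one way to run a five-shot prompt against the model, assuming the weights are published on the Hugging Face Hub under the `EleutherAI/gpt-neox-20b` model ID and that the `transformers` and `accelerate` libraries are installed; the example questions, hardware settings, and generation parameters are illustrative and are not taken from the paper.

```python
# Minimal sketch: load GPT-NeoX-20B and run a five-shot prompt.
# Assumes the released weights live at the "EleutherAI/gpt-neox-20b" Hub ID
# and that enough GPU/CPU memory is available for a 20B-parameter model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-neox-20b"  # assumed Hub ID for the released weights

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce the memory footprint
    device_map="auto",          # let accelerate place layers on available devices
)

# Five-shot prompt: five worked examples followed by the query to complete.
examples = [
    ("Q: What is the capital of France?", "A: Paris"),
    ("Q: What is 7 * 8?", "A: 56"),
    ("Q: Who wrote 'Hamlet'?", "A: William Shakespeare"),
    ("Q: What is the chemical symbol for gold?", "A: Au"),
    ("Q: How many continents are there?", "A: Seven"),
]
prompt = "\n".join(f"{q}\n{a}" for q, a in examples)
prompt += "\nQ: What is the largest planet in the Solar System?\nA:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=10, do_sample=False)

# Print only the newly generated tokens, not the echoed prompt.
new_tokens = output[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```

Greedy decoding (`do_sample=False`) is used here only to keep the example deterministic; the paper's five-shot evaluations are run through the open-sourced evaluation code rather than ad-hoc prompting like this.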

Models citing this paper: 57

Datasets citing this paper: 0

Spaces citing this paper: 237

Collections including this paper: 2