
goldfish-gpt2-japanese-1000mb-ud-causal

Model Description

This is a GPT-2 model for POS-tagging and dependency parsing, derived from jpn_jpan_1000mb and refined on UD_Japanese-GSDLUW.

How to Use

from transformers import pipeline
# the "universal-dependencies" task is defined by custom code shipped with the model, hence trust_remote_code=True
nlp = pipeline("universal-dependencies", "KoichiYasuoka/goldfish-gpt2-japanese-1000mb-ud-causal", trust_remote_code=True)
# the example sentence means "Illustrations are used in elementary-school Japanese-language textbooks across all grades"
print(nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている"))
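
The parse can also be post-processed programmatically. A minimal sketch, assuming the pipeline returns its analysis as a CoNLL-U formatted string (an assumption here, not confirmed by this card), using the third-party conllu package to read it into token lists:

from transformers import pipeline
import conllu  # third-party CoNLL-U reader: pip install conllu
nlp = pipeline("universal-dependencies", "KoichiYasuoka/goldfish-gpt2-japanese-1000mb-ud-causal", trust_remote_code=True)
# assumption: nlp() returns a CoNLL-U formatted string
result = nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている")
for sentence in conllu.parse(result):
    for token in sentence:
        # surface form, universal POS tag, head index, dependency relation
        print(token["form"], token["upos"], token["head"], token["deprel"])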
