---
language:
- ja
library_name: transformers
tags:
- conversational
- ja
- japanese
- gpt2
- text-generation
- lm
- nlp
---
|
|
|
|
|
# Japanese DialoGPT trained on Aozora Bunko
|
|
|
**(ja) 青空文庫のセリフで学習した日本語のDialoGPT Smallです** |
|
**(en) Japanese DialoGPT Small trained on Aozora Bunko.** |
|
|
|
## [Demo](https://huggingface.co/spaces/akiFQC/Japanese_DialoGPT_small_Aozora) |
|
The demo widget on this page does not work very well. I recommend trying the [Hugging Face Spaces version](https://huggingface.co/spaces/akiFQC/Japanese_DialoGPT_small_Aozora) instead.
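You can also load the model directly with the `transformers` library. The snippet below is a minimal sketch: the repository id (`akiFQC/japanese-dialogpt-small-aozora`) and the DialoGPT-style turn format (a user turn followed by the EOS token) are assumptions, so adjust them to match the actual model repository and training setup. The sampling parameters are illustrative defaults, not tuned values.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# NOTE: assumed repository id; change it if the actual model repo differs.
model_id = "akiFQC/japanese-dialogpt-small-aozora"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# DialoGPT-style input: the user utterance followed by the EOS token (assumed format).
prompt = "こんにちは。"
inputs = tokenizer(prompt + tokenizer.eos_token, return_tensors="pt")

output_ids = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode only the newly generated tokens, i.e. the model's response turn.
response = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)
print(response)
```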
|
|
|
|
|
## Reference |
|
- [Aozora-bunko](https://www.aozora.gr.jp/) |
|
- Japanese public domain books. |
|
  - I extracted the dialogue passages from the books and used them as the training data (a rough extraction sketch is given after this list).
|
- [japanese-gpt2-small](https://huggingface.co/rinna/japanese-gpt2-small) |
|
  - A pre-trained Japanese GPT-2 model. I used the small variant because of the limited GPU memory of my desktop PC (a single RTX 3060) 😢.
|
  - I used this model as the pre-trained base for fine-tuning.
|
- [DialoGPT: Large-Scale Generative Pre-training for Conversational Response Generation](https://arxiv.org/abs/1911.00536) |
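
For reference, here is a rough sketch of how dialogue lines could be pulled out of an Aozora Bunko plain-text file. This regex-based approach is an illustrative assumption, not the exact extraction script used for training.

```python
import re

def extract_dialogue_pairs(text: str):
    """Collect utterances written in 「…」 brackets from an Aozora Bunko plain text
    and pair consecutive utterances as (context, response) training examples."""
    # Strip common Aozora markup: ruby readings 《…》 and editorial notes ［＃…］.
    cleaned = re.sub(r"《[^》]*》|［＃[^］]*］", "", text)
    utterances = re.findall(r"「([^」]+)」", cleaned)
    return list(zip(utterances, utterances[1:]))
```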
|
|