---
language:
- java
- code
license: apache-2.0
datasets:
- code_search_net
widget:
- text: >-
    public <mask> isOdd(Integer num){if (num % 2 == 0) {return "even";} else
    {return "odd";}}
---
# JavaRoBERTa-Tara

A RoBERTa model pretrained on Java source code from the code_search_net dataset.
## Training Data
The model was trained on 10,223,695 Java files retrieved from open source projects on GitHub.
## Training Objective

An MLM (Masked Language Modeling) objective was used to train this model.
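As a rough illustration of the objective (not the exact training setup used for this checkpoint), the sketch below shows how masked-language-modeling batches are typically built with the `transformers` data collator. The 15% masking rate is the standard RoBERTa default and is assumed here, not reported by this card.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, DataCollatorForLanguageModeling

# Load the published checkpoint and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained('emre/java-RoBERTa-Tara-small')
model = AutoModelForMaskedLM.from_pretrained('emre/java-RoBERTa-Tara-small')

# MLM objective: randomly mask a fraction of tokens and train the model to
# recover the originals. mlm_probability=0.15 is an assumed default, not a
# value documented for this model.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

batch = collator([tokenizer('public int add(int a, int b) { return a + b; }')])
print(batch['labels'])  # -100 everywhere except the masked positions the model must predict
```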
## Usage
```python
from transformers import pipeline

pipe = pipeline('fill-mask', model='emre/java-RoBERTa-Tara-small')
output = pipe(CODE)  # Replace CODE with Java code; use '<mask>' to mask tokens/words in the code.
```
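For a concrete end-to-end run, the snippet below feeds the widget example from the card header into the pipeline and prints the candidates the model proposes for the masked return type (the actual predictions depend on the checkpoint):

```python
from transformers import pipeline

pipe = pipeline('fill-mask', model='emre/java-RoBERTa-Tara-small')

# The widget example: the method's return type is masked.
code = 'public <mask> isOdd(Integer num){if (num % 2 == 0) {return "even";} else {return "odd";}}'

for prediction in pipe(code):
    # Each prediction is a dict with the proposed token and its score.
    print(prediction['token_str'], round(prediction['score'], 4))
```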
## Why Tara?

She is the name of my little baby girl :)