julien-c (HF staff) committed
Commit 579436a
1 Parent(s): e5199e2

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/activebus/BERT-PT_rest/README.md

Files changed (1):
  1. README.md +42 -0
 
# ReviewBERT

BERT (post-)trained from a review corpus to understand sentiment, opinions and various e-commerce aspects.

`BERT-DK_rest` is trained on 1 GB (19 types) of restaurant reviews from Yelp.
`BERT-PT_*` additionally uses SQuAD 1.1.

## Model Description

The original model is `BERT-base-uncased`, trained on Wikipedia + BookCorpus.
Models are post-trained on the [Amazon Dataset](http://jmcauley.ucsd.edu/data/amazon/) and the [Yelp Dataset](https://www.yelp.com/dataset/challenge/).
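
"Post-training" here means continuing BERT pretraining on in-domain review text before any task fine-tuning. The snippet below is only a rough sketch of that idea using masked language modeling with the `transformers` Trainer; it is not the authors' exact recipe (the paper also uses next-sentence prediction and, for `BERT-PT_*`, SQuAD 1.1), and the corpus file name and hyperparameters are placeholders.

```python
# Rough sketch of domain post-training via masked language modeling (MLM).
# NOTE: not the authors' exact procedure; the corpus file and hyperparameters
# below are placeholders chosen for illustration only.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical plain-text corpus of restaurant reviews, one review per line.
reviews = load_dataset("text", data_files={"train": "restaurant_reviews.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

tokenized = reviews.map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens, the standard BERT MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-post-trained-rest", num_train_epochs=1),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```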

## Instructions

Loading the post-trained weights is as simple as, e.g.,

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("activebus/BERT-PT_rest")
model = AutoModel.from_pretrained("activebus/BERT-PT_rest")
```
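
As a quick sanity check (a minimal sketch on recent `transformers` versions; the example review is made up), you can encode a sentence and inspect the hidden states that downstream aspect-based sentiment or review reading-comprehension heads would consume:

```python
# Encode an example restaurant review and inspect the token-level features.
inputs = tokenizer("The pasta was great but the service was slow.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Shape: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```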

## Evaluation Results

See our [NAACL paper](https://www.aclweb.org/anthology/N19-1242.pdf) for detailed results.

## Citation

If you find this work useful, please cite as follows.
```bibtex
@inproceedings{xu_bert2019,
    title = "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis",
    author = "Xu, Hu and Liu, Bing and Shu, Lei and Yu, Philip S.",
    booktitle = "Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics",
    month = "jun",
    year = "2019",
}
```