Text Classification · Transformers · Safetensors · English · HHEMv2Config · custom_code
simonhughes22 committed afedd94 (1 parent: c27faa6)

Update README.md

Files changed (1):
  1. README.md +32 -0
README.md CHANGED
@@ -19,6 +19,7 @@ widget:
    example_title: "Positive"
  - text: "A boy is jumping on skateboard in the middle of a red bridge. [SEP] The boy skates down the sidewalk on a blue bridge"
    example_title: "Negative"
+
---
# Cross-Encoder for Hallucination Detection
This model was trained using the [SentenceTransformers](https://sbert.net) [Cross-Encoder](https://www.sbert.net/examples/applications/cross-encoder/README.html) class.
@@ -48,6 +49,7 @@ etc. See examples below for expected probability scores.

## Usage with Sentence Transformers (Recommended)

+ ### Inference
The model can be used like this, on pairs of documents, passed as a list of lists of strings (```List[List[str]]```):

```python
@@ -73,6 +75,36 @@ array([0.61051559, 0.00047493709, 0.99639291, 0.00021221573, 0.99599433, 0.00141
Note that the model is designed to work with entire documents, so long as they fit into the 512-token context window (across both documents).
Also note that the order of the documents is important: the first document is the source document, and the second document is validated against the first for factual consistency, e.g. as a summary of the first or a claim drawn from the source.

+ ### Training
+
+ ```python
+ import math
+
+ from torch.utils.data import DataLoader
+ from sentence_transformers import InputExample
+ from sentence_transformers.cross_encoder import CrossEncoder
+ from sentence_transformers.cross_encoder.evaluation import CEBinaryClassificationEvaluator
+
+ num_epochs = 5
+ train_batch_size = 32  # illustrative; tune to your hardware
+ model_save_path = "./model_dump"
+
+ # The cross-encoder being fine-tuned; the base checkpoint named here is illustrative,
+ # not necessarily the one this model was trained from.
+ model = CrossEncoder("cross-encoder/nli-deberta-v3-base", num_labels=1)
+
+ # Load the training and test examples from pandas dataframes (df_train / df_test)
+ # with 'source', 'summary' and 'label' columns:
+ train_examples, test_examples = [], []
+ for i, row in df_train.iterrows():
+     train_examples.append(InputExample(texts=[row['source'], row['summary']], label=int(row['label'])))
+
+ for i, row in df_test.iterrows():
+     test_examples.append(InputExample(texts=[row['source'], row['summary']], label=int(row['label'])))
+ test_evaluator = CEBinaryClassificationEvaluator.from_input_examples(test_examples, name='test_eval')
+
+ # Then train the model via the Cross-Encoder fit API:
+ train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=train_batch_size)
+ warmup_steps = math.ceil(len(train_dataloader) * num_epochs * 0.1)  # 10% of train data for warm-up
+ model.fit(train_dataloader=train_dataloader,
+           evaluator=test_evaluator,
+           epochs=num_epochs,
+           evaluation_steps=10_000,
+           warmup_steps=warmup_steps,
+           output_path=model_save_path,
+           show_progress_bar=True)
+ ```
+
## Usage with Transformers AutoModel
You can also use the model directly with the Transformers library (without the SentenceTransformers library):
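The code snippets themselves fall outside the hunks shown in this diff. For orientation, the sketch below illustrates the inference path behind the `### Inference` heading added above, using the standard SentenceTransformers `CrossEncoder` API; the checkpoint id and the second example pair are assumptions, not content from this commit.

```python
# Illustrative sketch, not part of this commit.
from sentence_transformers import CrossEncoder

# Assumed checkpoint id for the published model.
model = CrossEncoder("vectara/hallucination_evaluation_model")

# Each pair is [source_document, claim_or_summary]; as noted in the hunk above, order matters.
pairs = [
    # Inconsistent pair, taken from the widget example in the diff:
    ["A boy is jumping on skateboard in the middle of a red bridge.",
     "The boy skates down the sidewalk on a blue bridge"],
    # Made-up consistent pair:
    ["The meeting was moved to Tuesday at 10am.",
     "The meeting now takes place on Tuesday morning."],
]

scores = model.predict(pairs)  # numpy array of factual-consistency probabilities, one per pair
print(scores)
```

Low scores indicate the second document is likely hallucinated relative to the first; high scores indicate it is supported by the source.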
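Similarly, a rough sketch of the AutoModel path referenced in the final context line, assuming a single-logit sequence-classification head with a sigmoid activation (the model card's own snippet is authoritative; later revisions of this repo ship custom model code, hence `trust_remote_code=True`):

```python
# Illustrative sketch, not part of this commit; the head and activation are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "vectara/hallucination_evaluation_model"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, trust_remote_code=True)
model.eval()

sources = ["A boy is jumping on skateboard in the middle of a red bridge."]
claims = ["The boy skates down the sidewalk on a blue bridge"]

# Encode each (source, claim) pair jointly, as the cross-encoder expects.
inputs = tokenizer(sources, claims, return_tensors="pt", padding=True, truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits        # shape: (batch_size, 1) for a single-label head
scores = torch.sigmoid(logits).squeeze(-1)  # probability that the claim is consistent with the source
print(scores.tolist())
```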