nguyenthaibinh committed on
Commit
4efb231
1 Parent(s): 5419c30

remove redundant files

Files changed (2)
  1. README.md +0 -45
  2. model_descriptions.json +0 -6
README.md DELETED
@@ -1,45 +0,0 @@
- ---
- library_name: light-embed
- pipeline_tag: sentence-similarity
- tags:
- - sentence-transformers
- - feature-extraction
- - sentence-similarity
-
- ---
-
- # sbert-all-MiniLM-L12-v2-onnx
-
- This is the ONNX version of the Sentence Transformers model sentence-transformers/all-MiniLM-L12-v2 for sentence embedding, optimized for speed and lightweight performance. By utilizing onnxruntime and tokenizers instead of heavier libraries like sentence-transformers and transformers, this version ensures a smaller library size and faster execution. Below are the details of the model:
- - Base model: sentence-transformers/all-MiniLM-L12-v2
- - Embedding dimension: 384
- - Max sequence length: 128
- - File size on disk: 0.12 GB
-
- This ONNX model consists all components in the original sentence transformer model:
- Transformer, Pooling, Normalize
-
- <!--- Describe your model here -->
-
- ## Usage (LightEmbed)
-
- Using this model becomes easy when you have [LightEmbed](https://www.light-embed.net) installed:
-
- ```
- pip install -U light-embed
- ```
-
- Then you can use the model like this:
-
- ```python
- from light_embed import TextEmbedding
- sentences = ["This is an example sentence", "Each sentence is converted"]
-
- model = TextEmbedding('sentence-transformers/all-MiniLM-L12-v2')
- embeddings = model.encode(sentences)
- print(embeddings)
- ```
-
- ## Citing & Authors
-
- Binh Nguyen / [email protected]
model_descriptions.json DELETED
@@ -1,6 +0,0 @@
- {
- "base_model": "sentence-transformers/all-MiniLM-L12-v2",
- "embedding_dim": 384,
- "max_seq_length": 128,
- "model_file_size (GB)": 0.12
- }