Xenova (HF staff) committed
Commit 5030dcb
1 parent: b78e63e

Add transformers.js example code


[Transformers.js](http://huggingface.co/docs/transformers.js) is a JavaScript library for running `transformers` models directly in the browser (or server-side with Node.js or Deno). This sample code produces the exact same result as the Python version.

NOTE: Requires https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5/discussions/4 to be merged first (adds ONNX weights).
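For reference, "the Python version" is the sentence-transformers snippet already in the README. Only the `model.encode(sentences)` and `print(cos_sim(...))` lines are visible in the diff context below, so the rest of this sketch is an assumption based on the standard sentence-transformers API (including the `trust_remote_code=True` flag, which GTE models shipping custom modeling code typically need):

```python
# Minimal sketch of the Python version the JS snippet mirrors.
# Assumed: the imports, the example sentences, and trust_remote_code=True;
# model.encode(...) and cos_sim(...) come from the README's context lines.
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

sentences = ["what is the capital of China?", "Beijing"]

model = SentenceTransformer("Alibaba-NLP/gte-base-en-v1.5", trust_remote_code=True)
embeddings = model.encode(sentences)
print(cos_sim(embeddings[0], embeddings[1]))
```

Since the JS pipeline is called with `normalize: true` and `pooling: 'cls'`, each `dot(...)` it computes is just a cosine similarity scaled by 100.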

Files changed (1): README.md (+26, -0)
README.md CHANGED
@@ -4,6 +4,7 @@ tags:
  - sentence-transformers
  - gte
  - mteb
+ - transformers.js
  license: apache-2.0
  language:
  - en
@@ -2684,6 +2685,31 @@ embeddings = model.encode(sentences)
  print(cos_sim(embeddings[0], embeddings[1]))
  ```

+ Use with `transformers.js`:
+
+ ```js
+ import { pipeline, dot } from '@xenova/transformers';
+
+ // Create feature extraction pipeline
+ const extractor = await pipeline('feature-extraction', 'Alibaba-NLP/gte-base-en-v1.5', {
+     quantized: false, // Comment out this line to use the quantized version
+ });
+
+ // Generate sentence embeddings
+ const sentences = [
+     "what is the capital of China?",
+     "how to implement quick sort in python?",
+     "Beijing",
+     "sorting algorithms",
+ ];
+ const output = await extractor(sentences, { normalize: true, pooling: 'cls' });
+
+ // Compute similarity scores
+ const [source_embeddings, ...document_embeddings] = output.tolist();
+ const similarities = document_embeddings.map(x => 100 * dot(source_embeddings, x));
+ console.log(similarities); // [34.504930869007296, 64.03973265120138, 19.520042686034362]
+ ```
+
  ## Training Details

  ### Training Data
  ### Training Data