Phi models - are they available via CDN at all?

#1
by r0-0rd - opened

Sorry if this is a silly question. I've tried to have a dig, but I'm not sure whether my code just isn't working or whether I've misunderstood how the models are listed.

The Phi models seem incredible as free, lightweight models that run in the browser. I can see on the

I also had a look on the Xenova CDN at https://cdn.jsdelivr.net/npm/@xenova/[email protected]/dist/transformers.js and found specific mentions of Phi:

/* harmony export */   "PhiForCausalLM": () => (/* binding */ PhiForCausalLM),
/* harmony export */   "PhiModel": () => (/* binding */ PhiModel),
/* harmony export */   "PhiPreTrainedModel": () => (/* binding */ PhiPreTrainedModel),

I've been trying to use them with a vanilla script import from the CDN, but I keep hitting "not found" or "model not supported" errors, and I don't know whether:

  1. My code is just bad, or
  2. They're not supported on the CDN yet

Any pointers would be amazing!

My current code is:

import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@latest';

async function runInference() {
  const inputText = document.getElementById('input-text').value;
  const outputElement = document.getElementById('output');

  // Clear previous output
  outputElement.textContent = 'Loading model and running inference...';

  try {
    // Use the pipeline function to load the Phi model
    const generator = await pipeline('text-generation', '{model location}');
    const output = await generator(inputText);
    outputElement.textContent = output[0].generated_text;
  } catch (error) {
    outputElement.textContent = `Error: ${error.message}`;
  }
}

I've tried the following for the model location value (verbatim). Occasionally I've had 404 Not Found responses, occasionally "not authorised" responses, but the most promising error I've got seems to be the "model not supported" one, because that suggests the script has some idea of what I'm asking for:

  • onnx-community/Phi-3.5-mini-instruct-onnx-web
  • Xenova/Phi-3-mini-4k-instruct
  • microsoft/Phi-3-mini-4k-instruct-onnx-web
  • Xenova/tiny-random-PhiForCausalLM
  • Xenova/phi-1_5_dev
  • BricksDisplay/phi-1_5
  • BricksDisplay/phi-1_5-q4
  • BricksDisplay/phi-1_5-bnb4
  • Xenova/Phi-3-mini-4k-instruct_fp16
  • Xenova/tiny-random-LlavaForConditionalGeneration_phi

Thanks very much in advance for your help!

That's because you're using the V2 version of Transformers.js instead of the V3 version.
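One quick way to confirm this diagnosis is to check which version the `@latest` CDN import actually resolves to. A sketch, assuming the library's `env.version` field (which the v2 line exposes):

```javascript
// Sketch (browser-only): report which Transformers.js version the CDN
// import resolves to. `env.version` is the library's version string.
async function checkTransformersVersion() {
  const { env } = await import('https://cdn.jsdelivr.net/npm/@xenova/transformers@latest');
  return env.version; // a 2.x release here would explain missing Phi support
}
```

Logging the returned value in the browser console shows whether the script that produced the "model not supported" errors was still on the 2.x line.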

Thanks for coming back @BoscoTheDog , appreciate it! That's the context I was missing! It looks like there isn't a Xenova CDN-hosted version of the library that is >= 3, so it's a case of either waiting for that or switching to the main Hugging Face library and installing it myself with npm.

Thanks for your response - saved me ages of barking up the wrong tree

r0-0rd changed discussion status to closed

It is on CDN actually:

https://cdn.jsdelivr.net/npm/@huggingface/[email protected]

You can also build it yourself with:

git clone -b v3 https://github.com/xenova/transformers.js.git; cd transformers.js; npm i; npm run build

And then it's in the dist folder.

Also, switch to using the onnx-community models, not the webml-community ones.
