Unable to load the model with onnx runtime

#1
by ti3x-m - opened

Error: Uncaught 4024042896

transformers.js version: 3.3.1
https://cdn.jsdelivr.net/npm/@huggingface/[email protected]/dist/ort-wasm-simd-threaded.jsep.mjs

onnx runtime web version: 1.21.0-dev.20250109-3328eb3bb3

Code:

const modelUrl = "http://localhost:8111/OuteTTS-0.2-500M/onnx/model.onnx"; //any direct URL path to model.onnx file
const session = await ort.InferenceSession.create(modelUrl);
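As a debugging aid (a sketch, not a confirmed fix): bare numeric throws like `Uncaught 4024042896` typically come from the WASM side of onnxruntime-web, where a failed fetch or an out-of-memory condition surfaces as a raw number. Pre-fetching the model bytes yourself makes HTTP problems (404, CORS, wrong path) fail with a readable error before ONNX Runtime ever sees the data. The `loadSession` helper name is mine; `InferenceSession.create` accepting a `Uint8Array` is part of the onnxruntime-web API.

```javascript
// Sketch: fetch the model explicitly so network errors are readable,
// then hand the raw bytes to onnxruntime-web.
async function loadSession(ort, modelUrl, options = { executionProviders: ["wasm"] }) {
  const response = await fetch(modelUrl);
  if (!response.ok) {
    // Surfaces 404/CORS problems as a clear error instead of a WASM-side number.
    throw new Error(`Model fetch failed: ${response.status} ${response.statusText}`);
  }
  const modelBytes = new Uint8Array(await response.arrayBuffer());
  // InferenceSession.create accepts a Uint8Array of model bytes as well as a URL.
  return ort.InferenceSession.create(modelBytes, options);
}
```

If the fetch succeeds but `create` still throws a bare number, the failure is inside the runtime itself (e.g. memory limits), not the transport.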

The error reproduces when running a Rollup-bundled build with the example code from https://huggingface.co/onnx-community/OuteTTS-0.2-500M.

Rollup bundle config from the GitHub repo:

import json from "@rollup/plugin-json";

export default {
  input: "outetts.js/index.js",
  output: {
    file: "dist/bundle.js",
    format: "es", 
    sourcemap: true, 
  },
  plugins: [
    json()
  ],
};
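One possibility (an assumption on my part, not confirmed in this thread) is that the Rollup bundle cannot locate the runtime's `.wasm`/`.mjs` files, since Rollup does not copy them into `dist/` by default. transformers.js exposes `env.backends.onnx.wasm.wasmPaths` to point the ONNX Runtime backend at an explicit location, e.g. the CDN copies:

```javascript
import { env } from "@huggingface/transformers";

// Tell the ONNX Runtime WASM backend where to load its .wasm/.mjs files
// from, since the bundler does not ship them alongside bundle.js.
env.backends.onnx.wasm.wasmPaths =
  "https://cdn.jsdelivr.net/npm/@huggingface/[email protected]/dist/";
```

This is configuration only; whether it addresses this particular numeric error is untested.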
ONNX Community org

Hi there! Can you try with v3.3.3? We fixed some bundling issues.

@Xenova Thanks for the response. I tried importing { Tensor, AutoTokenizer, AutoModelForCausalLM, PreTrainedModel } from 'https://cdn.jsdelivr.net/npm/@huggingface/[email protected]' in the resulting bundle.js, but I still hit the same error:
Uncaught 4024588040
The code I tried is similar to:
https://huggingface.co/OuteAI/OuteTTS-0.2-500M
