
[Web] Can't create session loading large models #13006

@edhyah

Describe the issue

Hello,

I'm trying to load a UNet model (> 2 GB). Because ONNX models larger than 2 GB store their weights as external data, I have a model.onnx file along with many separate .weight and .bias files.

Using onnxruntime-web, I'm attempting to load it in a Node.js/JavaScript environment with the following code:

const ort = require('onnxruntime-web');

// Top-level await isn't available in a CommonJS module, so the call is
// wrapped in an async function.
(async () => {
    try {
        const session = await ort.InferenceSession.create('./unet/model.onnx');
    } catch (e) {
        console.log(e);
    }
})();

When I do this, I'm faced with this warning:

Deserialize tensor onnx::MatMul_12479 failed.
tensorprotoutils.cc:640 TensorProtoToTensor External initializer: onnx::MatMul_12479 offset: 0 size to read: 409600 given file_length: 32 are out of bounds or can not be read in full.

And then this error:

Error: Can't create a session.

Using the same method and code, I've been able to load smaller models (< 2 GB) without any errors or warnings.

Is there anything I can do to investigate why I'm running into this issue?
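
The warning says the runtime expected to read 409600 bytes but found a file_length of only 32, so one thing I can check is whether the external data files on disk actually contain the tensor data. Here's a minimal sketch (assuming the .weight/.bias files sit next to model.onnx; the real file names come from each initializer's external-data "location" field, so the extension filter is illustrative):

const fs = require('fs');
const path = require('path');

const dir = './unet';
for (const name of fs.readdirSync(dir)) {
    // Hypothetical filter: external-data files in this model use .weight/.bias.
    if (name.endsWith('.weight') || name.endsWith('.bias')) {
        const { size } = fs.statSync(path.join(dir, name));
        // A 32-byte file would line up with "given file_length: 32" in the
        // warning, i.e. a placeholder/stub rather than the actual tensor data.
        if (size < 1024) {
            console.log(`${name}: suspiciously small (${size} bytes)`);
        }
    }
}

If the files turn out to be only a few bytes each, the problem would be in how they were downloaded or copied rather than in onnxruntime-web itself.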

To reproduce

The UNet model can be accessed here: https://drive.google.com/drive/folders/1l65XlDrYYE7m1EUPdEZ6SDGWjrlIljJu?usp=sharing

You can use this code (copied and pasted from above):

const ort = require('onnxruntime-web');

(async () => {
    try {
        const session = await ort.InferenceSession.create('./unet/model.onnx');
    } catch (e) {
        console.log(e);
    }
})();
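
A variation that may help narrow things down (a sketch; InferenceSession.create also accepts the serialized model as bytes, though this only rules out path/URL handling for model.onnx itself, not resolution of the external .weight/.bias files):

const fs = require('fs');
const ort = require('onnxruntime-web');

(async () => {
    try {
        // Pass the model bytes directly instead of a path.
        const buffer = fs.readFileSync('./unet/model.onnx');
        const session = await ort.InferenceSession.create(new Uint8Array(buffer));
    } catch (e) {
        console.log(e);
    }
})();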

Urgency

It's somewhat urgent.

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.8.0 (installed onnxruntime-web through npm)

Execution Provider

Other / Unknown

Labels

platform:web (issues related to ONNX Runtime web; typically submitted using template)
