JSPM

onnxruntime-web

1.7.0-dev.5
  • License MIT

A JavaScript library for running ONNX models in browsers

Package Exports

  • onnxruntime-web

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (onnxruntime-web) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
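
In practice, consuming the package therefore means importing that single subpath. A minimal sketch, assuming an ESM-capable environment (the CommonJS form is shown as a comment):

```js
// ESM import of the single detected subpath
import * as ort from 'onnxruntime-web';

// CommonJS consumers can load the same entry point instead:
// const ort = require('onnxruntime-web');
```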

Readme

ONNX Runtime Web

ONNX Runtime Web is a JavaScript library for running ONNX models in browsers and in Node.js.

ONNX Runtime Web has adopted WebAssembly and WebGL technologies to provide an optimized ONNX model inference runtime for both CPUs and GPUs.

Why ONNX models

The Open Neural Network Exchange (ONNX) is an open standard for representing machine learning models. Its biggest advantage is interoperability across different open-source AI frameworks, which gives you more flexibility in which framework you adopt. See Getting ONNX Models.

Why ONNX Runtime Web

With ONNX Runtime Web, web developers can score pre-trained ONNX models directly in the browser. This reduces server-client communication and protects user privacy, while offering an install-free, cross-platform in-browser ML experience.

ONNX Runtime Web can run on both CPU and GPU. For CPU execution, WebAssembly is adopted to run the model at near-native speed. In addition, ONNX Runtime Web utilizes Web Workers to provide a "multi-threaded" environment that parallelizes data processing. Empirical evaluation shows very promising performance gains on CPU by taking full advantage of WebAssembly and Web Workers. For GPU execution, ONNX Runtime Web adopts WebGL, a popular standard for accessing GPU capabilities, along with several novel optimization techniques that reduce data transfer between CPU and GPU and cut GPU processing cycles, pushing performance to the maximum.
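
As an illustration, the backend is chosen when an inference session is created. The sketch below is a minimal, hedged example; './model.onnx' is a placeholder, and option support can vary across versions:

```js
import * as ort from 'onnxruntime-web';

async function createSessions() {
  // CPU execution: the WebAssembly backend runs the model at near-native speed
  const cpuSession = await ort.InferenceSession.create('./model.onnx', {
    executionProviders: ['wasm'],
  });

  // GPU execution: the WebGL backend accelerates inference on the GPU
  const gpuSession = await ort.InferenceSession.create('./model.onnx', {
    executionProviders: ['webgl'],
  });

  return { cpuSession, gpuSession };
}
```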

See Compatibility and Operators Supported for a list of platforms and operators ONNX Runtime Web currently supports.

Usage

Refer to ONNX Runtime JavaScript examples for samples and tutorials.
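
For orientation, an end-to-end inference call typically follows the pattern sketched below; the input/output names ('input', 'output') and the [1, 3, 224, 224] shape are hypothetical and depend on your model:

```js
import * as ort from 'onnxruntime-web';

async function main() {
  // Create an inference session (model path and backend are placeholders)
  const session = await ort.InferenceSession.create('./model.onnx', {
    executionProviders: ['wasm'],
  });

  // Build a feed: tensor type, flat data, and shape must match the model's input
  const data = Float32Array.from({ length: 1 * 3 * 224 * 224 }, () => Math.random());
  const feeds = { input: new ort.Tensor('float32', data, [1, 3, 224, 224]) };

  // Run inference and read an output tensor by name
  const results = await session.run(feeds);
  console.log(results.output.dims, results.output.data);
}

main();
```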

Documents

Developers

Refer to Using VSCode for setting up the development environment.

For information about building ONNX Runtime Web, please check Build.

Getting ONNX models

You can get ONNX models easily in multiple ways:

  • Choose a pre-trained ONNX model from the ONNX Model Zoo.
  • Convert a model from a mainstream framework such as PyTorch, TensorFlow, or Keras by following the ONNX tutorials.

Learn more about ONNX

Compatibility

| OS/Browser       | Chrome | Edge | Safari | Electron |
| ---------------- | ------ | ---- | ------ | -------- |
| Windows 10       | ✔️     | ✔️   | -      | ✔️       |
| macOS            | ✔️     | -    | ✔️     | ✔️       |
| Ubuntu LTS 18.04 | ✔️     | -    | -      | ✔️       |
| iOS              | ✔️     | ✔️   | ✔️     | -        |
| Android          | ✔️     | -    | -      | -        |

Operators

WebAssembly backend

ONNX Runtime Web currently supports all operators in ai.onnx and ai.onnx.ml.

WebGL backend

ONNX Runtime Web currently supports most operators in the ai.onnx operator set v7 (opset v7). See operators.md for a complete, detailed list of ONNX operators supported by the WebGL backend.
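
If a model uses operators outside this coverage, one pragmatic pattern is to attempt the WebGL backend and fall back to WebAssembly at the application level. A hedged sketch; whether session creation fails for unsupported operators depends on the model and version:

```js
import * as ort from 'onnxruntime-web';

// Try the WebGL backend first; fall back to WebAssembly if the model
// cannot be initialized on WebGL (e.g. due to unsupported operators).
async function createSessionWithFallback(modelPath) {
  try {
    return await ort.InferenceSession.create(modelPath, { executionProviders: ['webgl'] });
  } catch (err) {
    console.warn('WebGL backend failed, falling back to wasm:', err);
    return await ort.InferenceSession.create(modelPath, { executionProviders: ['wasm'] });
  }
}
```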

License

License information can be found here.