Package Exports
- tensornet
- tensornet/dist/index.js
This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (tensornet) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
Readme
tensornet
Pure JS classification of base64-encoded images using TensorFlow models
Installation
npm i tensornet --save
Getting Started
Make sure @tensorflow/tfjs-core is installed and a valid TensorFlow backend is set.
You also need to pick between the synchronous jpeg-js package and the asynchronous sharp package.
# pure JS, fully synchronous (blocking) installation
npm i @tensorflow/tfjs-core jpeg-js
# if using the async, non-blocking path
npm i @tensorflow/tfjs-core sharp
View the classify.test.ts file for an example setup.
import { classify, classifyAsync } from "tensornet";
import { setBackend } from "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-wasm";
await setBackend("wasm");
const classification = await classify(mybase64); // uses jpeg-js (sync, pure JS decode)
// or use classifyAsync with native sharp for roughly 2x faster results
const classificationA = await classifyAsync(mybase64);
// output example
// [
// {
// className: 'Siamese cat, Siamese',
// probability: 0.9805548787117004
// }
// ]
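In the example above, mybase64 is assumed to be a base64-encoded JPEG string. A minimal sketch of producing one from a file on disk with Node's fs module (the file path is only an example, not part of the package):

import { readFileSync } from "node:fs";
import { setBackend } from "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-wasm";
import { classify } from "tensornet";

await setBackend("wasm");

// Read a JPEG from disk and encode it as a base64 string (example path).
const mybase64 = readFileSync("./fixtures/cat.jpg").toString("base64");

// Classify it; jpeg-js decodes the image in pure JS under the hood.
const results = await classify(mybase64);
console.log(results[0].className, results[0].probability);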
Why
The benefits of using pure JS to process the image fall into a few areas:
- Size and portability: the footprint is drastically smaller, since you do not need cairo or any of the native image dev converters.
- Speed: the calculations are done in-process, without needing to bridge any native calls.
- Tensors can be used in worker threads.
- Allows properly using TensorFlow WASM backends in an API service 🥳 (see the sketch below).
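As a rough illustration of the worker-thread / API-service point, here is a minimal sketch (not part of the package itself) that runs classification inside a Node worker thread so the WASM backend never blocks the main event loop; the file names, message shape, and image path are assumptions:

// worker.ts (hypothetical file name)
import { parentPort } from "node:worker_threads";
import { setBackend } from "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-wasm";
import { classify } from "tensornet";

// Set the wasm backend once and reuse it for every message.
const ready = setBackend("wasm");

parentPort?.on("message", async (base64: string) => {
  await ready;
  const predictions = await classify(base64);
  parentPort?.postMessage(predictions); // send results back to the main thread
});

// main.ts (hypothetical) – hand a base64 JPEG to the worker and log the result
import { Worker } from "node:worker_threads";
import { readFileSync } from "node:fs";

const worker = new Worker(new URL("./worker.js", import.meta.url));
worker.once("message", (predictions) => {
  console.log(predictions);
  worker.terminate();
});
worker.postMessage(readFileSync("./fixtures/cat.jpg").toString("base64"));

In an API service, each incoming request could post its image to such a worker (or a pool of them) instead of decoding and classifying on the request thread.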
Benchmarks
Examples of some tests run on a Mac M1 (64 GB):
| Name | chars | size     | sync  | async |
|------|-------|----------|-------|-------|
| jpeg | 26791 | 26.16 KB | 100ms | 50ms  |
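A rough sketch of how a similar comparison could be run on your own machine (timings depend heavily on hardware and backend, and the image path is hypothetical):

import { readFileSync } from "node:fs";
import { setBackend } from "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-wasm";
import { classify, classifyAsync } from "tensornet";

await setBackend("wasm");
const base64 = readFileSync("./fixtures/cat.jpg").toString("base64");

// Time the pure-JS (jpeg-js) path.
console.time("sync classify");
await classify(base64);
console.timeEnd("sync classify");

// Time the native sharp path.
console.time("async classifyAsync");
await classifyAsync(base64);
console.timeEnd("async classifyAsync");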