React-Tensorflow
A library of React hooks and HOCs written in TypeScript for using TensorFlow models in your application! 🤖🧠
Installation
yarn add react-tensorflow
npm i react-tensorflow -S
Peer dependencies
- react >=16.8.0
- @tensorflow/tfjs >=2.0.0
Basic usage
import { useModel } from 'react-tensorflow'
const MyModelComponent = () => {
const model = useModel({ modelUrl: `${PATH_TO_MODEL}` })
// ...do something with the model
return null
}
API
useModel
useModel({model?: any, modelUrl?: string, layers?: boolean}): GraphModel | LayersModel | null
If model or modelUrl is omitted, useModel looks for a ModelProvider as its context for returning the model. When loading a model with this hook, pass layers: true if your TF model should be loaded with tf.loadLayersModel; otherwise it is assumed the model should be loaded with tf.loadGraphModel.
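For example, a minimal sketch of loading a Keras-converted model as a LayersModel (the model URL and component name here are hypothetical):

import { useModel } from 'react-tensorflow'

const MyLayersModelComponent = () => {
  // layers: true makes the hook load with tf.loadLayersModel instead of tf.loadGraphModel
  const model = useModel({ modelUrl: '/models/keras/model.json', layers: true })

  // ...do something with the model once it is non-null

  return null
}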
ModelProvider
<ModelProvider url={string} layers={boolean}>
<App />
</ModelProvider>
Wraps the children in a React Provider to be consumed via context by either the useModel hook or the withModel HOC. The props passed to this provider are the same as the documented props for useModel.
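A minimal sketch of providing a model through context (the component names and URL are illustrative); any descendant can then call useModel with no arguments, or be wrapped in withModel:

import * as React from 'react'
import { ModelProvider, useModel } from 'react-tensorflow'

const Classifier = () => {
  // No arguments: the model comes from the surrounding ModelProvider
  const model = useModel()

  // ...run inference once model is non-null

  return null
}

export default function Root () {
  return (
    <ModelProvider url='/model/model.json'>
      <Classifier />
    </ModelProvider>
  )
}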
withModel
withModel(Component: React.ComponentType): JSX.Element
Wraps the provided component in a React Context consumer, passing the model given to the provider as a prop.
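A minimal sketch of the HOC form, assuming the injected prop is named model and the wrapped component is rendered inside a ModelProvider (component names are illustrative):

import * as React from 'react'
import * as tf from '@tensorflow/tfjs'
import { withModel } from 'react-tensorflow'

interface Props {
  model: tf.GraphModel | tf.LayersModel | null
}

const ModelStatus = ({ model }: Props) => (
  <p>{model ? 'Model loaded' : 'Loading model…'}</p>
)

// Render <WrappedModelStatus /> somewhere inside a <ModelProvider>
export const WrappedModelStatus = withModel(ModelStatus)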
useWebcam
useWebcam (options?: {
width?: number
height?: number
facingMode?: string
}): [React.MutableRefObject, tf.Tensor | null]
Provides a ref to be used on a video element; the hook then returns a tensor with shape [1, width, height, 3], where the width and height are dictated either by the element's width & height or by the options documented above. Although only a few properties are documented above, the options argument can in fact take all the properties of MediaStreamConstraints.
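A minimal sketch of attaching the ref to a video element and reading frames as tensors (the component name and sizes are arbitrary):

import * as React from 'react'
import { useWebcam } from 'react-tensorflow'

export default function WebcamFeed () {
  const [videoRef, frame] = useWebcam({ width: 224, height: 224, facingMode: 'user' })

  React.useEffect(() => {
    if (frame) {
      // frame has shape [1, 224, 224, 3] and can be passed straight to a model
      console.log(frame.shape)
    }
  }, [frame])

  return <video ref={videoRef} autoPlay playsInline muted />
}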
usePrediction
usePrediction (options?: {
predictConfig?: {},
useExecute?: boolean = false,
outputName?: string,
predictionFunction?: string,
...useModelProps,
}): [React.MutableRefObject<tf.Tensor>, tf.Tensor | tf.Tensor[] | tf.NamedTensorMap | null]
Provides a ref to the data you want to use to create a prediction. The data must be in the form of a tensor. The hook then returns a new tensor as the prediction, using either the model set with the ModelProvider component or one loaded by passing a modelUrl as an argument, as it uses useModel under the hood. You can then perform further actions, such as normalizing the output data to classify the original input. By default usePrediction uses .predict; if you want to force the use of .execute, set useExecute: true, and if you want to use a custom predict function, pass its name via the predictionFunction key. If you're using a LayersModel you must set outputName. At this time, using a @tensorflow-models model is not supported with usePrediction.
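A minimal sketch wiring useWebcam into usePrediction, assuming that assigning each new frame tensor to the returned ref triggers a fresh prediction (the model URL and component name are hypothetical):

import * as React from 'react'
import { useWebcam, usePrediction } from 'react-tensorflow'

export default function LiveClassifier () {
  const [videoRef, frame] = useWebcam({ width: 224, height: 224 })
  const [dataRef, prediction] = usePrediction({ modelUrl: '/model/model.json' })

  React.useEffect(() => {
    // Hand each webcam frame to the prediction hook via its ref
    if (frame) dataRef.current = frame
  }, [frame, dataRef])

  React.useEffect(() => {
    if (prediction) {
      // ...normalize / argmax the prediction tensor to classify the frame
      console.log(prediction)
    }
  }, [prediction])

  return <video ref={videoRef} autoPlay playsInline muted />
}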
Contributing
Contributions are very welcome and wanted.
Before submitting a new pull request, please make sure:
- you have considered whether the pull request should go to the master branch or the latest release branch;
- if merging to master, you have updated the package.json version;
- you have recorded your changes in the CHANGELOG file;
- you have run the test and build scripts before submitting your pull request;
- you have added documentation for your changes.