llama.native.js
Use `npm i --save llama.native.js` to run llama.cpp models on your local machine. Features a socket.io server and client that can run inference against the host of the model.
Your private AI code reviewer
transformers.js mod for react-native
Utilities to convert URLs and files to Blobs, internally used by Hugging Face libs
A Reactive CLI that generates git commit messages with various AI
Transformer neural networks in the browser
Turn any Hugging Face Space or Gradio application into a discord.js bot.
State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
Serve GGML 4/5-bit quantized LLMs based on Meta's LLaMA model over WebSocket with llama.cpp
Node-RED wrapper node for AIsBreaker.org
simple, type-safe, isomorphic LLM interactions (with power)
CMMV module for LLM integration, tokenization, RAG dataset creation, and fast FAISS-based vector search for code indexing.
A utility function for translating text using AI.
Simplify AI integration in web apps
Your AI code reviewer. Improve code quality and catch bugs before you break production
Core functionalities for generative-ts
Advanced AI utilities library with method chaining and multiple AI providers
A lightweight utility library providing built-in functions to easily interact with HuggingFace Text Generation Inference (TGI) APIs, enabling seamless text generation and model inference.
quipper is a simple wrapper for the OpenAI API that makes it easy to generate inventive images from a quote.
A typescript library to interact with the HuggingFace Datasets API.
MCP server for ML training script development with progressive scaling and intelligent recovery
A library to easily integrate various LLM models and vendors into applications, with advanced features.
a GGUF parser that works on remotely hosted files
An in-memory semantic search database using AI
A Hugging Face API wrapper for BLOOM.
use HuggingFace datasets from Node.js
A simple spam detection library using a pre-trained model from Hugging Face (JS version)
Hugging Face Hub API for JavaScript
simple huggingface inference module
Unified SDK for interacting with various AI LLM providers
Pocket-Sized Multimodal AI for Content Understanding and Generation
Unified TypeScript bridge for multiple AI providers (OpenAI, Claude, Gemini, Groq, Hugging Face) with consistent API and Telegram integration
A super simple script that can run using `npx` (or equivalent) in order to download certain files from HuggingFace 🤗