aws-sagemaker-huggingface-llm
The Hugging Face LLM CDK Construct Library provides constructs to easily deploy a Hugging Face LLM model to Amazon SageMaker.
Found 122 results for huggingface
Testing @xenova's v3 branch
Lightweight CLI tool and library for detecting AI model drift using embeddings and scalar metrics. Tracks semantic, conceptual, and lexical change over time.
AI-powered image captioning for Vue.js applications using Hugging Face BLIP models
A Reactive CLI that generates git commit messages with various AI
TypeScript client for the Hugging Face Inference Providers and Inference Endpoints
A React provider and hooks for using Hugging Face Transformers.js in React applications with built-in loading states, error handling, and Suspense support
Pocket-Sized Multimodal AI for Content Understanding and Generation
Genkit plugin for Hugging Face models
Simplify AI integration in web apps with local and offline model support
An API to simplify Weaviate vector db queries
An interactive CLI tool leveraging multiple AI models for quick handling of simple requests
Remix IDE NLUX integration. Remix IDE is the leading IDE for building and deploying smart contracts on Ethereum. NLUX is a JavaScript and React library for building conversational AI experiences.
Call 30+ LLMs with a single API. * Send multiple prompts to multiple LLMs and get the results back in a single response. * Zero dependencies (under 10kB minified) * Bring your own API keys * Works anywhere (Node, Deno, browser)
Advanced AI utilities library with method chaining and multiple AI providers
Your AI code reviewer. Improve code quality and catch bugs before you break production
State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
A Node.js package to send user prompts to various AI models and return the output.
🔍 A local semantic caching library for Node.js.
Use `npm i --save llama.native.js` to run llama.cpp models on your local machine. Features a socket.io server and client that can run inference against the hosted model.
An early-stage (alpha) library to check DDUF compliance
Your private AI code reviewer
A transformers.js mod for React Native
Utilities to convert URLs and files to Blobs, internally used by Hugging Face libs
Transformer neural networks in the browser
Turn any Hugging Face Space or Gradio application into a discord.js bot.
CMMV module for LLM integration, tokenization, RAG dataset creation, and fast FAISS-based vector search for code indexing.
Serve GGML 4/5-bit quantized LLMs based on Meta's LLaMA model over WebSocket with llama.cpp
Node-RED wrapper node for AIsBreaker.org
simple, type-safe, isomorphic LLM interactions (with power)
A utility function for translating text using AI.
Core functionalities for generative-ts
Simplify AI integration in web apps
A lightweight utility library providing built-in functions to easily interact with HuggingFace Text Generation Inference (TGI) APIs, enabling seamless text generation and model inference.
quipper is a simple wrapper for the OpenAI API that makes it easy to generate inventive images from a quote.
A typescript library to interact with the HuggingFace Datasets API.
MCP server for ML training script development with progressive scaling and intelligent recovery
A library to easily integrate various LLM models and vendors into applications, with advanced features.
A GGUF parser that works on remotely hosted files
An in-memory semantic search database using AI
A Hugging Face API wrapper for BLOOM
Use Hugging Face datasets from Node.js
Hugging Face Hub API for JavaScript
A simple Hugging Face inference module
A simple spam detection library using a pre-trained model from Hugging Face (JS version)
Unified SDK for interacting with various AI LLM providers
Unified TypeScript bridge for multiple AI providers (OpenAI, Claude, Gemini, Groq, Hugging Face) with consistent API and Telegram integration
A super simple script, runnable via `npx` (or equivalent), that downloads certain files from Hugging Face 🤗