Found 25 results for vectorstore

@llamaindex/env
Environment wrapper; supports all JS environments, including Node, Deno, Bun, edge runtimes, and Cloudflare Workers.
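As a runtime abstraction, the package is typically used to read configuration without touching Node-only globals. A minimal sketch, assuming the package exposes a `getEnv` accessor (the exact export is an assumption here, not taken from the listing):

```ts
// Assumed API: reads an environment variable in a runtime-agnostic way,
// so the same code runs under Node, Deno, Bun, edge runtimes, and
// Cloudflare Workers without referencing process.env directly.
import { getEnv } from "@llamaindex/env";

const apiKey = getEnv("OPENAI_API_KEY"); // string | undefined
if (!apiKey) {
  throw new Error("OPENAI_API_KEY is not set");
}
```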
Mastra is a framework for building AI-powered applications and agents with a modern TypeScript stack.
The Retrieval-Augmented Generation (RAG) module contains document processing and embedding utilities.
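This wording matches Mastra's RAG package; a minimal chunk-and-embed sketch, assuming the `@mastra/rag` MDocument API and the AI SDK's `embedMany` helper (both assumptions based on their public docs, not on this listing):

```ts
import { MDocument } from "@mastra/rag";
import { embedMany } from "ai";
import { openai } from "@ai-sdk/openai";

// Wrap raw text as a document and split it into chunks.
const doc = MDocument.fromText("Vector stores index embeddings for similarity search.");
const chunks = await doc.chunk({ strategy: "recursive", size: 256, overlap: 32 });

// Embed each chunk; the resulting vectors can be upserted into any vector store.
const { embeddings } = await embedMany({
  model: openai.embedding("text-embedding-3-small"),
  values: chunks.map((chunk) => chunk.text),
});
```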
<p align="center"> <img height="100" width="100" alt="LlamaIndex logo" src="https://ts.llamaindex.ai/square.svg" /> </p> <h1 align="center">LlamaIndex.TS</h1> <h3 align="center"> Data framework for your LLM application. </h3>
Plugin service for the agent mode of chatluna.
A comprehensive enterprise-grade toolkit for building RAG (Retrieval-Augmented Generation) pipelines with advanced AI/ML capabilities
Core embedding and vector store utilities for AskText voice Q&A.
This library includes vector storage for converting documents into vectors.
A Node.js implementation of an in-memory vector store.
Wrapper for environment configuration.
A restorable in-memory vector store for LangChainJS that works in browsers.
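For context, the pattern such packages follow is LangChain's built-in MemoryVectorStore. A minimal sketch of that generic API (not this specific package's exports), assuming an OpenAI embeddings provider:

```ts
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { OpenAIEmbeddings } from "@langchain/openai";

// Embed a few texts and keep the vectors in memory.
const store = await MemoryVectorStore.fromTexts(
  ["LlamaIndex.TS is a data framework.", "Mastra is a TypeScript agent framework."],
  [{ id: 1 }, { id: 2 }],
  new OpenAIEmbeddings(),
);

// Retrieve the closest match for a query.
const [best] = await store.similaritySearch("agent framework", 1);
console.log(best.pageContent);
```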
The core foundation of the Mastra framework, providing essential components and interfaces for building AI-powered applications.
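Agents are among those core components; a minimal sketch assuming `@mastra/core`'s Agent class and an AI SDK model provider (names follow Mastra's public docs, and the exact options should be treated as assumptions):

```ts
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

// A single agent with a system prompt and an AI SDK model.
const assistant = new Agent({
  name: "assistant",
  instructions: "Answer questions about vector stores concisely.",
  model: openai("gpt-4o-mini"),
});

const result = await assistant.generate("What is a memory vector store?");
console.log(result.text);
```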
EarnKit Core provides the foundational infrastructure for building, deploying, and managing efficient event-driven agent-based systems. This package serves as the base layer for the entire EarnKit SDK ecosystem.
A LangChain compatible vector store implementation using IndexedDB
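Because it targets LangChain's VectorStore abstraction, such a store can be dropped in wherever that interface is expected. A minimal sketch against the generic `@langchain/core` interface (not this package's own exports):

```ts
import type { VectorStore } from "@langchain/core/vectorstores";
import { Document } from "@langchain/core/documents";

// Works with any LangChain-compatible store, including an
// IndexedDB-backed one persisted in the browser.
async function indexAndSearch(store: VectorStore) {
  await store.addDocuments([
    new Document({ pageContent: "Embeddings persisted in the browser via IndexedDB." }),
  ]);
  return store.similaritySearch("browser persistence", 1);
}
```

Coding against the abstract interface is what lets an in-memory store be swapped for an IndexedDB-backed one without changing retrieval code.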