JSPM

@getanthill/datastore

0.95.1
  • Downloads 772
  • License MIT

Event-Sourced Datastore

Package Exports

  • @getanthill/datastore
  • @getanthill/datastore/dist/sdk/Datastore.js
  • @getanthill/datastore/dist/sdk/aggregator/Aggregator.js
  • @getanthill/datastore/dist/sdk/index.js

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (@getanthill/datastore) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
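For reference, an `exports` field covering the subpaths detected above might look like the following in the package's package.json. This is a sketch, not the maintainers' actual configuration; in particular, mapping `.` to `./dist/sdk/index.js` is an assumption:

```json
{
  "exports": {
    ".": "./dist/sdk/index.js",
    "./dist/sdk/*.js": "./dist/sdk/*.js"
  }
}
```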

Readme

getanthill Datastore

🎯 Purpose

The goal of this project is to make the full power of Event Sourcing / CQRS systems easily accessible.

📚 Documentation

https://datastore.getanthill.org/

✨ Key Features

The getanthill Datastore is engineered to empower developers with a robust, scalable, and highly observable data management system built on modern architectural patterns.

CQRS & Event Sourcing: At its core, the Datastore fully embraces Command Query Responsibility Segregation (CQRS) and Event Sourcing. Manage every data entity as an immutable event stream, enabling precise atomic updates, historical reconstruction of entities to any past state, and advanced "time-travel" debugging.
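The event-stream model described above can be sketched in a few lines. This is a minimal illustration of the idea, not the Datastore SDK itself; the `DomainEvent` shape and `replay` helper are this example's assumptions:

```typescript
// An entity is the fold of its immutable event stream, so any past
// state can be rebuilt by replaying a prefix of the events.
type DomainEvent = { version: number; updates: Record<string, unknown> };

function replay(
  events: DomainEvent[],
  upToVersion: number = Infinity,
): Record<string, unknown> {
  return events
    .filter((e) => e.version <= upToVersion)
    .reduce((state, e) => ({ ...state, ...e.updates }), {} as Record<string, unknown>);
}

const stream: DomainEvent[] = [
  { version: 0, updates: { status: "created", balance: 0 } },
  { version: 1, updates: { balance: 100 } },
  { version: 2, updates: { status: "closed" } },
];

console.log(replay(stream));    // latest state: { status: "closed", balance: 100 }
console.log(replay(stream, 1)); // "time travel": { status: "created", balance: 100 }
```

Because events are never mutated, replaying up to any version reconstructs the entity exactly as it was at that point.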

Contract-First Development with JSON Schema: All data models within the Datastore are rigorously contractualized using the JSON Schema standard. This ensures data integrity and provides a strict, machine-readable contract for every piece of information, forming the foundation for reliable API interactions and event processing.
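As an illustration, a hypothetical `users` model contract expressed in plain JSON Schema might look like this (the model name and fields are invented for the example; the Datastore's own model format may differ):

```json
{
  "$id": "users",
  "type": "object",
  "properties": {
    "email": { "type": "string", "format": "email" },
    "status": { "type": "string", "enum": ["created", "active", "closed"] }
  },
  "required": ["email"],
  "additionalProperties": false
}
```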

OpenAPI 3.0 Compliant API & Automatic Documentation: Leveraging its JSON Schema contracts, the Datastore automatically generates an OpenAPI 3.0 (formerly Swagger) specification. This provides comprehensive, up-to-date, and interactive documentation for your RESTful API, simplifying integration and ensuring clarity for all consumers.

Real-time Data Streaming: Process data in real-time with an integrated stream API entrypoint. Deploy workers with automatic reconnection capabilities, robust pattern matching, and built-in logging, enabling immediate reactions to data changes.
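A reconnecting worker loop of the kind described can be sketched as follows. The `backoffMs` helper, the delay values, and the `connect` callback are this example's assumptions, not the SDK's actual worker API:

```typescript
// Capped exponential backoff between reconnection attempts.
function backoffMs(attempt: number, baseMs = 500, capMs = 30_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

// Worker loop: `connect` resolves on a clean disconnect and rejects
// on failure; either way the worker waits and reconnects.
async function runWorker(connect: () => Promise<void>): Promise<void> {
  for (let attempt = 0; ; attempt++) {
    try {
      await connect(); // blocks while the stream is healthy
      attempt = 0;     // reset backoff after a clean session
    } catch (err) {
      console.error("stream dropped, reconnecting", err);
    }
    await new Promise((resolve) => setTimeout(resolve, backoffMs(attempt)));
  }
}
```

Capping the delay keeps a long outage from pushing reconnection attempts arbitrarily far apart.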

Flexible Message Broker Integration: The Datastore offers seamless integration with popular message brokers such as MQTT and RabbitMQ (AMQP). This allows for flexible event distribution, enabling advanced asynchronous communication patterns and microservices architectures.

Fine-grained Access Control: Secure your data with a comprehensive role-based access control system. Four distinct access levels (READ, DECRYPT, WRITE, ADMIN) allow for granular permissions, protecting sensitive information and operations.
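The four levels named above suggest an ordered hierarchy. Here is a minimal sketch of such a check; only the level names come from the docs, while the ordering and the `allows` helper are this example's assumptions:

```typescript
// Levels ordered from least to most privileged.
const LEVELS = ["READ", "DECRYPT", "WRITE", "ADMIN"] as const;
type AccessLevel = (typeof LEVELS)[number];

// A granted level allows an operation if it is at least as
// privileged as the level the operation requires.
function allows(granted: AccessLevel, required: AccessLevel): boolean {
  return LEVELS.indexOf(granted) >= LEVELS.indexOf(required);
}

console.log(allows("WRITE", "READ"));    // true
console.log(allows("DECRYPT", "ADMIN")); // false
```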

Data Encryption: Easily encrypt sensitive fields within your data, providing robust data security. The system supports multiple keys, key rotation, and on-demand document encryption, ensuring data privacy even at rest.
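One common way to support multiple keys and rotation is to tag each ciphertext with the id of the key that produced it, so documents encrypted under an old key stay readable after a new key is introduced. The sketch below illustrates that pattern with Node's built-in AES-256-GCM; it is not the Datastore's actual crypto code or wire format:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Key ring: old key "k1" is kept for decryption; "k2" is current.
const keys = new Map<string, Buffer>([
  ["k1", randomBytes(32)],
  ["k2", randomBytes(32)],
]);
const currentKeyId = "k2";

// Encrypt a field and prefix the result with the key id used.
function encryptField(plain: string): string {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", keys.get(currentKeyId)!, iv);
  const data = Buffer.concat([cipher.update(plain, "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag();
  return [currentKeyId, iv.toString("hex"), tag.toString("hex"), data.toString("hex")].join(".");
}

// Look up the right key from the embedded key id, then decrypt.
function decryptField(blob: string): string {
  const [keyId, iv, tag, data] = blob.split(".");
  const decipher = createDecipheriv("aes-256-gcm", keys.get(keyId)!, Buffer.from(iv, "hex"));
  decipher.setAuthTag(Buffer.from(tag, "hex"));
  return Buffer.concat([decipher.update(Buffer.from(data, "hex")), decipher.final()]).toString("utf8");
}

console.log(decryptField(encryptField("secret"))); // "secret"
```

Rotating to a new key then only means changing `currentKeyId`; re-encrypting existing documents on demand can happen lazily.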

Advanced Data Aggregation & Projections: Chain complex data projections and aggregations across multiple Datastore instances. This powerful pipeline enables sophisticated business logic, data transformation, and event tracking, facilitating advanced analytics and reporting.
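Chained projections of this kind compose naturally as a pipeline of transformations, each consuming the previous step's view. The sketch below is generic and does not show the SDK's Aggregator API:

```typescript
// Compose projection steps left to right into one function.
type Step = (input: any) => any;

const chain =
  (...steps: Step[]) =>
  (input: any) =>
    steps.reduce((acc, step) => step(acc), input);

type Order = { country: string; total: number };

// Filter, then aggregate: revenue per country.
const revenueByCountry = chain(
  (orders: Order[]) => orders.filter((o) => o.total > 0),
  (orders: Order[]) =>
    orders.reduce<Record<string, number>>((acc, o) => {
      acc[o.country] = (acc[o.country] ?? 0) + o.total;
      return acc;
    }, {}),
);

console.log(
  revenueByCountry([
    { country: "FR", total: 40 },
    { country: "DE", total: 25 },
    { country: "FR", total: 10 },
  ]),
); // { FR: 50, DE: 25 }
```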

Configurable & Observable: Tailor the Datastore to your needs with flexible configuration options for security, OpenAPI behavior, and feature enablement. Integrated telemetry and logging provide deep insights into system operations, simplifying monitoring and debugging.

💻 Command Line Interface (CLI)

The Datastore provides a powerful Command Line Interface for managing various aspects of your data and system.

CLI Features

  • Datastore Management: Interact with different datastore instances, configured via the DATASTORE_CONFIGS environment variable.
  • Data Operations: Perform CRUD operations on your data models.
  • Model Handling: Manage and inspect data models, which are contractualized using json-schema.
  • Security: Configure and manage security-related aspects.
  • Event Streaming: Stream entity changes or events from different connectors (e.g., HTTP, AMQP) and synchronize them to other datastores.
  • Interactive Shell: Utilize an interactive run command for executing CLI commands in a persistent session, complete with command history and auto-completion.
  • Heartbeat Check: Verify the availability of datastore services.

💿 Installation

To get started with the getanthill Datastore, follow these steps:

  1. Clone the repository:

    git clone https://gitlab.com/getanthill/datastore.git
    cd datastore
  2. Install dependencies:

    npm install
  3. Environment Variables: Create a .env file in the root directory and configure necessary environment variables. Example:

    DATASTORE_API_URL=http://localhost:3001
    DATASTORE_ACCESS_TOKEN=your_secret_token
  4. Run the Datastore:

    npm start

🚀 Usage Examples

Running the API Server

npm start

The API will be available at http://localhost:3001 (or your configured DATASTORE_API_URL).

Using the CLI

Interactive Shell:

npm run cli run

Stream Events:

npm run cli stream <model> <source> -- --datastore default --output entity --connector http

Check Heartbeat:

npm run cli heartbeat

🤝 Contributing

We welcome contributions from the community! To contribute to the getanthill Datastore, please follow these guidelines:

  1. Fork the repository.
  2. Create a new branch: git checkout -b feature/your-feature-name or bugfix/your-bugfix-name.
  3. Make your changes and commit them: Follow our commit message guidelines.
  4. Push to your fork and open a Pull Request.

Please ensure your code adheres to our coding standards and passes all tests.

🙏 Acknowledgements

  • Thanks to all the contributors who have helped make this project better!
  • Special thanks to the open-source community for their invaluable tools and inspiration.