

🌠 Prompt template pipelines


A library for supercharging your usage of large language models (LLMs)

Concept

When you have a simple, single prompt in your app, it doesn't really matter how it is integrated: whether you call a REST API directly, use the OpenAI library, hardcode the prompt in a code file, or import it from a text file.

When you need something more advanced, or want to scale up the capabilities of the LLMs, you generally have three ways to go:

  1. fine-tuning an existing model to your needs, or even creating your own
  2. prompt tuning
  3. multi-shot prompting
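To make the third option concrete, multi-shot (few-shot) prompting means prepending worked examples to the conversation so the model can imitate them. The sketch below uses the common chat-completion message shape for illustration; it is not an API of this library:

```typescript
// Multi-shot prompting sketch: seed the conversation with example pairs.
// The message shape follows the common chat-completion convention.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

function buildMultiShotPrompt(
    task: string,
    examples: Array<{ input: string; output: string }>,
): ChatMessage[] {
    const messages: ChatMessage[] = [{ role: 'system', content: task }];
    for (const example of examples) {
        // Each example becomes a user turn followed by the "ideal" assistant turn.
        messages.push({ role: 'user', content: example.input });
        messages.push({ role: 'assistant', content: example.output });
    }
    return messages;
}

const fewShotPrompt = buildMultiShotPrompt('Translate English to French.', [
    { input: 'Hello', output: 'Bonjour' },
    { input: 'Goodbye', output: 'Au revoir' },
]);
console.log(fewShotPrompt.length); // → 5 (1 system message + 2 example pairs)
```

The real user question is then appended as one more `user` message before sending the whole array to the model.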

In each of these situations, this library can help you boost your performance.

This library can help with several things:

  • separation of responsibilities between the prompt engineer and the programmer, between code files and prompt files, and between prompts and prompt templates
  • testing which prompt works best for your use case (logging)
  • establishing a common format for prompts that is interchangeable between projects, much as React components are interchangeable between applications; this implementation is for TypeScript/JavaScript, but it can be implemented for any stack
  • keeping your code simple: there is just one simple function in your code which does all the magic (DRY)
  • boosting the performance of the app
  • streaming

Prompt template pipelines (for prompt engineers)

(TODO: Write this section):

# Write a joke

Dictionary

(TODO: Write this section)

Prompt

(TODO: Write this section)

Prompt Template

(TODO: Write this section)

Model Requirements

Model requirements are connected with each prompt template. (TODO: Write this section)
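Until this section is written, here is a hypothetical sketch of what per-template model requirements might look like. The property names are assumptions for illustration, not the library's actual API:

```typescript
// Hypothetical shape of "model requirements" attached to a prompt template.
interface ModelRequirements {
    variant: 'CHAT' | 'COMPLETION'; // which kind of model the template expects
    modelName?: string;             // e.g. 'gpt-3.5-turbo'; omit to let the execution tools decide
    temperature?: number;           // sampling temperature, if the template cares
}

// Example: a joke-writing template benefits from a bit of randomness.
const jokeTemplateRequirements: ModelRequirements = {
    variant: 'CHAT',
    temperature: 0.9,
};
console.log(jokeTemplateRequirements.variant); // → CHAT
```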

Prompt Template Params

(TODO: Write this section)

Prompt Template Pipeline

(TODO: Write this section)

It can have 3 formats:

  • .ptp.md - Markdown format
  • .ptp.json - JSON format
  • object - a plain JavaScript/TypeScript object
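For illustration only, a minimal `.ptp.md` file for the "Write a joke" example might look like the following; the concrete syntax is an assumption, since the format is not yet documented here:

```markdown
# Write a joke

## Joke

Write a short joke about {topic}.

-> {joke}
```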

Prompt Template Pipeline Library

(TODO: Write this section)

Prompt Result

(TODO: Write this section)

Execution Tools

(TODO: Write this section)

OpenAiExecutionTools, AzureOpenAiExecutionTools, BardExecutionTools, LamaExecutionTools, and, as a special case, RemoteExecutionTools.
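The idea behind these classes is one common interface with many interchangeable backends. The minimal sketch below illustrates that pattern with a mock backend; the interface shape is an assumption for illustration, not the library's actual API:

```typescript
// One common interface, many interchangeable backends (hypothetical shape).
interface ExecutionTools {
    completePrompt(prompt: string): Promise<string>;
}

// A mock backend, standing in for OpenAiExecutionTools, BardExecutionTools, etc.
class MockExecutionTools implements ExecutionTools {
    async completePrompt(prompt: string): Promise<string> {
        return `[mock completion for: ${prompt}]`;
    }
}

// Calling code depends only on the interface, not on any provider.
async function run(tools: ExecutionTools): Promise<string> {
    return tools.completePrompt('Write a joke');
}

run(new MockExecutionTools()).then((result) => console.log(result));
// → "[mock completion for: Write a joke]"
```

Swapping providers (or routing to a remote server) then means constructing a different `ExecutionTools` implementation, with no change to the calling code.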

Executor

(TODO: Write this section)

Xxxxx

(TODO: Write this section)

Xxxxx

(TODO: Write this section)

Xxxxx

(TODO: Write this section)

Xxxxx

(TODO: Write this section)

Xxxxx

(TODO: Write this section)

Xxxxx

(TODO: Write this section)

Xxxxx

(TODO: Write this section)

Usage and integration (for developers)

First you need to install this library:

npm install --save @gptp/core
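While this section is a TODO, the intended developer flow (load a pipeline, fill in input params, get back result params) can be sketched with a self-contained stand-in. All names here are illustrative, not the confirmed API of @gptp/core; check the package itself for the real entry points:

```typescript
// Self-contained stand-in for a prompt template pipeline: it only performs
// the {param} substitution step; a real pipeline would also call an LLM.
type PromptResult = Record<string, string>;

class PromptTemplatePipelineStandIn {
    constructor(private template: string) {}

    async execute(params: Record<string, string>): Promise<PromptResult> {
        const filledPrompt = this.template.replace(
            /\{(\w+)\}/g,
            (_match: string, key: string) => params[key] ?? '',
        );
        return { prompt: filledPrompt };
    }
}

const pipeline = new PromptTemplatePipelineStandIn('Write a joke about {topic}.');
pipeline.execute({ topic: 'cats' }).then((result) => console.log(result.prompt));
// → "Write a joke about cats."
```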

(TODO: Write this section)

FAQ

If you have a question, start a discussion, open an issue, or write me an email.

Why not just use the OpenAI library?

Different levels of abstraction. The OpenAI library is for direct use of the OpenAI API. This library operates at a higher level of abstraction: it is for creating prompt templates and prompt template pipelines which are independent of the underlying library, LLM model, or even LLM provider.

How is it different from the Langchain library?

Langchain is primarily focused on ML engineers working in Python. This library is for developers working in JavaScript/TypeScript who create applications for end users.

We are considering creating a bridge/converter between these two libraries.

TODOs

  • !! Make this work as an external library
  • [๐Ÿง ] Figure out the best name for this library - Prompt Template Pipeline, Prompt Template Engine, Prompt Template Processor, Open Prompt Initiative
  • Export all promptTemplatePipeline as ptp alias from library
  • Make from this folder a separate repository + npm package
  • Add tests
  • Annotate all entities
  • Make internal string aliases
  • Make branded types instead of pure string aliases
  • Remove all anys
  • Make PTP non-linear
  • Logging pipeline name, version, step,...
  • No circular dependencies
  • [ ][๐Ÿง ] Wording: "param" vs "parameter" vs "variable" vs "argument"
  • All entities must have public / private / protected modifiers
  • Everything not needed should be private or not exported
  • Refactor circular dependencies
  • Importing subtemplates
  • Use spaceTrim more effectively
  • [ ][๐Ÿง ] Figure out best word for "entry" and "result" params
  • [๐Ÿคนโ€โ™‚๏ธ] Allow chats to be continued with previous message
  • [๐Ÿง ][๐Ÿคนโ€โ™‚๏ธ] How to mark continued chat in .ptp.md format?