Baby Prompts
Super basic prompt techniques and chains for OpenAI's Responses API.
Overview
👉 The library lets you easily apply different prompting techniques (see below) and choose the output format.
A full list of examples is available here.
Installation
Install the library by typing
npm install baby-prompts
Prompt technique examples
Before you can invoke any prompt, you need to configure it by choosing a model.
import {
getPrompt,
promptChain,
invoke,
outputText,
jsonFormatter,
json,
user,
assistant,
developer,
tap,
withPreviousResponse,
} from 'baby-prompts';
// Get the prompt function with custom options
const prompt = getPrompt('You are a helpful assistant.', {
model: 'gpt-4.1-mini',
temperature: 0,
});
The default model is gpt-4.1-mini.
Follow the OpenAI documentation for choosing models and options.
Here are some examples of how to use different prompting techniques:
- Zero-shot prompting
- Few-shot prompting
- Prompt chains
1. Zero-shot prompting
Here is a simple example of invoking a prompt.
prompt(developer('Be a funny assistant'), 'Tell me a joke') // set up the prompt
.pipe(invoke) // execute it
.pipe(outputText) // extract the output_text from the response
.pipe(console.log); // print it
Note that the pipe method is essentially a then call (i.e., a method of a Promise).
If you prefer async/await, here is the same code.
const result = await invoke(
prompt(developer('Be a funny assistant'), 'Tell me a joke')
);
console.log(outputText(result)); // or result.output_text
2. Few-shot prompting
Multiple messages can be combined before invocation, using the user (default), developer, or assistant roles. Note that this is still a single prompt (invoke is called only once).
prompt(
developer('Ask a question following this style'),
user('How are you?'),
assistant('How are you, human?'),
user('What time is it?'),
assistant('What time is it, human?'),
user('Where are you from?'),
assistant('Where are you from, human?'),
user('What is your age?') // expected: "What is your age, human?"
)
.pipe(invoke)
.pipe(outputText)
.pipe(console.log);
3. Prompt chaining
With prompt chaining, you can feed the output of one prompt directly into the input of the next.
promptChain(
prompt(user('What is 1+1?')),
prompt(user('Say that without using numbers.')),
prompt(user('Add an emoji at the end.'))
)
.pipe(outputText)
.pipe(console.log);
Please note that when using chains, you do not need to call the invoke method manually, as it is called for you by the promptChain function.
For more complex examples involving the tap function and formatted output, look at this.
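As an illustration (not taken from the package documentation), here is a minimal sketch of using tap inside a chain. It assumes tap(fn) follows the common pass-through pattern: it runs the side effect fn on the value and returns the value unchanged.
import { getPrompt, promptChain, user, tap, outputText } from 'baby-prompts';

const prompt = getPrompt('You are a helpful assistant.');

promptChain(
  prompt(user('What is 1+1?')),
  prompt(user('Say that without using numbers.'))
)
  .pipe(tap(console.log)) // assumed pass-through: log the raw response, then forward it
  .pipe(outputText)
  .pipe(console.log);
Here tap lets you inspect the intermediate response without breaking the .pipe chain.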
Structured output
You can structure the output of a prompt just before invocation. For that, you need to use the zod library, which is already included as a dependency.
import { z } from 'zod';
const Person = z.object({
name: z.string(),
age: z.number(),
});
const PeopleList = z.object({
people: z.array(Person).length(10), // exactly 10 people
});
// prompt
prompt(
developer('You are a helpful assistant'), //
'Write a list of 10 people with name and age'
)
.pipe(jsonFormatter(PeopleList))
.pipe(invoke)
.pipe(json)
.pipe(console.log);
Conversational history
You can preserve information across multiple messages or turns in a conversation by passing a response as an input to the next prompt using the withPreviousResponse function.
Here are a couple of examples.
import {
getPrompt,
invoke,
withPreviousResponse,
outputText,
promptChain,
} from 'baby-prompts';
// Get the prompt function with custom options
const prompt = getPrompt(
'Do not add any explanation. Just return what you are asked for.'
);
// Invoke the first prompt and keep its response
const res = await prompt('My name is Jon Snow.').pipe(invoke);
// Using a single prompt
await prompt('What is my name?')
.then(withPreviousResponse(res)) // pass in the previous response
.then(invoke)
.then(outputText)
.then(console.log); // "Jon Snow"
// Using a chain
await promptChain(
prompt('What is my name?').then(withPreviousResponse(res)), // pass in the previous response
prompt('Add an emoji to my name.')
)
.then(outputText)
.then(console.log); // "Jon Snow 🐺"
Requirements
This code requires that you have a .env file with the variable OPENAI_API_KEY set to your own OpenAI API key. You can find an API key here.
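For example, your .env file only needs a single line (the value shown is a placeholder, not a real key):
OPENAI_API_KEY=your-openai-api-key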
Credits
Developed by MAKinteract with ♥️.