JSPM

@atmus/nestjs-elevenlabs

0.1.0
  • License MIT

NestJS - Elevenlabs integration

Package Exports

  • @atmus/nestjs-elevenlabs
  • @atmus/nestjs-elevenlabs/dist/index.js

This package does not declare an exports field, so the exports above were automatically detected and optimized by JSPM. If any package subpath is missing, it is recommended to file an issue with the original package (@atmus/nestjs-elevenlabs) asking for support for the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
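
For reference, an "exports" field covering the subpaths detected above might look like the following. This is a sketch of what the package could declare, not its actual configuration:

```json
{
  "exports": {
    ".": "./dist/index.js",
    "./dist/index.js": "./dist/index.js"
  }
}
```

Declaring this in the package's own package.json would make the subpaths explicit and remove the need for JSPM's automatic detection.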

Readme

NestJS OpenAI


Features

  • Configure and inject the OpenAI API client.

Installation

yarn add @atmus/nestjs-openai

Usage

Settings

OpenAIModule.register

Import the module in your application module and configure the OpenAI API client.

import { Module } from "@nestjs/common";
import { OpenAIModule } from "@atmus/nestjs-openai";
import { AppController } from "./app.controller";
import { AppService } from "./app.service";

@Module({
  imports: [
    OpenAIModule.register({
      apiKey: "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
    }),
  ],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}

OpenAIModule.registerAsync

You can also use OpenAIModule.registerAsync to configure the module asynchronously. This lets you keep your API key out of the source code, and is the recommended approach.

import { Module } from "@nestjs/common";
import { ConfigService } from "@nestjs/config";
import { OpenAIModule } from "@atmus/nestjs-openai";
import { AppController } from "./app.controller";
import { AppService } from "./app.service";

// EnvVars is your application's type describing its environment variables.

@Module({
  imports: [
    OpenAIModule.registerAsync({
      useFactory: (config: ConfigService<EnvVars>) => ({
        apiKey: config.get("OPENAI_API_KEY"),
      }),
      inject: [ConfigService],
    }),
  ],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}
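
For ConfigService to be injectable into the factory above, ConfigModule must be available in the application. A minimal sketch, assuming the standard @nestjs/config package:

```typescript
import { Module } from "@nestjs/common";
import { ConfigModule, ConfigService } from "@nestjs/config";
import { OpenAIModule } from "@atmus/nestjs-openai";

@Module({
  imports: [
    // Loads OPENAI_API_KEY from the environment (or a .env file)
    // and makes ConfigService available everywhere.
    ConfigModule.forRoot({ isGlobal: true }),
    OpenAIModule.registerAsync({
      useFactory: (config: ConfigService) => ({
        apiKey: config.get<string>("OPENAI_API_KEY"),
      }),
      inject: [ConfigService],
    }),
  ],
})
export class AppModule {}
```

With isGlobal: true, ConfigModule does not need to be re-imported in every feature module that reads configuration.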

Inject OpenAIService

Inject OpenAIService anywhere. The injected service is identical to the OpenAI API client.

import { Injectable } from "@nestjs/common";
import { CreateChatCompletionRequest } from "openai";
import { OpenAIService } from "@atmus/nestjs-openai";

@Injectable()
export class TextCompletionService {
  constructor(private readonly openai: OpenAIService) {}

  async completion(
    messages: CreateChatCompletionRequest["messages"],
  ): Promise<string> {
    // Forward the conversation to the Chat Completions API.
    const { data } = await this.openai.createChatCompletion({
      model: "gpt-3.5-turbo",
      messages,
    });
    return data.choices[0].message?.content ?? "";
  }
}
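
A hypothetical controller consuming the service above could look like this. The route, file path, and message shape are illustrative, not part of the package:

```typescript
import { Body, Controller, Post } from "@nestjs/common";
import { TextCompletionService } from "./text-completion.service";

@Controller("completions")
export class CompletionController {
  constructor(private readonly completions: TextCompletionService) {}

  @Post()
  async create(
    @Body("messages")
    messages: { role: "system" | "user" | "assistant"; content: string }[],
  ) {
    // Delegates to the service, which wraps the injected OpenAI client.
    return { text: await this.completions.completion(messages) };
  }
}
```

Remember to register both CompletionController and TextCompletionService in the module that imports OpenAIModule.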