@vercel/blob

0.9.1 · Apache-2.0 · 1,295,452 downloads

The Vercel Blob JavaScript API client

Package Exports

  • @vercel/blob
  • @vercel/blob/dist/index.cjs
  • @vercel/blob/dist/index.js

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue on the original package (@vercel/blob) requesting support for the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
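For reference, such an override is a partial package.json. A sketch of an "exports" field for this package might look like the following; the entry points mirror the detected exports above, while the `types` path is an assumption based on the package shipping TypeScript types:

```json
{
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js",
      "require": "./dist/index.cjs"
    }
  }
}
```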

Readme

🍙 @vercel/blob

The Vercel Blob JavaScript API client.

Install

npm install @vercel/blob

Usage

import * as vercelBlob from '@vercel/blob';

// usage
async function someMethod() {
  const blob = await vercelBlob.put(
    'profilesv1/user-12345.txt', // pathname for the blob
    'Hello World!', // body
    { access: 'public' }, // mandatory options
  );

  console.log(blob.url);
  // https://public.blob.vercel-storage.com/n1g9m63etib6gkcjqjpspsiwe7ea/profilesv1/user-12345-NoOVGDVcqSPc7VYCUAGnTzLTG2qEM2.txt
}

API

put(pathname, body, options)

Uploads a blob to the Vercel Blob API and returns the blob's URL.

async function put(
  pathname: string,
  body: ReadableStream | String | ArrayBuffer | Blob | File, // All fetch body types are supported: https://developer.mozilla.org/en-US/docs/Web/API/fetch#body
  options: {
    access: 'public', // mandatory, as we will provide private blobs in the future
    contentType?: string, // by default inferred from pathname
    // `token` defaults to process.env.BLOB_READ_WRITE_TOKEN on Vercel
    // and can be configured when you connect more stores to a project
    // or using Vercel Blob outside of Vercel
    // on the client `token` is mandatory and must be generated by "generateClientTokenFromReadWriteToken"
    token?: string,
  }): Promise<{
      size: number;
      uploadedAt: Date;
      pathname: string;
      contentType: string;
      contentDisposition: string;
      url: string;
    }> {}

del(url, options)

Deletes one or more blobs by their full URL. Returns the deleted blob(s), or null when a blob is not found.

async function del(
  url: string | string[],
  options?: {
    token?: string;
  },
): Promise<
  | {
      size: number;
      uploadedAt: Date;
      pathname: string;
      contentType: string;
      contentDisposition: string;
      url: string;
    }
  | null
  | ({
      size: number;
      uploadedAt: Date;
      pathname: string;
      contentType: string;
      contentDisposition: string;
      url: string;
    } | null)[]
> {}

head(url, options)

Get the metadata of a blob by its full URL. Returns null when the blob does not exist.

async function head(
  url: string,
  options?: {
    token?: string;
  },
): Promise<{
  size: number;
  uploadedAt: Date;
  pathname: string;
  contentType: string;
  contentDisposition: string;
  url: string;
} | null> {}

list(options)

Lists blobs in the store along with their metadata, with an optional prefix and limit, and lets you paginate through them via a cursor.

async function list(options?: {
  token?: string;
  limit?: number; // defaults to 1,000
  prefix?: string;
  cursor?: string;
}): Promise<{
  blobs: {
    size: number;
    uploadedAt: Date;
    pathname: string;
    contentType: string;
    contentDisposition: string;
    url: string;
  }[];
  cursor?: string;
  hasMore: boolean;
}> {}

generateClientTokenFromReadWriteToken(options)

Generates a single-use token that can be used from within the client. This is useful when uploading directly from browsers, to circumvent the 4 MB limit on requests going through a Vercel-hosted route.

Once created, a client token is valid by default for 30 seconds (can be customized by configuring the validUntil field). This means you have 30 seconds to initiate an upload with this token.

async function generateClientTokenFromReadWriteToken(options?: {
  token?: string;
  pathname?: string;
  onUploadCompleted?: {
    callbackUrl: string;
    metadata?: string;
  };
  maximumSizeInBytes?: number;
  allowedContentTypes?: string[];
  validUntil?: number; // timestamp in ms
}): Promise<string> {}

Note: This method should be called server-side, not client-side.

Examples

Next.js App Router example

This example shows a form uploading a file to the Vercel Blob API.

// /app/UploadForm.tsx

'use client';

import type { BlobResult } from '@vercel/blob';
import { useState } from 'react';

export default function UploadForm() {
  const [blob, setBlob] = useState<BlobResult | null>(null);

  return (
    <>
      <form
        action="/api/upload"
        method="POST"
        encType="multipart/form-data"
        onSubmit={async (event) => {
          event.preventDefault();

          const formData = new FormData(event.currentTarget);
          const response = await fetch('/api/upload', {
            method: 'POST',
            body: formData,
          });
          const blob = (await response.json()) as BlobResult;
          setBlob(blob);
        }}
      >
        <input type="file" name="file" />
        <button type="submit">Upload</button>
      </form>
      {blob && (
        <div>
          Blob url: <a href={blob.url}>{blob.url}</a>
        </div>
      )}
    </>
  );
}
// /app/api/upload/route.ts

import * as vercelBlob from '@vercel/blob';
import { NextResponse } from 'next/server';

export async function POST(request: Request) {
  const form = await request.formData();
  const file = form.get('file') as File;

  if (!file) {
    return NextResponse.json(
      { message: 'No file to upload.' },
      { status: 400 },
    );
  }

  const blob = await vercelBlob.put(file.name, file, { access: 'public' });

  return NextResponse.json(blob);
}

Uploading directly from browsers

The above example uploads a file through a Vercel route. This solution is limited to a 4 MB file size. To bypass this limit, you can upload a file directly from the client after generating a single-use token.

// /app/UploadForm.tsx

'use client';

import { put, type BlobResult } from '@vercel/blob';
import { useRef, useState } from 'react';

export default function UploadForm() {
  const inputFileRef = useRef<HTMLInputElement>(null);
  const [blob, setBlob] = useState<BlobResult | null>(null);
  return (
    <>
      <h1>App Router Client Upload</h1>

      <form
        onSubmit={async (event): Promise<void> => {
          event.preventDefault();

          const file = inputFileRef.current?.files?.[0];
          if (!file) {
            return;
          }

          const clientTokenData = (await fetch(
            '/api/generate-blob-client-token',
            {
              method: 'POST',
              body: JSON.stringify({
                pathname: file.name,
              }),
            },
          ).then((r) => r.json())) as { clientToken: string };

          const blobResult = await put(file.name, file, {
            access: 'public',
            token: clientTokenData.clientToken,
          });

          setBlob(blobResult);
        }}
      >
        <input name="file" ref={inputFileRef} type="file" />
        <button type="submit">Upload</button>
      </form>
      {blob && (
        <div>
          Blob url: <a href={blob.url}>{blob.url}</a>
        </div>
      )}
    </>
  );
}
// /app/api/generate-blob-client-token/route.ts

import {
  generateClientTokenFromReadWriteToken,
  type GenerateClientTokenOptions,
} from '@vercel/blob';
import { NextResponse } from 'next/server';

export async function POST(request: Request): Promise<NextResponse> {
  // On a real website, this route would be protected by authentication, see: https://nextjs.org/docs/pages/building-your-application/routing/authenticating
  // Here, we accept the `pathname` from the browser, but in some situations, you may even craft the pathname
  // based on the authentication result
  const body = (await request.json()) as { pathname: string };

  const timestamp = new Date();
  timestamp.setSeconds(timestamp.getSeconds() + 60);
  return NextResponse.json({
    clientToken: await generateClientTokenFromReadWriteToken({
      ...body,
      onUploadCompleted: {
        callbackUrl: `https://${
          process.env.VERCEL_URL ?? ''
        }/api/file-upload-completed`,
        metadata: JSON.stringify({ userId: 12345 }),
      },
      maximumSizeInBytes: 10_000_000, // 10 MB
      allowedContentTypes: ['text/plain'],
      validUntil: timestamp.getTime(), // defaults to 30 seconds when omitted
    }),
  });
}
// /app/api/file-upload-completed/route.ts

import {
  type BlobUploadCompletedEvent,
  verifyCallbackSignature,
} from '@vercel/blob';
import { NextResponse } from 'next/server';

export async function POST(request: Request): Promise<NextResponse> {
  const body = (await request.json()) as BlobUploadCompletedEvent;
  console.log(body);
  // { type: "blob.upload-completed", payload: { metadata: "{ foo: 'bar' }", blob: ... }}

  if (
    !(await verifyCallbackSignature({
      signature: request.headers.get('x-vercel-signature') ?? '',
      body: JSON.stringify(body),
    }))
  ) {
    return NextResponse.json(
      {
        response: 'invalid signature',
      },
      {
        status: 403,
      },
    );
  }
  const metadata = JSON.parse(body.payload.metadata as string) as {
    userId: number;
  };
  const blob = body.payload.blob;

  console.log(metadata.userId); // 12345
  console.log(blob); // { url: '...', size: ..., uploadedAt: ..., ... }

  return NextResponse.json({
    response: 'ok',
  });
}

How to list all your blobs

This will paginate through all your blobs in chunks of 1,000 blobs. You can control the number of blobs in each call with limit.

import * as vercelBlob from '@vercel/blob';

let hasMore = true;
let cursor: string | undefined;
while (hasMore) {
  const listResult = await vercelBlob.list({
    cursor,
  });
  console.log(listResult);
  hasMore = listResult.hasMore;
  cursor = listResult.cursor;
}

Error handling

All methods of this module throw when the request fails, for any of the following reasons:

  • missing parameters
  • a bad token, or a token that doesn't have access to the resource
  • unknown errors

You should account for this in your code by wrapping our methods in a try/catch block:

try {
  await vercelBlob.put('foo', 'bar', { access: 'public' });
} catch (error) {
  if (error instanceof vercelBlob.BlobAccessError) {
    // handle error
  } else {
    // rethrow
    throw error;
  }
}

Releasing

pnpm changeset
git commit -am "New version"

Once such a commit is merged into main, GitHub will open a versioning PR that you can merge, and the package will be automatically published to npm.

A note about Vercel file upload limitations

When using Serverless or Edge Functions on Vercel, the request body size is limited to 4MB.

When you want to send files larger than that to Vercel Blob, you can do so by using @vercel/blob from a regular Node.js script context (like at build time). This way the request body will be sent directly to Vercel Blob and not via an Edge or Serverless Function.

We plan to allow sending larger files to Vercel Blob from browser contexts soon.

Running examples locally

  • how to run examples locally (.env.local with token)
  • how to run examples on Vercel (vc deploy)
  • how to contribute (pnpm dev to rebuild, example uses local module)
  • for Vercel contributors, link on how to run the API locally (edge-functions readme link, wrangler dev, pnpm dev for module)

A note for Vite users

@vercel/blob reads the token from the environment variables on process.env. In general, process.env is automatically populated from your .env file during development, which is created when you run vc env pull. However, Vite does not expose the .env variables on process.env.

You can fix this in one of the following two ways:

  1. You can populate process.env yourself using something like dotenv-expand:
pnpm install --save-dev dotenv dotenv-expand
// vite.config.js
import dotenvExpand from 'dotenv-expand';
import { loadEnv, defineConfig } from 'vite';

export default defineConfig(({ mode }) => {
  // This check is important!
  if (mode === 'development') {
    const env = loadEnv(mode, process.cwd(), '');
    dotenvExpand.expand({ parsed: env });
  }

  return {
    ...
  };
});
  2. You can provide the credentials explicitly, instead of relying on a zero-config setup. For example, this is how you could create a client in SvelteKit, which makes private environment variables available via $env/static/private:
import { head } from '@vercel/blob';
+ import { BLOB_TOKEN } from '$env/static/private';

const blob = await head('filepath', {
-  token: '<token>',
+  token: BLOB_TOKEN,
});