JSPM

  • Downloads 15224
  • License MIT

Download selected files from an Amazon S3 bucket as a zip file.

Package Exports

  • s3-zip

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to file an issue with the original package (s3-zip) asking for "exports" field support. If that is not possible, create a JSPM override to customize the exports field for this package.
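A minimal sketch of what such an override could look like, assuming it is written as a partial package.json that declares the detected subpath, and assuming ./s3-zip.js is the package entry point (both the file name and the exact override mechanism are assumptions, not taken from this package):

{
  "exports": {
    ".": "./s3-zip.js"
  }
}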

Readme

s3-zip


Download selected files from an Amazon S3 bucket as a zip file.

Install

npm install s3-zip

AWS Configuration

Refer to the AWS SDK documentation for authenticating with AWS before using this plugin.
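
For example, a minimal sketch using the AWS SDK for JavaScript v2, with placeholder region and credentials (loading credentials from environment variables or a shared credentials file works just as well):

var AWS = require('aws-sdk')

// Placeholder values; prefer environment variables or a shared
// credentials file over hard-coding credentials.
AWS.config.update({
  region: 'us-east-1',
  accessKeyId: 'YOUR_ACCESS_KEY_ID',
  secretAccessKey: 'YOUR_SECRET_ACCESS_KEY'
})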

Usage

Zip specific files

var fs = require('fs')
var join = require('path').join
var s3Zip = require('s3-zip')

var region = 'bucket-region'
var bucket = 'name-of-s3-bucket'
var folder = 'name-of-bucket-folder/'
var file1 = 'Image A.png'
var file2 = 'Image B.png'
var file3 = 'Image C.png'
var file4 = 'Image D.png'

var output = fs.createWriteStream(join(__dirname, 'use-s3-zip.zip'))

s3Zip
  .archive({ region: region, bucket: bucket }, folder, [file1, file2, file3, file4])
  .pipe(output)

You can also pass a custom S3 client. For example, if you want to zip files from an S3-compatible storage service:

var aws = require('aws-sdk')

var s3Client = new aws.S3({
  signatureVersion: 'v4',
  s3ForcePathStyle: true,
  endpoint: 'http://localhost:9000',
})

s3Zip
  .archive({ s3: s3Client, bucket: bucket }, folder, [file1, file2])
  .pipe(output)

Zip files with AWS Lambda

s3-zip can also be used in combination with AWS Lambda, for example to build an archive on demand.
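
A minimal sketch of such a handler, assuming the archive is streamed back into the same bucket via an S3 upload (the bucket, folder, file names and output key below are placeholders, not values from this package):

var AWS = require('aws-sdk')
var stream = require('stream')
var s3Zip = require('s3-zip')

var s3 = new AWS.S3()

exports.handler = function (event, context, callback) {
  // Placeholder bucket, folder and file names
  var bucket = 'name-of-s3-bucket'
  var folder = 'name-of-bucket-folder/'
  var files = ['Image A.png', 'Image B.png']

  // Stream the archive straight back into S3 instead of writing to disk
  var body = new stream.PassThrough()
  s3.upload({ Bucket: bucket, Key: folder + 'archive.zip', Body: body }, function (err, data) {
    callback(err, data)
  })

  s3Zip
    .archive({ s3: s3, bucket: bucket }, folder, files)
    .pipe(body)
}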

Zip a whole bucket folder

var fs = require('fs')
var join = require('path').join
var AWS = require('aws-sdk')
var s3Zip = require('s3-zip')
var XmlStream = require('xml-stream')

var region = 'bucket-region'
var bucket = 'name-of-s3-bucket'
var folder = 'name-of-bucket-folder/'
var s3 = new AWS.S3({ region: region })
var params = {
  Bucket: bucket,
  Prefix: folder
}

var filesArray = []
// listObjects returns XML; stream it and collect the object keys
var files = s3.listObjects(params).createReadStream()
var xml = new XmlStream(files)
xml.collect('Key')
xml.on('endElement: Key', function (item) {
  // Strip the folder prefix so paths inside the archive stay relative
  filesArray.push(item['$text'].substr(folder.length))
})

xml
  .on('end', function () {
    zip(filesArray)
  })

function zip(files) {
  console.log(files)
  var output = fs.createWriteStream(join(__dirname, 'use-s3-zip.zip'))
  s3Zip
   .archive({ region: region, bucket: bucket, preserveFolderStructure: true }, folder, files)
   .pipe(output)
}

Tar format support

s3Zip
  .setFormat('tar')
  .archive({ region: region, bucket: bucket }, folder, [file1, file2])
  .pipe(output)

Archiver options

We use archiver to create archives. To pass your own options to it, use the setArchiverOptions method:

s3Zip
  .setFormat('tar')
  .setArchiverOptions({ gzip: true })
  .archive({ region: region, bucket: bucket }, folder, [file1, file2])

Organize your archive with custom paths and permissions

You can pass an array of archiver EntryData objects to organize your archive.

var files = ['flower.jpg', 'road.jpg'];
var archiveFiles = [
  { name: 'newFolder/flower.jpg' },

  /* -rw------- */
  { name: 'road.jpg', mode: parseInt('0600', 8) }
];
s3Zip.archive({ region: region, bucket: bucket }, folder, files, archiveFiles)

Debug mode

Enable debug mode to see the logs:

s3Zip.archive({ region: region, bucket: bucket, debug: true }, folder, files)

Testing

Tests are written in Node Tap. Run them like this:

npm t

If you would like a fancier coverage report:

npm run coverage