Package Exports
- @adobe/httptransfer
- @adobe/httptransfer/es2015
- @adobe/httptransfer/es2015/index.js
- @adobe/httptransfer/es2015/logger
- @adobe/httptransfer/es2015/logger.js
- @adobe/httptransfer/index.js
- @adobe/httptransfer/lib/logger
- @adobe/httptransfer/lib/logger.js
This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (@adobe/httptransfer) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
Readme
node-httptransfer
Introduction
The node-httptransfer package is designed to easily and correctly transfer file content to and from HTTP(S) urls, and between HTTP(S) urls.
The lower-level stream API allows you to transfer content from a URL to any writable stream, and similarly transfer any readable stream to a URL.
The higher-level file API abstracts away the streams and allows you to transfer content to/from files on disk.
The node-httptransfer package requires Node.js async/await support and is built on the node-fetch-npm package.
To use block transfer capabilities, the upload/download servers must support the range header.
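One way to verify Range support before relying on block transfers is to request a single byte and check for a 206 response. This is a sketch, not part of the library; it assumes node-fetch is installed:
const fetch = require('node-fetch');

// Ask for the first byte only; a server that honors the Range header
// replies with 206 Partial Content rather than 200 OK.
async function supportsRange(url) {
    const response = await fetch(url, { headers: { Range: 'bytes=0-0' } });
    return response.status === 206;
}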
Installation
npm i @adobe/httptransfer
Using streams
Download a stream:
const fs = require('fs');
const { downloadStream } = require('@adobe/httptransfer');
async function main() {
    const stream = fs.createWriteStream('test.png');
    await downloadStream('http://my.server.com/test.png', stream);
}
Upload a stream using PUT:
const fs = require('fs');
const { uploadStream } = require('@adobe/httptransfer');
async function main() {
    const stream = fs.createReadStream('test.png');
    await uploadStream(stream, 'http://my.server.com/test.png');
}
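Because downloadStream accepts any writable stream, it composes with Node transforms. A minimal sketch that compresses content as it downloads (the gzip pipeline is an illustration, not a library feature):
const fs = require('fs');
const zlib = require('zlib');
const { downloadStream } = require('@adobe/httptransfer');

async function main() {
    // Pipe the gzip transform into a file sink, then hand the transform
    // to downloadStream as the download target.
    const gzip = zlib.createGzip();
    gzip.pipe(fs.createWriteStream('test.png.gz'));
    await downloadStream('http://my.server.com/test.png', gzip);
}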
Using files
Download a file:
const { downloadFile } = require('@adobe/httptransfer');
async function main() {
    await downloadFile('http://my.server.com/test.png', 'test.png');
}
Upload a file using PUT:
const { uploadFile } = require('@adobe/httptransfer');
async function main() {
    await uploadFile('test.png', 'http://my.server.com/test.png');
}
Upload a file to multiple URLs using PUT (used by AEM multi-part upload):
const { uploadAEMMultipartFile } = require('@adobe/httptransfer');
async function main() {
    await uploadAEMMultipartFile('test.png', {
        urls: [ "http://my.server.com/test.png.1", "http://my.server.com/test.png.2" ],
        maxPartSize: 1000000
    });
}
Using block upload/download to upload/download files concurrently
Download a file:
const { downloadFileConcurrently } = require('@adobe/httptransfer');
async function main() {
    await downloadFileConcurrently('http://my.server.com/test.png', 'test.png');
}
Upload a file using PUT:
const { uploadFileConcurrently } = require('@adobe/httptransfer');
async function main() {
    await uploadFileConcurrently('test.png', 'http://my.server.com/test.png');
}
Upload a file to multiple URLs using PUT (used by AEM multi-part upload):
const { uploadMultiPartFileConcurrently } = require('@adobe/httptransfer');
async function main() {
    await uploadMultiPartFileConcurrently('test.png', {
        urls: [ "http://my.server.com/test.png.1", "http://my.server.com/test.png.2" ],
        maxPartSize: 1000000
    });
}
Upload multiple files to multiple URLs using PUT:
const { uploadFilesConcurrently } = require('@adobe/httptransfer');
async function main() {
    await uploadFilesConcurrently([{
        filepath: 'file1.png',
        target: {
            urls: [ "http://my.server.com/file1.png.1", "http://my.server.com/file1.png.2" ],
            maxPartSize: 1000000
        }
    }, {
        filepath: 'file2.png',
        target: {
            urls: [ "http://my.server.com/file2.png.1", "http://my.server.com/file2.png.2" ],
            maxPartSize: 1000000
        }
    }]);
}
Assuming test.png is 1,800,000 bytes, the multi-part examples above will upload the first 1,000,000 bytes to http://my.server.com/test.png.1 and the next 800,000 bytes to http://my.server.com/test.png.2.
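Since each part holds at most maxPartSize bytes, the urls array needs at least ceil(fileSize / maxPartSize) entries. A hypothetical helper to size the array (requiredParts is not part of the library):
const fs = require('fs');

// Hypothetical helper: the number of part URLs needed so that
// urls.length * maxPartSize covers the whole file.
function requiredParts(filepath, maxPartSize) {
    const { size } = fs.statSync(filepath);
    return Math.ceil(size / maxPartSize);
}

// requiredParts('test.png', 1000000) === 2 for a 1,800,000 byte file.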
Debugging
To enable debug output when using node-httptransfer library, set the DEBUG environment variable to httptransfer:*.
You can also specify a specific log level, as defined in ./lib/logger.js, e.g.:
$ DEBUG='httptransfer:warn' npm run test
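If exporting the variable in the shell is inconvenient, setting it from code before the module is first required usually works with debug-style loggers (an assumption about load order, not a documented API):
// Must run before @adobe/httptransfer is required, since debug-style
// loggers read process.env.DEBUG when they are first loaded.
process.env.DEBUG = 'httptransfer:*';
const { downloadFile } = require('@adobe/httptransfer');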
Testbed
A CLI tool testbed is provided to try out the node-httptransfer functionality. It supports uploading, downloading, and transferring file content. It also supports Azure Blob storage through Shared Access Signature (SAS) urls created on the fly.
The tool is not intended to be useful on its own, only to test out new features or debug issues.
Build
cd testbed/
npm install
Azure credentials
export AZURE_STORAGE_ACCOUNT=<storage account name from https://portal.azure.com>
export AZURE_STORAGE_KEY=<storage key from https://portal.azure.com>
Examples
Download an image from a website:
node index.js https://website.com/path/to/image.gif image.gif
Download blob.txt from azure:
node index.js azure://container/path/to/blob.txt blob.txt
Upload blob.txt to azure:
node index.js blob.txt azure://container/path/to/blob.txt
Upload blob.txt in 10,000 byte blocks:
node index.js --max 10000 blob.txt azure://container/path/to/blob.txt
Copy blob.txt within a container:
node index.js azure://container/path/to/blob.txt azure://container/path/to/target.txt
End-to-End Tests
The module provides some integration-style tests for verifying basic transfer functionality with an Adobe Experience Manager instance. To run the tests:
- Create a .env file by following the instructions in .env_example.
- Run the tests by executing npm run e2e from the root directory of the repository.
End-to-End block upload/download Tests
If you want to run just the block upload/download tests, you only need Azure credentials:
export AZURE_STORAGE_ACCOUNT=<storage account name from https://portal.azure.com>
export AZURE_STORAGE_KEY=<storage key from https://portal.azure.com>
export AZURE_STORAGE_CONTAINER_NAME=<storage container name>
Then run npm run e2e-block.
Contributing
Contributions are welcome! Read the Contributing Guide for more information.
Licensing
This project is licensed under the Apache V2 License. See LICENSE for more information.