Package Exports
- csv-stringify
- csv-stringify/lib
This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (csv-stringify) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
Readme
This project is part of the CSV module
and provides a stringifier that converts records into CSV text, implementing the
Node.js stream.Transform
API. It also provides a simple callback-based API
for convenience. It is both extremely easy to use and powerful. It was
first released in 2010 and is tested against very large datasets by a large
community.
Documentation for the "csv-stringify" package is available here.
Note
This module should be considered in an alpha stage. It is part of an ongoing effort to split the current CSV module into complementary modules with a cleaner design and the latest stream implementation. However, the code has been imported with very few changes and you should feel confident using it in your code.
Usage
Run npm install csv to install the full csv module or run npm install csv-stringify if you are only interested in the CSV stringifier.
Use the callback-style API for simplicity or the stream-based API for scalability.
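However it is installed, the stringifier is loaded with require. The snippet below is a minimal sketch; the second form assumes the full csv module re-exports the stringifier as a stringify property, which is not shown in this readme.
// When installed directly:
var stringify = require('csv-stringify');
// When the full csv module is installed (assumption: it exposes the
// stringifier as a property of the parent package):
// var stringify = require('csv').stringify;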
Using the callback API
The stringifier receives an array and returns a string inside a user-provided
callback. This example is available with the command node samples/callback.js.
var stringify = require('csv-stringify');
var input = [ [ '1', '2', '3', '4' ], [ 'a', 'b', 'c', 'd' ] ];
stringify(input, function(err, output){
  // The assertion relies on the "should" library used by the samples
  output.should.eql('1,2,3,4\na,b,c,d');
});
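An options object may also be passed between the input and the callback. The sketch below is illustrative only and assumes the delimiter option described in the package documentation.
var stringify = require('csv-stringify');
var input = [ [ '1', '2', '3', '4' ], [ 'a', 'b', 'c', 'd' ] ];
// Assumption: the delimiter option switches the field separator
stringify(input, {delimiter: ';'}, function(err, output){
  if(err) throw err;
  console.log(output); // expected: '1;2;3;4\na;b;c;d'
});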
Using the stream API
// node samples/stream.js
var stringify = require('csv-stringify');
var data = '';
var stringifier = stringify({delimiter: ':'});
stringifier.on('readable', function(){
  var row;
  while(row = stringifier.read()){
    data += row;
  }
});
stringifier.on('error', function(err){
  console.log(err.message);
});
stringifier.on('finish', function(){
  data.should.eql(
    "root:x:0:0:root:/root:/bin/bash\n" +
    "someone:x:1022:1022:a funny cat:/home/someone:/bin/bash"
  );
});
stringifier.write([ 'root','x','0','0','root','/root','/bin/bash' ]);
stringifier.write([ 'someone','x','1022','1022','a funny cat','/home/someone','/bin/bash' ]);
stringifier.end();
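Records may also be written as objects rather than arrays when a columns option is provided. The sketch below is an assumption based on the package documentation, not a sample shipped with this module: it assumes the columns option selects and orders object properties in the output.
var stringify = require('csv-stringify');
var data = '';
// Assumption: the columns option maps object properties to output fields
var stringifier = stringify({columns: ['name', 'shell'], delimiter: ':'});
stringifier.on('readable', function(){
  var row;
  while(row = stringifier.read()){
    data += row;
  }
});
stringifier.on('finish', function(){
  console.log(data); // expected: "root:/bin/bash\nsomeone:/bin/bash"
});
stringifier.write({name: 'root', shell: '/bin/bash'});
stringifier.write({name: 'someone', shell: '/bin/bash'});
stringifier.end();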
Using the pipe function
One useful function of the Stream API is pipe, which connects
multiple streams. You may use this function to pipe a stream.Readable
array or object source to a stream.Writable
string destination. The next example,
available as node samples/pipe.js,
generates records, stringifies them and prints
them to stdout.
var stringify = require('csv-stringify');
var generate = require('csv-generate');
var generator = generate({objectMode: true, seed: 1, headers: 2});
var stringifier = stringify();
generator.pipe(stringifier).pipe(process.stdout);
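The same pattern works with any writable destination. The sketch below replaces process.stdout with a file stream created through the standard Node.js fs module; the output file name is only an example.
var fs = require('fs');
var stringify = require('csv-stringify');
var generate = require('csv-generate');
var generator = generate({objectMode: true, seed: 1, headers: 2});
var stringifier = stringify();
// Write the stringified records to a file instead of stdout
generator.pipe(stringifier).pipe(fs.createWriteStream('out.csv'));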
Migration
Most of the code is imported from its parent project CSV in an effort to split it between the generator, the parser, the transformer and the stringifier.
Development
Tests are executed with mocha. To install it, simply run npm install
followed by npm test. It will install mocha and its dependencies in your
project's "node_modules" directory and run the test suite. The tests run
against the CoffeeScript source files.
To generate the JavaScript files, run make build.
The test suite is run online with Travis against Node.js versions 0.9, 0.10 and 0.11.
Contributors
- David Worms: https://github.com/wdavidw