Package Exports
- asar
- asar/lib/disk
- asar/lib/disk.js
- asar/lib/filesystem
This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (asar) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
Readme
asar - Atom-Shell Archive
Asar is a simple extensive archive format; it works like tar in that it concatenates
all files together without compression, while still supporting random access.
Features
- Support random access
- Use JSON to store files' information
- Very easy to write a parser
Command line utility
Install
$ npm install asar
Usage
$ asar --help
Usage: asar [options] [command]
Commands:
pack|p <dir> <output>
create asar archive
list|l <archive>
list files of asar archive
extract-file|ef <archive> <filename>
extract one file from archive
Options:
-h, --help output usage information
-V, --version output the version number
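Besides the command line utility, the package can be used programmatically from
Node.js. A minimal sketch, assuming the callback-based createPackage of older
asar releases (newer versions return a Promise instead):
```js
const asar = require('asar');

// Pack a directory into an archive, then inspect and unpack it.
asar.createPackage('app/', 'app.asar', function () {
  // List every file path recorded in the archive's header.
  console.log(asar.listPackage('app.asar'));

  // Read a single file out as a Buffer without extracting everything.
  const mainJs = asar.extractFile('app.asar', 'main.js');
  console.log(mainJs.length);

  // Extract the whole archive into a directory.
  asar.extractAll('app.asar', 'unpacked/');
});
```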
Format
Asar uses Pickle to safely serialize binary values to a file; there is
also a Node.js binding of the Pickle class.
The format of asar is very flat:
| UInt32: header_size | String: header | Bytes: file1 | ... | Bytes: file42 |
The header_size and header are serialized with the Pickle class, and
header_size's Pickle object is 8 bytes.
The header is a JSON string, and the header_size is the size of header's
Pickle object.
The structure of the header looks like this:
{
"files": {
"tmp": {
"files": {}
},
"usr" : {
"files": {
"bin": {
"files": {
"ls": {
"offset": "0",
"size": 100,
"executable": true
},
"cd": {
"offset": "100",
"size": 100,
"executable": true
}
}
}
}
},
"etc": {
"files": {
"hosts": {
"offset": "200",
"size": 32
}
}
}
}
}
offset and size record the information needed to read the file from the archive; the
offset starts from 0, so you have to manually add the size of header_size and
header to the offset to get the real offset of the file.
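To make the layout concrete, here is a minimal sketch (not part of the asar API)
that reads the header JSON by hand, relying only on the Pickle framing described
above: each Pickle object starts with a UInt32 payload length, and a string
payload carries its own UInt32 length prefix.
```js
const fs = require('fs');

// Minimal sketch: read the JSON header out of an archive by hand.
function readAsarHeader(archivePath) {
  const fd = fs.openSync(archivePath, 'r');

  // First Pickle object (8 bytes): UInt32 payload length, then the
  // UInt32 header_size value itself.
  const sizeBuf = Buffer.alloc(8);
  fs.readSync(fd, sizeBuf, 0, 8, 0);
  const headerSize = sizeBuf.readUInt32LE(4);

  // Second Pickle object (header_size bytes): UInt32 payload length,
  // UInt32 string length, then the JSON string.
  const headerBuf = Buffer.alloc(headerSize);
  fs.readSync(fd, headerBuf, 0, headerSize, 8);
  fs.closeSync(fd);

  const jsonLength = headerBuf.readUInt32LE(4);
  const header = JSON.parse(headerBuf.toString('utf8', 8, 8 + jsonLength));

  // File contents start right after both Pickle objects.
  return { header, contentOffset: 8 + headerSize };
}
```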
offset is a UINT64 number represented as a string, because there is no way to
precisely represent a UINT64 in a JavaScript Number. size is kept as a UINT32
number for the same reason, which means we cannot store a file larger than 4.2GB
(though the archive itself has no size limitation). We didn't store size
as a UINT64 because file size in Node.js is represented as a Number, and it is not
safe to convert a Number to a UINT64.
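Putting this together, a hypothetical helper built on the readAsarHeader sketch
above could locate one entry and read its bytes, converting the string offset
back to a Number:
```js
// Hypothetical helper: walk the nested "files" entries of the header and
// read one file's contents using its offset and size.
function readFileFromAsar(archivePath, filePath) {
  const { header, contentOffset } = readAsarHeader(archivePath);

  // e.g. 'usr/bin/ls' -> header.files.usr.files.bin.files.ls
  let node = header;
  for (const part of filePath.split('/')) {
    node = node.files[part];
  }

  // offset is a string, relative to the end of the header.
  const start = contentOffset + Number(node.offset);
  const buf = Buffer.alloc(node.size);
  const fd = fs.openSync(archivePath, 'r');
  fs.readSync(fd, buf, 0, node.size, start);
  fs.closeSync(fd);
  return buf;
}

// readFileFromAsar('example.asar', 'usr/bin/ls')
```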