
Commit

Merge branch 'master' into master
ZJONSSON committed Apr 14, 2024
2 parents 6444593 + 7c4604e commit c10bf09
Showing 26 changed files with 422 additions and 233 deletions.
13 changes: 0 additions & 13 deletions .circleci/config.yml

This file was deleted.

50 changes: 50 additions & 0 deletions .github/workflows/coverage.yml
@@ -0,0 +1,50 @@
name: Deploy coverage report to Pages

on:
# Runs on pushes targeting the default branch
push:
branches: ["master"]

# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:

# Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
permissions:
contents: read
pages: write
id-token: write

# Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued.
# However, do NOT cancel in-progress runs as we want to allow these production deployments to complete.
concurrency:
group: "pages"
cancel-in-progress: false

jobs:
# Single deploy job since we're just deploying
coverage:
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Use Node.js 18
uses: actions/setup-node@v3
with:
node-version: '18.x'
- run: npm install
- run: npm run build --if-present
- run: npm test
- run: npx lcov-badge2 .tap/report/lcov.info -o .tap/report/badge.svg
- name: Setup Pages
uses: actions/configure-pages@v3
- name: Upload artifact
uses: actions/upload-pages-artifact@v2
with:
path: '.tap/report'
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v2

26 changes: 26 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,26 @@
name: Node.js CI

on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
workflow_dispatch:

jobs:
test:
runs-on: ubuntu-latest

strategy:
matrix:
node-version: [18.x]

steps:
- uses: actions/checkout@v3
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v3
with:
node-version: ${{ matrix.node-version }}
- run: npm install
- run: npm run build --if-present
- run: npm test
2 changes: 2 additions & 0 deletions .gitignore
@@ -3,3 +3,5 @@
/test.js
/.nyc_output/
/coverage/
.tap/
package-lock.json
7 changes: 3 additions & 4 deletions README.md
@@ -1,7 +1,6 @@
[![NPM Version][npm-image]][npm-url]
[![NPM Downloads][downloads-image]][downloads-url]
[![Test Coverage][travis-image]][travis-url]
[![Coverage][coverage-image]][coverage-url]
[![code coverage](https://zjonsson.github.io/node-unzipper/badge.svg)](https://zjonsson.github.io/node-unzipper/)

[npm-image]: https://img.shields.io/npm/v/unzipper.svg
[npm-url]: https://npmjs.org/package/unzipper
@@ -43,7 +42,7 @@ fs.createReadStream('path/to/archive.zip')
.pipe(unzipper.Extract({ path: 'output/path' }));
```

Extract emits the 'close' event once the zip's contents have been fully extracted to disk. `Extract` uses [fstream.Writer](https://www.npmjs.com/package/fstream) and therefore needs need an absolute path to the destination directory. This directory will be automatically created if it doesn't already exits.
Extract emits the 'close' event once the zip's contents have been fully extracted to disk. `Extract` uses [fstream.Writer](https://www.npmjs.com/package/fstream) and therefore needs an absolute path to the destination directory. This directory will be automatically created if it doesn't already exist.

### Parse zip file contents

@@ -207,7 +206,7 @@ fs.createReadStream('path/to/archive.zip')
## Open
Previous methods rely on the entire zipfile being received through a pipe. The Open methods take a different approach: load the central directory first (it sits at the end of the zipfile) and provide the ability to pick and choose which files to extract, even extracting them in parallel. The Open methods return a promise on the contents of the directory, with individual `files` listed in an array. Each file element has the following methods:
* `stream([password])` - returns a stream of the unzipped content which can be piped to any destination
* `buffer([password])` - returns a promise on the buffered content of the file)
* `buffer([password])` - returns a promise on the buffered content of the file.
If the file is encrypted you will have to supply a password to decrypt it; otherwise you can leave it blank.
Unlike `adm-zip`, the Open methods will never read the entire zipfile into a buffer.

12 changes: 0 additions & 12 deletions lib/Buffer.js

This file was deleted.

5 changes: 0 additions & 5 deletions lib/BufferStream.js
@@ -1,10 +1,5 @@
var Promise = require('bluebird');
var Stream = require('stream');
var Buffer = require('./Buffer');

// Backwards compatibility for node versions < 8
if (!Stream.Writable || !Stream.Writable.prototype.destroy)
Stream = require('readable-stream');

module.exports = function(entry) {
return new Promise(function(resolve,reject) {
4 changes: 0 additions & 4 deletions lib/Decrypt.js
@@ -1,10 +1,6 @@
var bigInt = require('big-integer');
var Stream = require('stream');

// Backwards compatibility for node versions < 8
if (!Stream.Writable || !Stream.Writable.prototype.destroy)
Stream = require('readable-stream');

var table;

function generateTable() {
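The truncated Decrypt.js hunk above declares a lazily built `table` and a `generateTable()` function. ZIP's traditional "ZipCrypto" key-update scheme keys off the standard CRC-32 lookup table (polynomial 0xEDB88320); a minimal sketch of equivalent table generation, assumed rather than copied from the library:

```javascript
// Sketch (assumed equivalent, not the library's exact code): build the
// standard CRC-32 lookup table used by ZipCrypto's key updates.
function generateCrcTable() {
  var table = [];
  for (var n = 0; n < 256; n++) {
    var c = n;
    for (var k = 0; k < 8; k++) {
      c = (c & 1) ? (0xEDB88320 ^ (c >>> 1)) : (c >>> 1);
    }
    table[n] = c >>> 0; // keep unsigned
  }
  return table;
}
```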
5 changes: 0 additions & 5 deletions lib/NoopStream.js
@@ -1,10 +1,5 @@
var Stream = require('stream');
var util = require('util');

// Backwards compatibility for node versions < 8
if (!Stream.Writable || !Stream.Writable.prototype.destroy)
Stream = require('readable-stream');

function NoopStream() {
if (!(this instanceof NoopStream)) {
return new NoopStream();
107 changes: 53 additions & 54 deletions lib/Open/directory.js
@@ -1,13 +1,12 @@
var binary = require('binary');
var PullStream = require('../PullStream');
var unzip = require('./unzip');
var Promise = require('bluebird');
var BufferStream = require('../BufferStream');
var parseExtraField = require('../parseExtraField');
var Buffer = require('../Buffer');
var path = require('path');
var Writer = require('fstream').Writer;
var parseDateTime = require('../parseDateTime');
var parseBuffer = require('../parseBuffer');

var signature = Buffer.alloc(4);
signature.writeUInt32LE(0x06054b50,0);
@@ -20,11 +19,11 @@ function getCrxHeader(source) {
if (signature === 0x34327243) {
var crxHeader;
return sourceStream.pull(12).then(function(data) {
crxHeader = binary.parse(data)
.word32lu('version')
.word32lu('pubKeyLength')
.word32lu('signatureLength')
.vars;
crxHeader = parseBuffer.parse(data, [
['version', 4],
['pubKeyLength', 4],
['signatureLength', 4],
]);
}).then(function() {
return sourceStream.pull(crxHeader.pubKeyLength + crxHeader.signatureLength);
}).then(function(data) {
@@ -39,12 +38,12 @@ function getCrxHeader(source) {

// Zip64 File Format Notes: https://pkware.cachefly.net/webdocs/casestudies/APPNOTE.TXT
function getZip64CentralDirectory(source, zip64CDL) {
var d64loc = binary.parse(zip64CDL)
.word32lu('signature')
.word32lu('diskNumber')
.word64lu('offsetToStartOfCentralDirectory')
.word32lu('numberOfDisks')
.vars;
var d64loc = parseBuffer.parse(zip64CDL, [
['signature', 4],
['diskNumber', 4],
['offsetToStartOfCentralDirectory', 8],
['numberOfDisks', 4],
]);

if (d64loc.signature != 0x07064b50) {
throw new Error('invalid zip64 end of central dir locator signature (0x07064b50): 0x' + d64loc.signature.toString(16));
@@ -58,18 +57,18 @@

// Zip64 File Format Notes: https://pkware.cachefly.net/webdocs/casestudies/APPNOTE.TXT
function parseZip64DirRecord (dir64record) {
var vars = binary.parse(dir64record)
.word32lu('signature')
.word64lu('sizeOfCentralDirectory')
.word16lu('version')
.word16lu('versionsNeededToExtract')
.word32lu('diskNumber')
.word32lu('diskStart')
.word64lu('numberOfRecordsOnDisk')
.word64lu('numberOfRecords')
.word64lu('sizeOfCentralDirectory')
.word64lu('offsetToStartOfCentralDirectory')
.vars;
var vars = parseBuffer.parse(dir64record, [
['signature', 4],
['sizeOfCentralDirectory', 8],
['version', 2],
['versionsNeededToExtract', 2],
['diskNumber', 4],
['diskStart', 4],
['numberOfRecordsOnDisk', 8],
['numberOfRecords', 8],
['sizeOfCentralDirectory', 8],
['offsetToStartOfCentralDirectory', 8],
]);

if (vars.signature != 0x06064b50) {
throw new Error('invalid zip64 end of central dir locator signature (0x06064b50): 0x0' + vars.signature.toString(16));
@@ -107,16 +106,16 @@ module.exports = function centralDirectory(source, options) {
var data = d.directory;
startOffset = d.crxHeader && d.crxHeader.size || 0;

vars = binary.parse(data)
.word32lu('signature')
.word16lu('diskNumber')
.word16lu('diskStart')
.word16lu('numberOfRecordsOnDisk')
.word16lu('numberOfRecords')
.word32lu('sizeOfCentralDirectory')
.word32lu('offsetToStartOfCentralDirectory')
.word16lu('commentLength')
.vars;
vars = parseBuffer.parse(data, [
['signature', 4],
['diskNumber', 2],
['diskStart', 2],
['numberOfRecordsOnDisk', 2],
['numberOfRecords', 2],
['sizeOfCentralDirectory', 4],
['offsetToStartOfCentralDirectory', 4],
['commentLength', 2],
]);

// Is this zip file using zip64 format? Use same check as Go:
// https://github.com/golang/go/blob/master/src/archive/zip/reader.go#L503
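For context, the end-of-central-directory record parsed here must first be located by scanning backwards from the end of the file for its 0x06054b50 signature, since a trailing archive comment of up to 65535 bytes may follow it. A hypothetical standalone helper illustrating the approach (not the library's actual code, which streams rather than scanning a whole buffer):

```javascript
// Hypothetical helper: find the end-of-central-directory record by
// scanning backwards for its signature. The minimum record size is
// 22 bytes, so the scan starts that far from the end.
function findEndOfCentralDirectory(buffer) {
  for (var i = buffer.length - 22; i >= 0; i--) {
    if (buffer.readUInt32LE(i) === 0x06054b50) return i;
  }
  return -1; // no EOCD signature: not a (complete) zip file
}
```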
@@ -179,25 +178,25 @@ module.exports = function centralDirectory(source, options) {

vars.files = Promise.mapSeries(Array(vars.numberOfRecords),function() {
return records.pull(46).then(function(data) {
var vars = binary.parse(data)
.word32lu('signature')
.word16lu('versionMadeBy')
.word16lu('versionsNeededToExtract')
.word16lu('flags')
.word16lu('compressionMethod')
.word16lu('lastModifiedTime')
.word16lu('lastModifiedDate')
.word32lu('crc32')
.word32lu('compressedSize')
.word32lu('uncompressedSize')
.word16lu('fileNameLength')
.word16lu('extraFieldLength')
.word16lu('fileCommentLength')
.word16lu('diskNumber')
.word16lu('internalFileAttributes')
.word32lu('externalFileAttributes')
.word32lu('offsetToLocalFileHeader')
.vars;
var vars = parseBuffer.parse(data, [
['signature', 4],
['versionMadeBy', 2],
['versionsNeededToExtract', 2],
['flags', 2],
['compressionMethod', 2],
['lastModifiedTime', 2],
['lastModifiedDate', 2],
['crc32', 4],
['compressedSize', 4],
['uncompressedSize', 4],
['fileNameLength', 2],
['extraFieldLength', 2],
['fileCommentLength', 2],
['diskNumber', 2],
['internalFileAttributes', 2],
['externalFileAttributes', 4],
['offsetToLocalFileHeader', 4],
]);

vars.offsetToLocalFileHeader += startOffset;
vars.lastModifiedDateTime = parseDateTime(vars.lastModifiedDate, vars.lastModifiedTime);
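The recurring `parseBuffer.parse(data, [['name', width], ...])` calls replacing the old `binary.parse` chains suggest a small fixed-layout reader of consecutive little-endian unsigned fields. A hypothetical sketch of such a helper, inferred purely from how it is called in this diff (the actual `lib/parseBuffer.js` may differ):

```javascript
// Sketch inferred from usage: read consecutive little-endian unsigned
// fields of the given byte widths into a plain object, mirroring the
// removed binary.parse() chains.
function parse(buffer, format) {
  var result = {};
  var offset = 0;
  format.forEach(function(entry) {
    var name = entry[0], width = entry[1];
    if (width === 8) {
      // zip64 fields; values above 2^53 would lose precision, but
      // real-world central directory offsets stay far below that.
      result[name] = buffer.readUInt32LE(offset) +
                     buffer.readUInt32LE(offset + 4) * 0x100000000;
    } else if (width === 4) {
      result[name] = buffer.readUInt32LE(offset);
    } else {
      result[name] = buffer.readUInt16LE(offset);
    }
    offset += width;
  });
  return result;
}
```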
4 changes: 0 additions & 4 deletions lib/Open/index.js
@@ -3,10 +3,6 @@ var Promise = require('bluebird');
var directory = require('./directory');
var Stream = require('stream');

// Backwards compatibility for node versions < 8
if (!Stream.Writable || !Stream.Writable.prototype.destroy)
Stream = require('readable-stream');

module.exports = {
buffer: function(buffer, options) {
var source = {
33 changes: 14 additions & 19 deletions lib/Open/unzip.js
@@ -2,15 +2,10 @@ var Promise = require('bluebird');
var Decrypt = require('../Decrypt');
var PullStream = require('../PullStream');
var Stream = require('stream');
var binary = require('binary');
var zlib = require('zlib');
var parseExtraField = require('../parseExtraField');
var Buffer = require('../Buffer');
var parseDateTime = require('../parseDateTime');

// Backwards compatibility for node versions < 8
if (!Stream.Writable || !Stream.Writable.prototype.destroy)
Stream = require('readable-stream');
var parseBuffer = require('../parseBuffer');

module.exports = function unzip(source,offset,_password, directoryVars) {
var file = PullStream(),
@@ -23,19 +18,19 @@ module.exports = function unzip(source,offset,_password, directoryVars) {

entry.vars = file.pull(30)
.then(function(data) {
var vars = binary.parse(data)
.word32lu('signature')
.word16lu('versionsNeededToExtract')
.word16lu('flags')
.word16lu('compressionMethod')
.word16lu('lastModifiedTime')
.word16lu('lastModifiedDate')
.word32lu('crc32')
.word32lu('compressedSize')
.word32lu('uncompressedSize')
.word16lu('fileNameLength')
.word16lu('extraFieldLength')
.vars;
var vars = parseBuffer.parse(data, [
['signature', 4],
['versionsNeededToExtract', 2],
['flags', 2],
['compressionMethod', 2],
['lastModifiedTime', 2],
['lastModifiedDate', 2],
['crc32', 4],
['compressedSize', 4],
['uncompressedSize', 4],
['fileNameLength', 2],
['extraFieldLength', 2],
]);

vars.lastModifiedDateTime = parseDateTime(vars.lastModifiedDate, vars.lastModifiedTime);

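The `parseDateTime(vars.lastModifiedDate, vars.lastModifiedTime)` calls in both hunks decode the MS-DOS packed date/time stored in zip headers. A hedged sketch of that decoding (field layout per the zip APPNOTE; `lib/parseDateTime.js` may differ in details such as local-time vs UTC interpretation):

```javascript
// Sketch of MS-DOS packed date/time decoding, as stored in zip headers.
// date bits: 15-9 year since 1980, 8-5 month (1-12), 4-0 day
// time bits: 15-11 hours, 10-5 minutes, 4-0 seconds / 2
function parseDosDateTime(date, time) {
  var day = date & 0x1f;
  var month = (date >> 5) & 0x0f;
  var year = ((date >> 9) & 0x7f) + 1980;
  var seconds = (time & 0x1f) * 2; // stored at 2-second resolution
  var minutes = (time >> 5) & 0x3f;
  var hours = (time >> 11) & 0x1f;
  return new Date(Date.UTC(year, month - 1, day, hours, minutes, seconds));
}
```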
