Commit e6870fe

fix: commonjs types, run tsc and lint to validate changes (#397)

* fix: types weren't working for CommonJS; run tsc and lint to validate changes.
* chore: needs to work on Linux and BSD.

1 parent 61d4eea · commit e6870fe


50 files changed: +1593 −44 lines

CONTRIBUTING.md

+2-2
```diff
@@ -2,8 +2,8 @@
 
 * Before you open a ticket or send a pull request, [search](https://github.com/adaltas/node-csv/issues) for previous discussions about the same feature or issue. Add to the earlier ticket if you find one.
 
-* Before sending a pull request for a feature, be sure to have [tests](https://github.com/adaltas/node-csv/tree/master/test).
+* Before sending a pull request for a feature, be sure to have [tests](https://github.com/adaltas/node-csv/tree/master/packages/csv/test).
 
-* Use the same coding style as the rest of the [codebase](https://github.com/adaltas/node-csv/tree/master/src). If you’re writing a test and if you're just getting started with CoffeeScript, there’s a nice [style guide](https://github.com/polarmobile/coffeescript-style-guide).
+* Use the same coding style as the rest of the [codebase](https://github.com/adaltas/node-csv/tree/master/packages). If you’re writing a test and if you're just getting started with CoffeeScript, there’s a nice [style guide](https://github.com/polarmobile/coffeescript-style-guide).
 
 * Documentation is published on [GitHub](https://github.com/adaltas/node-csv-docs) and you are invited to submit a pull request with your changes. For convenience, you can also browse the website and click on the Edit link present at the top of every page.
```

demo/eslint/.eslintrc.js

+2-1
```diff
@@ -11,6 +11,7 @@ module.exports = {
     ecmaVersion: 'latest',
   },
   rules: {
-    // 'import/no-unresolved': [2, { commonjs: false }],
+    'import/no-unresolved': [2, { commonjs: false }],
+    'no-console': 'off',
   },
 };
```

package.json

+2
```diff
@@ -16,6 +16,8 @@
     "build": "lerna run build",
     "postinstall": "husky install",
     "publish": "lerna publish from-git --yes",
+    "lint": "lerna run lint",
+    "pretest": "npm run lint",
     "test": "lerna run test",
     "test:legacy": "lerna run test:legacy",
     "version": "lerna version"
```

packages/csv-generate/dist/cjs/stream.d.cts

+1-1
```diff
@@ -1,5 +1,5 @@
 
-import { Options } from './index.js';
+import { Options } from './index.cjs';
 
 declare function generate(options?: Options): ReadableStream<Buffer>;
 // export default generate;
```
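This one-extension change is the heart of the CommonJS types fix. Under TypeScript's `node16`/`nodenext` module resolution, the extension in an import specifier selects which sibling declaration file is found: `./index.js` maps to `index.d.ts` (the ESM declarations), while `./index.cjs` maps to `index.d.cts` (the CommonJS declarations). A sketch of the resulting declaration file (the resolution mapping is standard TypeScript behavior; the exact file layout of `dist/cjs` is assumed from this diff):

```typescript
// dist/cjs/stream.d.cts — declaration fragment, not executable code.
// With "moduleResolution": "node16"/"nodenext", TypeScript resolves:
//
//   './index.js'  -> index.d.ts   (ESM declarations — wrong from a .d.cts)
//   './index.cjs' -> index.d.cts  (CommonJS declarations — correct)

import { Options } from './index.cjs';

declare function generate(options?: Options): ReadableStream<Buffer>;
```

Before the fix, a CommonJS consumer requiring the package would have its types resolved through the ESM declaration graph, which is why the types "weren't working" only for CommonJS.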

packages/csv-generate/dist/cjs/sync.d.cts

+1-1
```diff
@@ -1,5 +1,5 @@
 
-import { Options } from './index.js';
+import { Options } from './index.cjs';
 
 declare function generate<T = any>(options: number | Options): string & Array<T>;
 // export default generate;
```

packages/csv-generate/dist/esm/index.js

+64
```diff
@@ -3295,6 +3295,26 @@ BufferList.prototype.concat = function (n) {
 };
 
 // Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
 var isBufferEncoding = Buffer.isEncoding
 || function(encoding) {
 switch (encoding && encoding.toLowerCase()) {
@@ -4381,6 +4401,9 @@ function indexOf(xs, x) {
 }
 
 // A bit simpler than readable streams.
+// Implement an async ._write(chunk, encoding, cb), and it'll handle all
+// the drain event emission and buffering.
+
 Writable.WritableState = WritableState;
 inherits$1(Writable, EventEmitter);
 
@@ -4892,6 +4915,47 @@ function onEndNT(self) {
 }
 
 // a transform stream is a readable/writable stream where you do
+// something with the data. Sometimes it's called a "filter",
+// but that's not a great name for it, since that implies a thing where
+// some bits pass through, and others are simply ignored. (That would
+// be a valid example of a transform, of course.)
+//
+// While the output is causally related to the input, it's not a
+// necessarily symmetric or synchronous transformation. For example,
+// a zlib stream might take multiple plain-text writes(), and then
+// emit a single compressed chunk some time in the future.
+//
+// Here's how this works:
+//
+// The Transform stream has all the aspects of the readable and writable
+// stream classes. When you write(chunk), that calls _write(chunk,cb)
+// internally, and returns false if there's a lot of pending writes
+// buffered up. When you call read(), that calls _read(n) until
+// there's enough pending readable data buffered up.
+//
+// In a transform stream, the written data is placed in a buffer. When
+// _read(n) is called, it transforms the queued up data, calling the
+// buffered _write cb's as it consumes chunks. If consuming a single
+// written chunk would result in multiple output chunks, then the first
+// outputted bit calls the readcb, and subsequent chunks just go into
+// the read buffer, and will cause it to emit 'readable' if necessary.
+//
+// This way, back-pressure is actually determined by the reading side,
+// since _read has to be called to start processing a new chunk. However,
+// a pathological inflate type of transform can cause excessive buffering
+// here. For example, imagine a stream where every byte of input is
+// interpreted as an integer from 0-255, and then results in that many
+// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
+// 1kb of data being output. In this case, you could write a very small
+// amount of input, and end up with a very large amount of output. In
+// such a pathological inflating mechanism, there'd be no way to tell
+// the system to stop doing the transform. A single 4MB write could
+// cause the system to run out of memory.
+//
+// However, even in such a pathological case, only a single written chunk
+// would be consumed, and then the rest would wait (un-transformed) until
+// the results of the previous transformed chunk were consumed.
+
 inherits$1(Transform, Duplex);
 
 function TransformState(stream) {
```

packages/csv-generate/dist/esm/sync.js

+64
```diff
@@ -3295,6 +3295,26 @@ BufferList.prototype.concat = function (n) {
 };
 
 // Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
 var isBufferEncoding = Buffer.isEncoding
 || function(encoding) {
 switch (encoding && encoding.toLowerCase()) {
@@ -4381,6 +4401,9 @@ function indexOf(xs, x) {
 }
 
 // A bit simpler than readable streams.
+// Implement an async ._write(chunk, encoding, cb), and it'll handle all
+// the drain event emission and buffering.
+
 Writable.WritableState = WritableState;
 inherits$1(Writable, EventEmitter);
 
@@ -4892,6 +4915,47 @@ function onEndNT(self) {
 }
 
 // a transform stream is a readable/writable stream where you do
+// something with the data. Sometimes it's called a "filter",
+// but that's not a great name for it, since that implies a thing where
+// some bits pass through, and others are simply ignored. (That would
+// be a valid example of a transform, of course.)
+//
+// While the output is causally related to the input, it's not a
+// necessarily symmetric or synchronous transformation. For example,
+// a zlib stream might take multiple plain-text writes(), and then
+// emit a single compressed chunk some time in the future.
+//
+// Here's how this works:
+//
+// The Transform stream has all the aspects of the readable and writable
+// stream classes. When you write(chunk), that calls _write(chunk,cb)
+// internally, and returns false if there's a lot of pending writes
+// buffered up. When you call read(), that calls _read(n) until
+// there's enough pending readable data buffered up.
+//
+// In a transform stream, the written data is placed in a buffer. When
+// _read(n) is called, it transforms the queued up data, calling the
+// buffered _write cb's as it consumes chunks. If consuming a single
+// written chunk would result in multiple output chunks, then the first
+// outputted bit calls the readcb, and subsequent chunks just go into
+// the read buffer, and will cause it to emit 'readable' if necessary.
+//
+// This way, back-pressure is actually determined by the reading side,
+// since _read has to be called to start processing a new chunk. However,
+// a pathological inflate type of transform can cause excessive buffering
+// here. For example, imagine a stream where every byte of input is
+// interpreted as an integer from 0-255, and then results in that many
+// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
+// 1kb of data being output. In this case, you could write a very small
+// amount of input, and end up with a very large amount of output. In
+// such a pathological inflating mechanism, there'd be no way to tell
+// the system to stop doing the transform. A single 4MB write could
+// cause the system to run out of memory.
+//
+// However, even in such a pathological case, only a single written chunk
+// would be consumed, and then the rest would wait (un-transformed) until
+// the results of the previous transformed chunk were consumed.
+
 inherits$1(Transform, Duplex);
 
 function TransformState(stream) {
```

packages/csv-generate/dist/iife/index.js

+64
```diff
@@ -3298,6 +3298,26 @@ var csv_generate = (function (exports) {
 };
 
 // Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
 var isBufferEncoding = Buffer.isEncoding
 || function(encoding) {
 switch (encoding && encoding.toLowerCase()) {
@@ -4384,6 +4404,9 @@ var csv_generate = (function (exports) {
 }
 
 // A bit simpler than readable streams.
+// Implement an async ._write(chunk, encoding, cb), and it'll handle all
+// the drain event emission and buffering.
+
 Writable.WritableState = WritableState;
 inherits$1(Writable, EventEmitter);
 
@@ -4895,6 +4918,47 @@ var csv_generate = (function (exports) {
 }
 
 // a transform stream is a readable/writable stream where you do
+// something with the data. Sometimes it's called a "filter",
+// but that's not a great name for it, since that implies a thing where
+// some bits pass through, and others are simply ignored. (That would
+// be a valid example of a transform, of course.)
+//
+// While the output is causally related to the input, it's not a
+// necessarily symmetric or synchronous transformation. For example,
+// a zlib stream might take multiple plain-text writes(), and then
+// emit a single compressed chunk some time in the future.
+//
+// Here's how this works:
+//
+// The Transform stream has all the aspects of the readable and writable
+// stream classes. When you write(chunk), that calls _write(chunk,cb)
+// internally, and returns false if there's a lot of pending writes
+// buffered up. When you call read(), that calls _read(n) until
+// there's enough pending readable data buffered up.
+//
+// In a transform stream, the written data is placed in a buffer. When
+// _read(n) is called, it transforms the queued up data, calling the
+// buffered _write cb's as it consumes chunks. If consuming a single
+// written chunk would result in multiple output chunks, then the first
+// outputted bit calls the readcb, and subsequent chunks just go into
+// the read buffer, and will cause it to emit 'readable' if necessary.
+//
+// This way, back-pressure is actually determined by the reading side,
+// since _read has to be called to start processing a new chunk. However,
+// a pathological inflate type of transform can cause excessive buffering
+// here. For example, imagine a stream where every byte of input is
+// interpreted as an integer from 0-255, and then results in that many
+// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
+// 1kb of data being output. In this case, you could write a very small
+// amount of input, and end up with a very large amount of output. In
+// such a pathological inflating mechanism, there'd be no way to tell
+// the system to stop doing the transform. A single 4MB write could
+// cause the system to run out of memory.
+//
+// However, even in such a pathological case, only a single written chunk
+// would be consumed, and then the rest would wait (un-transformed) until
+// the results of the previous transformed chunk were consumed.
+
 inherits$1(Transform, Duplex);
 
 function TransformState(stream) {
```
