stream pipeline kills process when writeStream is closed  #52622

Closed
@DevT0ny

Description

Version

v21.7.1

Platform

Linux

Subsystem

No response

What steps will reproduce the bug?

To reproduce this, first create a source file:

echo "Hello from src file" > /tmp/src.txt

Now run this script

//@ts-check

const fs = require('node:fs')
const { pipeline } = require('node:stream/promises')
const { promisify } = require('node:util')

const main = async () => {
  const dst = '/tmp/dest.txt'
  const writeStream = fs.createWriteStream(dst, { flags: 'a' })

  // deliberately close the destination stream before anything is piped into it
  const close = promisify(writeStream.close)
  await close.call(writeStream)

  console.log(writeStream.closed, writeStream.writableEnded)

  // expected: this should reject, since writeStream was already closed above
  await pipeline(fs.createReadStream('/tmp/src.txt'), writeStream)
  const dstfile = await fs.promises.readFile(dst, { encoding: 'utf8' })
  console.log({ dstfile })
}

main()
  .then(() => {
    console.log('done')
  })
  .catch((err) => {
    console.log('error')
    console.error(err)
  })
  .finally(() => {
    console.log('over')
  })
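
For context, and not as part of the repro itself: a caller can detect the already-closed destination up front using the same flags the script logs, for example with a guard like this placed before the pipeline call (a sketch, not a suggested fix for pipeline):

if (writeStream.closed || writeStream.destroyed || writeStream.writableEnded) {
  throw new Error('destination stream is already closed')
}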

How often does it reproduce? Is there a required condition?

No response

What is the expected behavior? Why is that the expected behavior?

The pipeline function should reject (throw) with an error saying that writeStream is already closed, since piping into a closed destination cannot succeed.
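
Reusing the identifiers from the script above, the expectation is roughly that the pipeline call rejects and can be handled like this (the exact error Node would use here, e.g. something like ERR_STREAM_DESTROYED or a premature-close error, is an assumption):

try {
  await pipeline(fs.createReadStream('/tmp/src.txt'), writeStream)
} catch (err) {
  // expected: an error indicating the destination stream is already closed
  // (the specific error code is assumed, not confirmed)
  console.error('pipeline rejected:', err.code, err.message)
}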

What do you see instead?

It prints true true and then exits with code 0. (The true true output comes from the script's own console.log and works as intended.) Neither an error nor any of the later logs (dstfile, done, error, over) appear; the pipeline call never completes and the process simply exits.
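
Saving the script above as repro.js (a filename chosen here for illustration), a run on v21.7.1 looks roughly like this:

$ node repro.js
true true
$ echo $?
0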

Additional information

No response

Metadata

Assignees

No one assigned

Labels

stream (Issues and PRs related to the stream subsystem)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
