
could not parse using stream? #4

Open
@siygle

I wrote a simple example that uses a stream to read a file's contents and pipe them into the tokenizer, but it returns a "could not parse" error. Did I do something wrong? Thanks.

error

SyntaxError: could not parse "dkfjdkfjd\ndfkdjf\n\nsdsjkdsjdkjskwww\n\n2ueuwie1212\n\nklekwl<b>rest</b>skdjwk\n"
    at Tokenizer._tokenize (/Users/ferrari/test/node_modules/tokenizer/lib/Tokenizer.js:75:13)
    at /Users/ferrari/test/node_modules/tokenizer/lib/Tokenizer.js:30:12
    at process._tickCallback (node.js:415:13)

main.js

var Tokenizer = require('tokenizer');
var t = new Tokenizer();
var fs = require('fs');

// log every token the tokenizer emits
t.on('token', function(token, type) {
  console.log('%s - %s', token, type);
});

// the only rule registered
t.addRule(/^dk$/, 'test');

// pipe the file contents into the tokenizer
fs.createReadStream('./test.txt').pipe(t);

test.txt

dkfjdkfjd
dfkdjf

sdsjkdsjdkjskwww

2ueuwie1212

klekwl<b>rest</b>skdjwk
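
For context, here is a minimal sketch of what might avoid the error, assuming the tokenizer throws "could not parse" whenever some part of the buffered input is not covered by any registered rule (the example above only registers a rule for the literal string dk). The extra rules and their type names ('word', 'whitespace', 'other') are made up for illustration, not taken from the library's docs:

var Tokenizer = require('tokenizer');
var fs = require('fs');

var t = new Tokenizer();

// original rule from the report
t.addRule(/^dk$/, 'test');

// hypothetical catch-all rules so that every character in test.txt
// is matched by some rule and the tokenizer never gives up
t.addRule(/^[a-z0-9]+$/i, 'word');      // runs of letters/digits
t.addRule(/^\s+$/, 'whitespace');       // spaces and newlines
t.addRule(/^[^a-z0-9\s]+$/i, 'other');  // punctuation such as <, >, /

t.on('token', function(token, type) {
  console.log('%s - %s', token, type);
});

fs.createReadStream('./test.txt').pipe(t);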
