
Handle EventTime msgpack extension to handle nanosecond precision time and add its parameter #18

Merged (6 commits) on Sep 2, 2019

Conversation

cosmo0920
Contributor

@cosmo0920 cosmo0920 commented Jan 4, 2018

Hi, I've tried to handle the EventTime msgpack extension in this codec plugin.
How about starting to handle the EventTime extension?

Regards,

@cosmo0920
Contributor Author

With this nanosecond_precision parameter, Logstash starts to handle subsecond precision like this:

fluent-cat with EventTime extension

$ echo '{"current_version":"v0.14", "versions":{"unstable":0.14, "stable":0.12}}' | bundle exec fluent-cat my.logs

logstash with this codec

logstash config w/ EventTime extension
input {
  tcp {
    codec => fluent {
      nanosecond_precision => true
    }
    port => 24224
  }
}
output {
  stdout { codec => json }
}
Output result
{"current_version":"v0.14","versions":{"stable":0.12,"unstable":0.14},"@version":"1","@metadata":{"ip_address":"127.0.0.1"},"@timestamp":"2018-01-04T09:25:45.489Z","port":56755,"tags":["my.logs"],"host":"localhost"}

Otherwise, Logstash cannot handle subseconds:

fluent-cat with EventTime extension

$ echo '{"current_version":"v0.14", "versions":{"unstable":0.14, "stable":0.12}}' | bundle exec fluent-cat my.logs

logstash with this codec

logstash config w/o EventTime extension
input {
  tcp {
    codec => fluent
    port => 24224
  }
}
output {
  stdout { codec => json }
}
Output result

Nothing is output.

@cosmo0920
Contributor Author

cosmo0920 commented Feb 7, 2018

How about these changes?
These changes are also needed to handle EventTime, which is included in the new Fluentd protocol.
Thanks.

#
class LogStash::Codecs::Fluent < LogStash::Codecs::Base
require "logstash/codecs/fluent/event_time"


I had to change this to require_relative "event_time" to make it work on a new Logstash 6.2.2 setup; otherwise you get {:exception=>#<LoadError: no such file to load -- logstash/codecs/fluent/event_time>}

Contributor Author

@cosmo0920 cosmo0920 Feb 27, 2018


Umm, in my environment require_relative "event_time" does not work. :'( Both require_relative "fluent/event_time" and require "logstash/codecs/fluent/event_time" do work. 😖
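The difference is that require_relative resolves against the directory of the file containing the call, while plain require searches $LOAD_PATH. A minimal sketch of both mechanisms, using a hypothetical temp-dir layout that mirrors the plugin's lib/ tree (file names and the EVENT_TIME_LOADED constant are mine, for illustration only):

```ruby
require "tmpdir"
require "fileutils"

Dir.mktmpdir do |lib|
  codecs = File.join(lib, "logstash", "codecs")
  FileUtils.mkdir_p(File.join(codecs, "fluent"))

  # the nested file just defines a marker constant so we can see it loaded
  File.write(File.join(codecs, "fluent", "event_time.rb"),
             "EVENT_TIME_LOADED = true\n")

  # fluent.rb resolves require_relative against its OWN directory,
  # so "fluent/event_time" points at codecs/fluent/event_time.rb
  File.write(File.join(codecs, "fluent.rb"),
             "require_relative 'fluent/event_time'\n")

  # plain require searches $LOAD_PATH instead, so lib/ must be on the path
  $LOAD_PATH.unshift(lib)
  require "logstash/codecs/fluent"

  puts defined?(EVENT_TIME_LOADED)   # => "constant"
end
```

Under this layout, require_relative "event_time" from fluent.rb would fail (there is no codecs/event_time.rb), which matches the behavior described above.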

Contributor Author


My plugin setup procedure is:

$ gem build logstash-codec-fluent.gemspec
$ logstash-plugin install logstash-codec-fluent-*.gem


Interesting. I'll do another test tomorrow and I'll let you know.

@dsolsona

dsolsona commented Feb 27, 2018

Hi @cosmo0920

I've tested your patch on a brand new Logstash 6.2.2 setup with fluent-bit v0.12.14 and it works 👍

This is the output from using the tail input.

{
           "log" => "Tue Feb 27 05:41:46 UTC 2018 this is a test log entry",
    "@timestamp" => 2018-02-27T05:44:19.049Z,
          "tags" => [
        [0] "tail.0"
    ],
      "@version" => "1",
          "port" => 54206,
          "host" => "localhost"
}

Before this patch I wasn't getting anything in Logstash, and now I do.

@cosmo0920
Contributor Author

cosmo0920 commented Feb 28, 2018

fluent-bit has a fallback protocol mechanism.
The time_as_integer option may be useful for the latest vanilla fluent codec:
https://github.com/fluent/fluent-bit-docs/blob/93cfe27006918044a0ff5e69eeeba4eafde651ab/output/forward.md#configuration-parameters-config_tcp

The nanosecond_precision parameter name is taken from fluent-logger-ruby's EventTime handling parameter:
https://github.com/fluent/fluent-logger-ruby#use-nanosecond-precision-time
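For context on what the codec has to decode: as I read the Fluentd Forward protocol specification, EventTime is msgpack ext type 0 carrying an 8-byte payload of a 32-bit seconds field followed by a 32-bit nanoseconds field, both big-endian. A hand-rolled sketch of that wire format (the helper names are mine, not from this plugin, and this uses the fixed-length fixext8 form):

```ruby
# EventTime wire format sketch (Fluentd Forward protocol, as I understand it):
# fixext8 header byte (0xd7), ext type 0, then 4-byte sec + 4-byte nsec,
# both unsigned big-endian ("N" in Ruby's pack notation).
def encode_event_time(sec, nsec)
  [0xd7, 0x00].pack("CC") + [sec, nsec].pack("NN")
end

def decode_event_time(bytes)
  unless bytes.bytesize == 10 && bytes.getbyte(0) == 0xd7 && bytes.getbyte(1) == 0x00
    raise ArgumentError, "not a fixext8 EventTime"
  end
  bytes[2, 8].unpack("NN")   # => [sec, nsec]
end

encoded = encode_event_time(1_515_057_945, 489_000_000)
p decode_event_time(encoded)   # => [1515057945, 489000000]
```

A plain integer timestamp in the same field is what the codec gets when the sender does not use EventTime, which is why the decode path has to branch on the type.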

@cosmo0920
Contributor Author

Any news?

@jsvd
Member

jsvd commented Sep 2, 2019

I just tested this locally with logstash 7.3.0:

bin/logstash -e "input { tcp { port => 44444 codec => fluent {nanosecond_precision => true } } } output { stdout { codec => rubydebug } }"

and fluent-logger (0.8.2):

> log = Fluent::Logger::FluentLogger.new(nil, :host => 'localhost', :port => 44444, :nanosecond_precision => true)
> log.post_with_time("myapp.access", {"agent" => "foo"}, Time.now)

And confirmed that sending data with nanosecond_precision enabled on both sides no longer fails.

@jsvd jsvd merged commit 99f9c00 into logstash-plugins:master Sep 2, 2019
@cosmo0920
Contributor Author

Oh, thank you.

@cosmo0920 cosmo0920 deleted the event-time branch September 2, 2019 15:25
@jsvd
Member

jsvd commented Sep 2, 2019

@cosmo0920 sorry for the long delay, and thank you for the work done here.

[edit] very long delay.

@dcrn

dcrn commented Nov 7, 2019

When will this be released? I'm currently hitting the issue this resolves.

when Fixnum
  fluent_time
when EventTime
  Time.at(fluent_time.sec, fluent_time.nsec)

We are using the latest version of this plugin and are running into a problem which, in our opinion, is caused by this line. As far as I can see, calling Time.at with just two parameters expects the second parameter to specify microseconds, not nanoseconds, which in our case shifted the timestamps forward by roughly 10 minutes.

Shouldn't this read Time.at(fluent_time.sec, fluent_time.nsec / 1000.0) instead?
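The microsecond interpretation is easy to verify in a plain Ruby session (and on Ruby 2.5+ the unit can be passed explicitly instead of dividing):

```ruby
# Time.at(sec, n) treats n as MICROseconds, so passing a nanosecond value
# inflates the offset by a factor of 1000:
t_wrong = Time.at(0, 500_000_000)               # 500_000_000 usec = 500 s
t_fixed = Time.at(0, 500_000_000 / 1000.0)      # 500_000 usec     = 0.5 s
t_unit  = Time.at(0, 500_000_000, :nanosecond)  # Ruby >= 2.5: explicit unit

p t_wrong.to_f   # => 500.0
p t_fixed.to_f   # => 0.5
p t_unit.to_f    # => 0.5
```

Passing a nanosecond field like 489_000_000 through the microsecond path yields an offset of 489 seconds, which matches the "roughly 10 minutes" drift described above.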
