Log lines might end up with duplicate JSON keys #401

In some situations the log JSON will have duplicate keys, I assume because the "message" dict/map is merged with the log one. For example, the "time" key is included twice here:

Comments
This problem happens because Docker itself adds a "time" field to its log line, so when the records are merged you get duplicates. If you want to produce a JSON log you need to use a different "time" field name. I would suggest avoiding any docker/logstash/fluentbit/fluentd/elasticsearch "reserved" key (for example "@timestamp").
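To make the collision concrete, here is a minimal sketch (not taken from the project) of the shape a Docker json-file log line has and of how a merge step that re-emits the parsed application fields alongside Docker's own can end up with two "time" keys. The merge below is a stand-in for what a log shipper such as fluent-bit with Merge_Log does, not any tool's actual implementation.

```js
// Illustrative sketch only: the wrapper field names mirror Docker's json-file
// driver, but the "merge" is a hand-rolled stand-in for the log pipeline.
const appRecord = { level: 50, time: 1591347107037, msg: 'failure while polling' };

// Docker wraps each stdout line and adds its own "time":
const dockerLine = {
  log: JSON.stringify(appRecord) + '\n',
  stream: 'stdout',
  time: '2020-06-05T08:51:47.123456789Z',
};

// Emitting the wrapper's keys and the parsed application keys into one object
// duplicates "time" at the JSON-text level:
const outer = JSON.stringify({ stream: dockerLine.stream, time: dockerLine.time });
const inner = JSON.stringify(JSON.parse(dockerLine.log.trim()));
const merged = outer.slice(0, -1) + ',' + inner.slice(1);
console.log(merged);
// {"stream":"stdout","time":"2020-06-05T08:51:47.123456789Z","level":50,"time":1591347107037,"msg":"failure while polling"}

// JavaScript's JSON.parse silently keeps the last duplicate value;
// stricter parsers may reject the line outright.
console.log(JSON.parse(merged).time); // 1591347107037
```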
@silasbw @Flydiverny can you please take a look?
I can confirm that this is an issue, thanks for reporting it! It seems that the error that was caught and logged here contains additional fields, like its own "time", "code" and "requestId". Having a field that indicates the time of the error is reasonable in itself.
If the root cause here is the error object being merged straight into the top-level log record, one option would be to nest it under its own key. The log message would probably end up something like:

{
"level": 50,
"time": 1591347107037,
"pid": 19,
"payload": {
"hostname": "kubernetes-external-secrets-6799c84569-kfjcj",
"message": "Access to KMS is not allowed",
"code": "AccessDeniedException",
"time": "2020-06-05T08:51:47.036Z",
"requestId": "1894b6c9-b698-4657-a3cb-ecd8f044afee",
"statusCode": 400,
"retryable": false,
"retryDelay": 77.07928574264942,
"stack": "AccessDeniedException: Access to KMS is not allowed\n at Request.extractError (/app/node_modules/aws-sdk/lib/protocol/json.js:51:27)\n at Request.callListeners (/app/node_modules/aws-sdk/lib/sequential_executor.js:106:20)\n at Request.emit (/app/node_modules/aws-sdk/lib/sequential_executor.js:78:10)\n at Request.emit (/app/node_modules/aws-sdk/lib/request.js:683:14)\n at Request.transition (/app/node_modules/aws-sdk/lib/request.js:22:10)\n at AcceptorStateMachine.runTo (/app/node_modules/aws-sdk/lib/state_machine.js:14:12)\n at /app/node_modules/aws-sdk/lib/state_machine.js:26:10\n at Request.<anonymous> (/app/node_modules/aws-sdk/lib/request.js:38:9)\n at Request.<anonymous> (/app/node_modules/aws-sdk/lib/request.js:685:12)\n at Request.callListeners (/app/node_modules/aws-sdk/lib/sequential_executor.js:116:18)",
"type": "Error"
},
"msg": "failure while polling the secret kubernetes-external-secrets/test"
}

I was under the impression this had to do with log merging (e.g. docker log + output from KES) later on, in which case I think it's up to the end user's configuration. PR welcome :)
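For illustration, here is a minimal sketch of what the nesting option could look like with pino (the logger used by the project). This is an assumption about the call site, not the actual KES code; `fetchSecret` is a hypothetical helper and `payload` is just the key name used in the example above.

```js
const pino = require('pino');
const logger = pino();

async function poll(secretName) {
  try {
    await fetchSecret(secretName); // hypothetical helper standing in for the real poller
  } catch (err) {
    // Passing the error object directly (logger.error(err, ...)) merges its
    // enumerable fields -- including the AWS SDK's own "time" -- next to
    // pino's "time" at the top level. Nesting it under a dedicated key keeps
    // those fields one level down instead.
    // (A serializer such as pino.stdSerializers.err may be needed if the
    // stack trace should be preserved for nested errors.)
    logger.error({ payload: err }, `failure while polling the secret ${secretName}`);
  }
}
```

Whether the nested key is called `payload` or something else is a naming choice; the point is that the error's fields no longer collide with pino's own `time`.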
Docker itself adds a "time" field when it writes the log to disk, so adding one to the pod log is a bit redundant. If you want to include it anyway, you can choose an alternative field name or nest it inside another field, or avoid the JSON format altogether. This is an example of a log produced in one of our clusters.
In my case the merging is done by fluent-bit. In older versions of fluent-bit (or other software), the merge is less refined and a JSON with two "time" keys causes parsing problems. IMHO the duplicated field should be renamed or nested.
Having a field that indicates time is necessary, true. The problem is that there are two of them, which results in JSON with duplicate keys, which in turn will confuse log analysis tools that would otherwise work without modifications. One option is to nest the log, as @Flydiverny said. This is probably the technically correct option, but it might be a breaking change for anyone relying on the existing log structure. Another would be to simply rename the "time" field in the record generated by kubernetes-external-secrets to something else: "message_time", "recorded_at", "kes_time", whatever. Probably easier to implement, but it might break a different set of tools. Either way, the sooner it's changed, the fewer people are impacted by it, I guess.
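As a rough sketch of the rename option (an assumption for illustration, not the project's actual configuration): pino's `timestamp` option accepts a function that returns the timestamp chunk of the line, so the key name can be changed without touching the rest of the record. The name `kes_time` below is just one of the suggestions above.

```js
const pino = require('pino');

// Emit the timestamp under a non-reserved key instead of "time".
const logger = pino({
  timestamp: () => `,"kes_time":"${new Date().toISOString()}"`,
});

logger.info('hello');
// roughly: {"level":30,"kes_time":"2020-06-05T08:51:47.036Z","pid":19,"hostname":"...","msg":"hello"}
```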
This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 30 days.
@moolen @Flydiverny is there any hope that this bug will be fixed?
Oops... I found a fix here https://github.com/external-secrets/kubernetes-external-secrets/blob/master/config/index.js so I assume that this ticket can be closed. Thanks.
@moolen I was wrong. The issue is still there. IMHO the ultimate solution would be to rename the "time" field. Here is my log: #401 (comment)
@moolen @bgdnlp @Flydiverny could I get some feedback?
Hey @pierluigilenoci, let's change the "time" field name then.
@Flydiverny this issue can be closed.
Thank you @pierluigilenoci for your contribution ❤️!