Component(s)
processor/k8sattributes
What happened?
Description
We are receiving logs from rsyslog and want to enrich them with pod metadata. Rsyslog successfully sends the logs, but the k8sattributes processor does not associate them with the pod.
The collector logs clearly show where the syslog connection is coming from: 10.6.44.178:60400. But k8sattributes prints no sources:

```
[{"Source":{"From":"","Name":""},"Value":""},{"Source":{"From":"","Name":""},"Value":""},{"Source":{"From":"","Name":""},"Value":""},{"Source":{"From":"","Name":""},"Value":""}]
```
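For context, an all-empty identifier like the one above is what you get when a connection-based association source finds no peer address to read. The following is an illustrative Python model of that behavior, not the actual Go implementation; the function and the "client.address" context key are hypothetical names:

```python
# Illustrative sketch: how a pod_association source with `from: connection`
# yields an empty identifier when the receiver does not propagate the
# client's peer address into the request context.

def evaluate_pod_identifier(associations, context):
    """Try each configured association in order and build an identifier.

    A 'connection' source can only produce a value if the receiver stored
    the client address in the context; otherwise its value stays empty.
    """
    for sources in associations:
        identifier = []
        for source in sources:
            if source["from"] == "connection":
                # If the syslog receiver never recorded the peer address,
                # this lookup comes back empty.
                value = context.get("client.address", "")
            else:
                value = context.get(source.get("name", ""), "")
            identifier.append({"Source": source, "Value": value})
        if any(entry["Value"] for entry in identifier):
            return identifier
    return []  # no source produced a value -> no pod metadata attached

# With no client address in the context, nothing matches:
assoc = [[{"from": "connection"}]]
print(evaluate_pod_identifier(assoc, {}))
```

With an empty context this prints `[]`, mirroring the empty-sources log line; supplying `{"client.address": "10.6.44.178"}` makes the association succeed.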
I can see in the internal metrics that pods are being discovered:
```
> curl http://localhost:8888/metrics
# HELP otelcol_otelsvc_k8s_pod_table_size Size of table containing pod info
# TYPE otelcol_otelsvc_k8s_pod_table_size gauge
otelcol_otelsvc_k8s_pod_table_size{service_instance_id="759ecdf5-d8e4-464f-8058-af078e8e4e01",service_name="otelcol-contrib",service_version="0.109.0"} 490
```
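If you want to script this sanity check, the gauge can be pulled out of the Prometheus exposition text with a few lines of Python. This is a minimal parsing sketch; the metric name is taken from the output above, everything else is hypothetical:

```python
# Extract the pod-table-size gauge from /metrics exposition text.
import re

def pod_table_size(metrics_text):
    """Return the otelcol_otelsvc_k8s_pod_table_size value, or None."""
    for line in metrics_text.splitlines():
        m = re.match(r"otelcol_otelsvc_k8s_pod_table_size\{[^}]*\}\s+(\d+)", line)
        if m:
            return int(m.group(1))
    return None

sample = 'otelcol_otelsvc_k8s_pod_table_size{service_name="otelcol-contrib"} 490'
print(pod_table_size(sample))  # -> 490
```

A non-zero value here only confirms the processor is watching pods; it says nothing about whether association succeeds.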
I can also see that my pod has the same IP address as the one printed by the syslog receiver:
```
> kubectl describe pods pod-8fjz9 | egrep "IP|Labels"
Labels:  app.kubernetes.io/name=NeonVM
IP:      10.6.44.178
```
Steps to Reproduce
Run the OTel Collector and send logs to it from another Kubernetes pod.
Expected Result
k8sattributes matches the pod and assigns the attributes. The processor should also provide more logging, at a minimum printing the known associations.
Actual Result
k8sattributes doesn't match the pod.
Collector version
0.109.0
Environment information
Using the ghcr.io/open-telemetry/opentelemetry-collector-releases/opentelemetry-collector-contrib:0.109.0 image, launched by the OpenTelemetry Collector Operator in our Kubernetes cluster.
OpenTelemetry Collector configuration
```yaml
receivers:
  syslog:
    protocol: rfc5424
    tcp:
      listen_address: 0.0.0.0:10514
processors:
  k8sattributes:
    auth_type: serviceAccount
    filter:
      namespace: default
    pod_association:
      - sources:
          - from: connection
    extract:
      metadata:
        - k8s.pod.name
      labels:
        - from: pod
          key: app.kubernetes.io/name
          tag_name: app.label.name
exporters:
  debug:
    verbosity: detailed
service:
  pipelines:
    logs/debug:
      exporters:
        - debug
      processors:
        - k8sattributes
      receivers:
        - syslog
  telemetry:
    logs:
      level: debug
    metrics:
      address: ""
      level: normal
      readers:
        - pull:
            exporter:
              prometheus:
                host: 0.0.0.0
                port: 8888
```
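If the syslog receiver indeed does not populate the connection context, a possible workaround is to associate by a resource attribute instead of the connection. This is only a sketch under that assumption; it requires something upstream (e.g. a transform processor or the sender itself) to put the pod IP into a k8s.pod.ip resource attribute before k8sattributes runs:

```yaml
processors:
  k8sattributes:
    auth_type: serviceAccount
    pod_association:
      # Fall back to a resource attribute instead of the TCP connection.
      # k8s.pod.ip must already be set on the resource by an earlier step.
      - sources:
          - from: resource_attribute
            name: k8s.pod.ip
```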
Log output
```
2025-03-17T14:54:54.146Z debug tcp/input.go:102 Received connection {"kind": "receiver", "name": "syslog", "data_type": "logs", "operator_id": "syslog_input_internal_tcp", "operator_type": "tcp_input", "address": "10.6.44.178:60400"}
2025-03-17T14:54:54.167Z debug k8sattributesprocessor@v0.109.0/processor.go:123 evaluating pod identifier {"kind": "processor", "name": "k8sattributes", "pipeline": "logs/debug", "value": [{"Source":{"From":"","Name":""},"Value":""},{"Source":{"From":"","Name":""},"Value":""},{"Source":{"From":"","Name":""},"Value":""},{"Source":{"From":"","Name":""},"Value":""}]}
2025-03-17T14:54:54.168Z info ResourceLog #0
Resource SchemaURL:
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope
LogRecord #0
ObservedTimestamp: 2025-03-17 14:54:54.148840498 +0000 UTC
Timestamp: 2025-03-17 14:54:54.141993 +0000 UTC
SeverityText: info
SeverityNumber: Info(9)
Body: Str(<134>1 2025-03-17T14:54:54.141993+00:00 vm-pod-dhxbs postgres 22371 - - [288-1] 2025-03-17 14:54:54.141 GMT ttid=d8467b4f5364b95c843478e10082e978/6f298e3e065a464f0a19586ffbacf08c sqlstate=00000 [22371] LOG: disconnection: session time: 0:05:09.990 user=admin database=postgres host=127.0.0.1 port=40532)
Attributes:
     -> hostname: Str(vm-pod-dhxbs)
     -> appname: Str(postgres)
     -> proc_id: Str(22371)
     -> message: Str( [288-1] 2025-03-17 14:54:54.141 GMT ttid=d8467b4f5364b95c843478e10082e978/6f298e3e065a464f0a19586ffbacf08c sqlstate=00000 [22371] LOG: disconnection: session time: 0:05:09.990 user=admin database=postgres host=127.0.0.1 port=40532)
     -> version: Int(1)
     -> facility: Int(16)
     -> priority: Int(134)
Trace ID:
Span ID:
Flags: 0
```
Additional context
No response