[exporter/elasticsearch] Configuring data_stream fails with invalid keys: data_stream #40067

Closed
@Rufmord

Description

Component(s)

exporter/elasticsearch

What happened?

Description

I am using the OpenTelemetry and Elasticsearch Helm charts and want to collect all container logs in Kubernetes. This works, but the data stream created in Elasticsearch has a generic name, so I want to specify the name myself. I followed the documentation for the Elasticsearch exporter, but the data_stream key in the config produces an error.

Steps to Reproduce

Create a Helm chart with the following parts.
Chart.yaml
To pull the dependencies, run helm dep up first.

apiVersion: v2
type: application
name: opentelemetry
description: A Helm chart for Kubernetes
version: 0.1.0
appVersion: 0.125.0
dependencies:
- name: opentelemetry-collector
  repository: https://open-telemetry.github.io/opentelemetry-helm-charts
  version: 0.122.5
- name: elasticsearch
  repository: https://charts.bitnami.com/bitnami
  version: 21.3.22

values.yaml

opentelemetry-collector:
  mode: daemonset
  image:
    repository: ghcr.io/open-telemetry/opentelemetry-collector-releases/opentelemetry-collector-contrib
    tag: 0.125.0

  presets:
    logsCollection:
      enabled: true
      includeCollectorLogs: true
  config:
    extensions:
      basicauth/elastic:
        client_auth:
          username: elastic
          password: elastic
    receivers: {}
    processors:
      batch: {}
    exporters:
      elasticsearch:
        endpoints:
          - https://otel-elasticsearch:9200
        # logs_index: new-index-test # tried this, but it did not work either
        # logs_dynamic_index: # this config did not work either
        #   enabled: true
        data_stream:
          dataset: all-pod-logs
          namespace: mk8s
          type: logs
        auth:
          authenticator: basicauth/elastic
        tls:
          insecure_skip_verify: true
    service:
      extensions:
        - health_check
        - basicauth/elastic
      pipelines:
        logs:
          processors:
            - batch
          exporters:
            - elasticsearch

elasticsearch:
  kibana:
    elasticsearch:
      security:
        enabled: true
        auth:
          enabled: true
          kibanaUsername: "kibana_system"
          kibanaPassword: "elastic"
          createSystemUser: true
          elasticsearchPasswordSecret: otel-elasticsearch
        tls:
          enabled: true
          existingSecret: otel-elasticsearch-master-crt
          usePemCerts: true
  global:
    kibanaEnabled: true
  master:
    resourcesPreset: small
    networkPolicy:
      enabled: false
  data:
    resourcesPreset: small
  coordinating:
    resourcesPreset: small
  ingest:
    resourcesPreset: small
  metrics:
    resourcesPreset: small
  security:
    tls:
      restEncryption: true
      autoGenerated: true
      verificationMode: none
    elasticPassword: "elastic"
    kibana:
      password: elastic
  ingress:
    hostname: test-elastic.mydomain.local
  extraConfig:
    http.cors.allow-origin: "*"
    http.cors.enabled: true
    http.cors.allow-credentials: true
    http.cors.allow-methods: OPTIONS, HEAD, GET, POST, PUT, DELETE
    http.cors.allow-headers: X-Requested-With, X-Auth-Token, Content-Type, Content-Length, Authorization, Access-Control-Allow-Headers, Accept, x-elastic-client-meta

To deploy this, create the namespace and then install the chart:

kubectl create namespace otel
helm upgrade --install otel . -n otel
# wait for the pod to be in ready state
kubectl logs daemonset/otel-opentelemetry-collector-agent -n otel

Expected Result

In Elasticsearch, a data stream with the configured name should be created. Instead, the opentelemetry-collector crashes with a config error.

Actual Result

The opentelemetry-collector container has the following error:

Error: failed to get config: cannot unmarshal the configuration: decoding failed due to the following error(s):

error decoding 'exporters': error reading configuration for "elasticsearch": decoding failed due to the following error(s):

'' has invalid keys: data_stream
2025/05/14 07:11:52 collector server run finished with error: failed to get config: cannot unmarshal the configuration: decoding failed due to the following error(s):

error decoding 'exporters': error reading configuration for "elasticsearch": decoding failed due to the following error(s):

'' has invalid keys: data_stream
stream closed EOF for otel/otel-opentelemetry-collector-agent-l822l (opentelemetry-collector)

Did I misconfigure something? What would be the correct configuration?
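For reference, a possible direction: the error suggests the exporter rejects data_stream as a top-level config key, and the exporter's documentation describes routing documents to data streams via data_stream.* attributes on the telemetry itself, which can be set with the transform processor. The following is only a hedged sketch of that attribute-based approach; the processor name transform/datastream is my own choice, and the exact routing behavior should be verified against the exporter README for v0.125.0:

```yaml
# Sketch, not a verified fix: set data_stream.* resource attributes
# with the transform processor instead of configuring a (seemingly
# unsupported) data_stream key on the exporter itself.
processors:
  transform/datastream:
    log_statements:
      - context: resource
        statements:
          # Attribute names follow the data stream naming scheme
          # (type-dataset-namespace) described in the exporter docs.
          - set(attributes["data_stream.dataset"], "all-pod-logs")
          - set(attributes["data_stream.namespace"], "mk8s")
service:
  pipelines:
    logs:
      processors:
        - transform/datastream
        - batch
```

If this works, logs should land in a data stream named along the lines of logs-all-pod-logs-mk8s, but that naming is an assumption based on the documented convention rather than something I have confirmed against this setup.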
Thank you in advance!

Collector version

0.125.0

Environment information

Environment

OS: Ubuntu 22.04, running Kubernetes on top of it

OpenTelemetry Collector configuration

exporters:
  debug: {}
  elasticsearch:
    auth:
      authenticator: basicauth/elastic
    data_stream:
      dataset: all-pod-logs
      namespace: mk8s
      type: logs
    endpoints:
    - https://otel-elasticsearch:9200
    tls:
      insecure_skip_verify: true
extensions:
  basicauth/elastic:
    client_auth:
      password: elastic
      username: elastic
  health_check:
    endpoint: ${env:MY_POD_IP}:13133
processors:
  batch: {}
  memory_limiter:
    check_interval: 5s
    limit_percentage: 80
    spike_limit_percentage: 25
receivers:
  filelog:
    exclude: []
    include:
    - /var/log/pods/*/*/*.log
    include_file_name: false
    include_file_path: true
    operators:
    - id: container-parser
      max_log_size: 102400
      type: container
    retry_on_failure:
      enabled: true
    start_at: end
  jaeger:
    protocols:
      grpc:
        endpoint: ${env:MY_POD_IP}:14250
      thrift_compact:
        endpoint: ${env:MY_POD_IP}:6831
      thrift_http:
        endpoint: ${env:MY_POD_IP}:14268
  otlp:
    protocols:
      grpc:
        endpoint: ${env:MY_POD_IP}:4317
      http:
        endpoint: ${env:MY_POD_IP}:4318
  prometheus:
    config:
      scrape_configs:
      - job_name: opentelemetry-collector
        scrape_interval: 10s
        static_configs:
        - targets:
          - ${env:MY_POD_IP}:8888
  zipkin:
    endpoint: ${env:MY_POD_IP}:9411
service:
  extensions:
  - health_check
  - basicauth/elastic
  pipelines:
    logs:
      exporters:
      - elasticsearch
      processors:
      - batch
      receivers:
      - otlp
      - filelog
    metrics:
      exporters:
      - debug
      processors:
      - memory_limiter
      - batch
      receivers:
      - otlp
      - prometheus
    traces:
      exporters:
      - debug
      processors:
      - memory_limiter
      - batch
      receivers:
      - otlp
      - jaeger
      - zipkin
  telemetry:
    metrics:
      readers:
      - pull:
          exporter:
            prometheus:
              host: ${env:MY_POD_IP}
              port: 8888

Log output

Error: failed to get config: cannot unmarshal the configuration: decoding failed due to the following error(s):

error decoding 'exporters': error reading configuration for "elasticsearch": decoding failed due to the following error(s):

'' has invalid keys: data_stream
2025/05/14 07:11:52 collector server run finished with error: failed to get config: cannot unmarshal the configuration: decoding failed due to the following error(s):

error decoding 'exporters': error reading configuration for "elasticsearch": decoding failed due to the following error(s):

'' has invalid keys: data_stream
stream closed EOF for otel/otel-opentelemetry-collector-agent-l822l (opentelemetry-collector)

Additional context

No response
