2026-02-16T09:59:02.039305Z INFO vector::app: Log level is enabled. level="info"
2026-02-16T09:59:02.039975Z INFO vector::app: Loading configs. paths=["/etc/vector"]
2026-02-16T09:59:02.044013Z INFO source{component_kind="source" component_id=k8s_logs component_type=kubernetes_logs}: vector::sources::kubernetes_logs: Obtained Kubernetes Node name to collect logs for (self). self_node_name="ip-10-0-137-137.ec2.internal"
2026-02-16T09:59:02.053593Z INFO source{component_kind="source" component_id=k8s_logs component_type=kubernetes_logs}: vector::sources::kubernetes_logs: Including matching files. ret=["**/*"]
2026-02-16T09:59:02.053655Z INFO source{component_kind="source" component_id=k8s_logs component_type=kubernetes_logs}: vector::sources::kubernetes_logs: Internal log [Including matching files.] is being suppressed to avoid flooding.
2026-02-16T09:59:02.081143Z INFO vector::topology::running: Running healthchecks.
2026-02-16T09:59:02.081330Z INFO vector: Vector has started. debug="false" version="0.52.0" arch="x86_64" revision="ca5bf26 2025-12-16 14:56:07.290167996"
2026-02-16T09:59:02.081736Z INFO vector::topology::builder: Healthcheck passed.
2026-02-16T09:59:02.083159Z INFO vector::internal_events::api: API server running. address=127.0.0.1:8686 playground=off graphql=http://127.0.0.1:8686/graphql
2026-02-16T09:59:02.084711Z INFO vector::sinks::prometheus::exporter: Building HTTP server. address=0.0.0.0:9598
2026-02-16T09:59:02.089404Z WARN http: vector::internal_events::http_client: HTTP error. error=error trying to connect: tcp connect error: Connection refused (os error 111) error_type="request_failed" stage="processing"
2026-02-16T09:59:02.089443Z ERROR vector::topology::builder: msg="Healthcheck failed." error=Failed to make HTTP(S) request: error trying to connect: tcp connect error: Connection refused (os error 111) component_kind="sink" component_type="loki" component_id=loki
2026-02-16T09:59:02.607461Z INFO source{component_kind="source" component_id=k8s_logs component_type=kubernetes_logs}:file_server: vector::internal_events::file::source: Found new file to watch. file=/var/log/pods/openshift-multus_multus-rzc5v_4bc3fb37-0715-446c-9e3c-2bb0dd3e2761/kube-multus/0.log
2026-02-16T09:59:02.607665Z INFO source{component_kind="source" component_id=k8s_logs component_type=kubernetes_logs}:file_server: vector::internal_events::file::source: Internal log [Found new file to watch.] is being suppressed to avoid flooding.
2026-02-16T09:59:02.608559Z WARN source{component_kind="source" component_id=k8s_logs component_type=kubernetes_logs}:file_server: vector::internal_events::file::source: Currently ignoring file too small to fingerprint. file=/var/log/pods/crossplane-system_function-auto-ready-59868730b9a9-69c8fb67bc-m7dwk_056b539d-2442-4772-9605-e01d3862c243/package-runtime/0.log
2026-02-16T09:59:02.611328Z WARN source{component_kind="source" component_id=k8s_logs component_type=kubernetes_logs}:file_server: vector::internal_events::file::source: Internal log [Currently ignoring file too small to fingerprint.] is being suppressed to avoid flooding.
2026-02-16T09:59:04.262512Z WARN sink{component_kind="sink" component_id=loki component_type=loki}:request{request_id=1}:http: vector::internal_events::http_client: HTTP error. error=error trying to connect: tcp connect error: Connection refused (os error 111) error_type="request_failed" stage="processing"
2026-02-16T09:59:04.262568Z WARN sink{component_kind="sink" component_id=loki component_type=loki}:request{request_id=1}: vector::sinks::util::retries: Retrying after error. error=Failed to make HTTP(S) request: Failed to make HTTP(S) request: error trying to connect: tcp connect error: Connection refused (os error 111)
2026-02-16T09:59:04.815136Z WARN sink{component_kind="sink" component_id=loki component_type=loki}:request{request_id=1}:http: vector::internal_events::http_client: Internal log [HTTP error.] is being suppressed to avoid flooding.
2026-02-16T09:59:04.815165Z WARN sink{component_kind="sink" component_id=loki component_type=loki}:request{request_id=1}: vector::sinks::util::retries: Internal log [Retrying after error.] is being suppressed to avoid flooding.
2026-02-16T09:59:14.903243Z WARN sink{component_kind="sink" component_id=loki component_type=loki}:request{request_id=1}: vector::sinks::util::retries: Internal log [Retrying after error.] has been suppressed 5 times.
2026-02-16T09:59:14.903257Z WARN sink{component_kind="sink" component_id=loki component_type=loki}:request{request_id=1}: vector::sinks::util::retries: Retrying after error. error=Server responded with an error: 502 Bad Gateway
2026-02-16T09:59:31.827411Z WARN sink{component_kind="sink" component_id=loki component_type=loki}:request{request_id=1}: vector::sinks::util::retries: Retrying after error. error=Server responded with an error: 502 Bad Gateway
2026-02-16T09:59:48.025248Z WARN sink{component_kind="sink" component_id=loki component_type=loki}:request{request_id=1}: vector::sinks::util::retries: Retrying after error. error=Server responded with an error: 500 Internal Server Error