Apr 19 12:28:19.045425 ip-10-0-140-194 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 19 12:28:19.045449 ip-10-0-140-194 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 19 12:28:19.045460 ip-10-0-140-194 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 19 12:28:19.045779 ip-10-0-140-194 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 19 12:28:29.053996 ip-10-0-140-194 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 19 12:28:29.054013 ip-10-0-140-194 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1cb1adc7102f4fae841bdfb37204d7cb --
Apr 19 12:30:44.646108 ip-10-0-140-194 systemd[1]: Starting Kubernetes Kubelet...
Apr 19 12:30:45.097190 ip-10-0-140-194 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:30:45.097190 ip-10-0-140-194 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 19 12:30:45.097190 ip-10-0-140-194 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:30:45.097190 ip-10-0-140-194 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
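Editor's note: the first boot above fails with `Failed to load environment files` and systemd reports `Failed with result 'resources'`, which is what it emits when files the unit needs at start (an `EnvironmentFile=` path, or the ExecStartPre binary) do not exist yet. As a hedged illustration only — the drop-in path and file name below are assumptions, not taken from this log — a missing environment file can be made non-fatal with a leading `-`:

```ini
# Hypothetical drop-in: /etc/systemd/system/kubelet.service.d/10-env.conf
[Service]
# The leading "-" tells systemd to ignore a missing environment file
# instead of failing the unit with result 'resources'.
EnvironmentFile=-/etc/kubernetes/kubelet-env
```

On a machine-config-managed node the usual fix is to let the provisioning step that writes these files complete (note the later boot starts cleanly once crio.service and the environment files exist), rather than masking the error.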
Apr 19 12:30:45.097190 ip-10-0-140-194 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:30:45.099946 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.099830 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 19 12:30:45.105287 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105266 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:45.105287 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105283 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:30:45.105287 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105288 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:30:45.105287 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105294 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105297 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105300 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105303 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105306 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105308 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105311 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105313 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105316 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105319 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105321 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105324 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105326 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105328 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105331 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105334 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105337 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105339 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105344 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105348 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:45.105464 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105350 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105353 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105356 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105361 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105364 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105366 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105368 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105371 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105373 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105376 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105378 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105381 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105383 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105386 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105389 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105392 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105395 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105397 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105399 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:45.105961 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105402 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105404 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105406 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105409 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105412 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105414 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105416 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105419 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105422 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105424 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105427 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105430 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105433 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105435 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105438 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105441 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105443 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105445 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105449 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:45.106420 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105451 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105454 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105456 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105459 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105461 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105463 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105466 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105468 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105473 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105475 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105478 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105481 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105484 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105486 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105489 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105491 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105494 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105497 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105500 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105502 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:45.106916 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105505 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:45.107462 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105507 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:45.107462 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105510 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:45.107462 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105512 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:45.107462 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.105515 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
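Editor's note: the `unrecognized feature gate` warnings are emitted while the kubelet parses the `featureGates` stanza of the file passed via `--config`; gates defined at the cluster (OpenShift) level are fanned out to every component, and the kubelet appears to warn, rather than fail, for each gate it does not itself implement. A minimal sketch of how such gates would appear in a KubeletConfiguration — the file content below is an assumed illustration, not recovered from this log:

```yaml
# Hypothetical excerpt of the file passed via --config
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  RouteAdvertisements: true   # cluster-level gate; this kubelet logs "unrecognized"
  KMSv1: true                 # deprecated upstream gate; kubelet warns but applies it
```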
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106944 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106954 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106961 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106966 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106972 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106975 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106979 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106984 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106987 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 19 12:30:45.109448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106990 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106993 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106996 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.106999 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107002 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107005 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107008 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107011 2583 flags.go:64] FLAG: --cloud-config=""
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107013 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107016 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107020 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107023 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107027 2583 flags.go:64] FLAG: --config-dir=""
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107030 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107034 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107038 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107041 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107044 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107047 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107050 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107053 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107056 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107059 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107062 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107066 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 19 12:30:45.109969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107069 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107072 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107074 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107078 2583 flags.go:64] FLAG: --enable-server="true"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107081 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107085 2583 flags.go:64] FLAG: --event-burst="100"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107088 2583 flags.go:64] FLAG: --event-qps="50"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107091 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107094 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107097 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107101 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107103 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107106 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107109 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107112 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107115 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107121 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107123 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107126 2583 flags.go:64] FLAG:
--fail-cgroupv1="false" Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107129 2583 flags.go:64] FLAG: --fail-swap-on="true" Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107132 2583 flags.go:64] FLAG: --feature-gates="" Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107136 2583 flags.go:64] FLAG: --file-check-frequency="20s" Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107139 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107142 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107145 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107148 2583 flags.go:64] FLAG: --healthz-port="10248" Apr 19 12:30:45.110562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107151 2583 flags.go:64] FLAG: --help="false" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107153 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-140-194.ec2.internal" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107156 2583 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107159 2583 flags.go:64] FLAG: --http-check-frequency="20s" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107162 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107165 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 
12:30:45.107168 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107171 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107173 2583 flags.go:64] FLAG: --image-service-endpoint="" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107176 2583 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107180 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107182 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107185 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107188 2583 flags.go:64] FLAG: --kube-reserved="" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107191 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107193 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107196 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107199 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107201 2583 flags.go:64] FLAG: --lock-file="" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107204 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107207 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 19 12:30:45.111255 
ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107210 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107216 2583 flags.go:64] FLAG: --log-json-split-stream="false" Apr 19 12:30:45.111255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107219 2583 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107222 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107225 2583 flags.go:64] FLAG: --logging-format="text" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107228 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107231 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107234 2583 flags.go:64] FLAG: --manifest-url="" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107237 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107241 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107244 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107247 2583 flags.go:64] FLAG: --max-pods="110" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107250 2583 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107253 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107256 2583 
flags.go:64] FLAG: --memory-manager-policy="None" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107259 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107262 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107265 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107268 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107275 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107278 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107281 2583 flags.go:64] FLAG: --oom-score-adj="-999" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107285 2583 flags.go:64] FLAG: --pod-cidr="" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107288 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107293 2583 flags.go:64] FLAG: --pod-manifest-path="" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107296 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 19 12:30:45.111827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107299 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107301 2583 flags.go:64] FLAG: --port="10250" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 
12:30:45.107304 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107307 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-087b2ea4fe8d8911d" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107311 2583 flags.go:64] FLAG: --qos-reserved="" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107314 2583 flags.go:64] FLAG: --read-only-port="10255" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107317 2583 flags.go:64] FLAG: --register-node="true" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107319 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107323 2583 flags.go:64] FLAG: --register-with-taints="" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107327 2583 flags.go:64] FLAG: --registry-burst="10" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107330 2583 flags.go:64] FLAG: --registry-qps="5" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107333 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107335 2583 flags.go:64] FLAG: --reserved-memory="" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107339 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107342 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107345 2583 flags.go:64] FLAG: --rotate-certificates="false" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107348 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 19 12:30:45.112496 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:30:45.107350 2583 flags.go:64] FLAG: --runonce="false" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107353 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107356 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107359 2583 flags.go:64] FLAG: --seccomp-default="false" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107362 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107364 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107367 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107371 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107374 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 19 12:30:45.112496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107376 2583 flags.go:64] FLAG: --storage-driver-secure="false" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107379 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107382 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107385 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107388 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107391 2583 
flags.go:64] FLAG: --system-cgroups="" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107394 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107399 2583 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107402 2583 flags.go:64] FLAG: --tls-cert-file="" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107405 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107409 2583 flags.go:64] FLAG: --tls-min-version="" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107411 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107414 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107417 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107421 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107424 2583 flags.go:64] FLAG: --v="2" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107428 2583 flags.go:64] FLAG: --version="false" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107431 2583 flags.go:64] FLAG: --vmodule="" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107435 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107438 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 19 12:30:45.113167 ip-10-0-140-194 
kubenswrapper[2583]: W0419 12:30:45.107547 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107551 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107555 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107558 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 12:30:45.113167 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107561 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107563 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107566 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107569 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107571 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107574 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107576 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107579 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107582 2583 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107584 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107587 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107590 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107593 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107595 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107598 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107600 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107603 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107605 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107608 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107611 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 19 12:30:45.113732 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107616 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107619 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107623 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107625 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107628 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107630 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107632 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107636 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107639 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107641 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107644 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107646 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107649 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107651 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107654 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107656 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107658 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107661 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107663 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107666 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:45.114252 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107668 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107670 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107673 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107675 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107678 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107681 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107683 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107686 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107688 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107691 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107693 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107696 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107698 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107700 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107704 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107707 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107709 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107712 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107716 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:30:45.114735 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107720 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107722 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107725 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107727 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107730 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107732 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107735 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107737 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107739 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107742 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107744 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107755 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107760 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107762 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107765 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107767 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107770 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107772 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107775 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107778 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:30:45.115223 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107781 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107783 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.107786 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.107791 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.113901 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.113919 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113968 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113972 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113976 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113979 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113982 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113985 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113987 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113990 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113993 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113995 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:30:45.115738 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.113998 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114000 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114003 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114005 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114008 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114010 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114013 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114015 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114018 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114020 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114023 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114025 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114028 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114030 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114033 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114035 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114037 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114040 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114042 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114045 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:30:45.116204 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114047 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114052 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114054 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19
12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114057 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114061 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114066 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114069 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114072 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114075 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114077 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114080 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114082 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114085 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114088 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114090 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 12:30:45.116683 
ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114093 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114097 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114101 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114104 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 19 12:30:45.116683 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114107 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114109 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114112 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114114 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114117 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114119 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114122 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114124 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114126 2583 feature_gate.go:328] 
unrecognized feature gate: NoRegistryClusterOperations Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114129 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114132 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114134 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114137 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114139 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114142 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114146 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114148 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114151 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114153 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114155 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 12:30:45.117165 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114158 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 12:30:45.117647 ip-10-0-140-194 
kubenswrapper[2583]: W0419 12:30:45.114160 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114163 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114166 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114168 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114170 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114173 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114175 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114178 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114180 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114182 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114185 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114187 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114190 2583 feature_gate.go:328] unrecognized 
feature gate: AlibabaPlatform Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114192 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114194 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 12:30:45.117647 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114197 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.114202 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114299 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114303 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114306 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114320 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114323 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 
12:30:45.114325 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114328 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114332 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114335 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114338 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114342 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114345 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114347 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 12:30:45.118061 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114350 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114353 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114356 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114358 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114361 2583 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114363 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114366 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114368 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114371 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114373 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114376 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114379 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114381 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114384 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114386 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114388 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114391 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114393 2583 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114396 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114398 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 12:30:45.118430 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114400 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114403 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114405 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114408 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114410 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114412 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114415 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114417 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114420 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114423 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 12:30:45.118928 
ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114426 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114429 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114431 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114435 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114438 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114441 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114444 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114447 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114449 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 12:30:45.118928 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114452 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114455 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114457 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114459 2583 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114462 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114464 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114467 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114469 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114471 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114474 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114476 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114479 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114481 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114483 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114486 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114489 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 
12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114491 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114493 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114496 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 12:30:45.119416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114498 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114501 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114503 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114506 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114508 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114512 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114514 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114517 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114519 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114522 2583 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114524 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114526 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114529 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114531 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:45.114534 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.114539 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 19 12:30:45.119964 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.114683 2583 server.go:962] "Client rotation is on, will bootstrap in background" Apr 19 12:30:45.120365 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.117683 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 19 12:30:45.120365 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.118563 2583 server.go:1019] "Starting client certificate rotation" Apr 19 12:30:45.120365 
ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.118659 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 19 12:30:45.120365 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.118706 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 19 12:30:45.144133 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.144113 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 19 12:30:45.150414 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.150395 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 19 12:30:45.171332 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.171312 2583 log.go:25] "Validated CRI v1 runtime API" Apr 19 12:30:45.175862 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.175817 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 19 12:30:45.176566 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.176552 2583 log.go:25] "Validated CRI v1 image API" Apr 19 12:30:45.178053 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.178033 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 19 12:30:45.181465 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.181445 2583 fs.go:135] Filesystem UUIDs: map[21f07de7-9d65-44c7-85c0-909a7ec11bfd:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 e4ad92b8-9205-44bc-8ccb-41ab9dc563a4:/dev/nvme0n1p4] Apr 19 12:30:45.181526 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.181465 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 
fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 19 12:30:45.186566 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.186456 2583 manager.go:217] Machine: {Timestamp:2026-04-19 12:30:45.185184629 +0000 UTC m=+0.422834901 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3113362 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e0971445496358453b7ccfc71c156 SystemUUID:ec2e0971-4454-9635-8453-b7ccfc71c156 BootID:1cb1adc7-102f-4fae-841b-dfb37204d7cb Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9c:9a:0b:ba:35 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9c:9a:0b:ba:35 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:fd:68:19:37:89 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] 
Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 19 12:30:45.186566 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.186561 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 19 12:30:45.186682 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.186643 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 19 12:30:45.188260 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.188233 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 19 12:30:45.188398 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.188262 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-194.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 19 12:30:45.188444 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.188411 2583 topology_manager.go:138] "Creating topology manager with none policy"
Apr 19 12:30:45.188444 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.188420 2583 container_manager_linux.go:306] "Creating device plugin manager"
Apr 19 12:30:45.188444 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.188433 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 19 12:30:45.189513 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.189503 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 19 12:30:45.190312 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.190302 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 19 12:30:45.190406 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.190391 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zbstr"
Apr 19 12:30:45.190437 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.190406 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 19 12:30:45.192988 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.192978 2583 kubelet.go:491] "Attempting to sync node with API server"
Apr 19 12:30:45.193435 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.193426 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 19 12:30:45.193467 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.193445 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 19 12:30:45.193467 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.193457 2583 kubelet.go:397] "Adding apiserver pod source"
Apr 19 12:30:45.193467 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.193465 2583 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 19 12:30:45.194539 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.194527 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 19 12:30:45.194597 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.194561 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 19 12:30:45.197574 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.197555 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zbstr"
Apr 19 12:30:45.198097 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.198083 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 19 12:30:45.199805 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.199787 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 19 12:30:45.201116 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201102 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201122 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201131 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201139 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201149 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201157 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201166 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201174 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201185 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201194 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 19 12:30:45.201206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201207 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 19 12:30:45.201491 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201221 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 19 12:30:45.201941 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201930 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 19 12:30:45.201991 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.201944 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 19 12:30:45.205764 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.205746 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 19 12:30:45.205868 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.205794 2583 server.go:1295] "Started kubelet"
Apr 19 12:30:45.205966 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.205924 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 19 12:30:45.207610 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.205967 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 19 12:30:45.207712 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.207636 2583 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 19 12:30:45.208158 ip-10-0-140-194 systemd[1]: Started Kubernetes Kubelet.
Apr 19 12:30:45.208818 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.208801 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:45.208883 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.208835 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 19 12:30:45.213754 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.213738 2583 server.go:317] "Adding debug handlers to kubelet server"
Apr 19 12:30:45.214458 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.214440 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:45.215729 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.215715 2583 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-194.ec2.internal" not found
Apr 19 12:30:45.217547 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.217529 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 19 12:30:45.217547 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.217543 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 19 12:30:45.218509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218493 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 19 12:30:45.218601 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218495 2583 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 19 12:30:45.218601 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218604 2583 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 19 12:30:45.218758 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218623 2583 factory.go:55] Registering systemd factory
Apr 19 12:30:45.218758 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218647 2583 factory.go:223] Registration of the systemd container factory successfully
Apr 19 12:30:45.218758 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:45.218506 2583 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 19 12:30:45.218758 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:45.218504 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-194.ec2.internal\" not found"
Apr 19 12:30:45.218758 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218726 2583 reconstruct.go:97] "Volume reconstruction finished"
Apr 19 12:30:45.218758 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218735 2583 reconciler.go:26] "Reconciler: start to sync state"
Apr 19 12:30:45.218953 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218871 2583 factory.go:153] Registering CRI-O factory
Apr 19 12:30:45.218953 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218887 2583 factory.go:223] Registration of the crio container factory successfully
Apr 19 12:30:45.218953 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218940 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 19 12:30:45.219038 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218963 2583 factory.go:103] Registering Raw factory
Apr 19 12:30:45.219038 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.218979 2583 manager.go:1196] Started watching for new ooms in manager
Apr 19 12:30:45.219669 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.219652 2583 manager.go:319] Starting recovery of all containers
Apr 19 12:30:45.220175 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.220151 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:45.222390 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:45.222328 2583 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-194.ec2.internal\" not found" node="ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.230045 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.229811 2583 manager.go:324] Recovery completed
Apr 19 12:30:45.231239 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.231216 2583 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-194.ec2.internal" not found
Apr 19 12:30:45.235337 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.235326 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 19 12:30:45.237182 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.237168 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-194.ec2.internal" event="NodeHasSufficientMemory"
Apr 19 12:30:45.237251 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.237195 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-194.ec2.internal" event="NodeHasNoDiskPressure"
Apr 19 12:30:45.237251 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.237226 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-194.ec2.internal" event="NodeHasSufficientPID"
Apr 19 12:30:45.237733 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.237720 2583 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 19 12:30:45.237733 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.237732 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 19 12:30:45.237822 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.237749 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 19 12:30:45.240742 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.240731 2583 policy_none.go:49] "None policy: Start"
Apr 19 12:30:45.240778 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.240747 2583 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 19 12:30:45.240778 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.240757 2583 state_mem.go:35] "Initializing new in-memory state store"
Apr 19 12:30:45.275404 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.275387 2583 manager.go:341] "Starting Device Plugin manager"
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:45.275450 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.275460 2583 server.go:85] "Starting device plugin registration server"
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.275657 2583 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.275667 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.275767 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.275870 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.275879 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:45.276364 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:45.276405 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-194.ec2.internal\" not found"
Apr 19 12:30:45.299718 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.291022 2583 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-140-194.ec2.internal" not found
Apr 19 12:30:45.321426 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.321376 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 19 12:30:45.322706 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.322689 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 19 12:30:45.322808 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.322715 2583 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 19 12:30:45.322808 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.322738 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 19 12:30:45.322808 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.322744 2583 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 19 12:30:45.322808 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:45.322776 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 19 12:30:45.327113 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.327097 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:45.376310 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.376252 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 19 12:30:45.377153 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.377139 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-194.ec2.internal" event="NodeHasSufficientMemory"
Apr 19 12:30:45.377219 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.377168 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-194.ec2.internal" event="NodeHasNoDiskPressure"
Apr 19 12:30:45.377219 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.377178 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-194.ec2.internal" event="NodeHasSufficientPID"
Apr 19 12:30:45.377219 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.377200 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.386230 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.386213 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.386302 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:45.386234 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-194.ec2.internal\": node \"ip-10-0-140-194.ec2.internal\" not found"
Apr 19 12:30:45.423654 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.423620 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal"]
Apr 19 12:30:45.428073 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.428055 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.428073 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.428069 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.455426 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.455409 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.459712 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.459699 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.470900 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.470884 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:30:45.473294 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.473281 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:30:45.519962 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.519941 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c07507ea8671ca43efd3e919f4d2efb8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal\" (UID: \"c07507ea8671ca43efd3e919f4d2efb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.520043 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.519966 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c07507ea8671ca43efd3e919f4d2efb8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal\" (UID: \"c07507ea8671ca43efd3e919f4d2efb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.520043 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.519986 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/45b611433db72785c05b5ca89b4fe28f-config\") pod \"kube-apiserver-proxy-ip-10-0-140-194.ec2.internal\" (UID: \"45b611433db72785c05b5ca89b4fe28f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.620561 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.620538 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c07507ea8671ca43efd3e919f4d2efb8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal\" (UID: \"c07507ea8671ca43efd3e919f4d2efb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.620659 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.620565 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c07507ea8671ca43efd3e919f4d2efb8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal\" (UID: \"c07507ea8671ca43efd3e919f4d2efb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.620659 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.620581 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/45b611433db72785c05b5ca89b4fe28f-config\") pod \"kube-apiserver-proxy-ip-10-0-140-194.ec2.internal\" (UID: \"45b611433db72785c05b5ca89b4fe28f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.620659 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.620603 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c07507ea8671ca43efd3e919f4d2efb8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal\" (UID: \"c07507ea8671ca43efd3e919f4d2efb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.620659 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.620617 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/45b611433db72785c05b5ca89b4fe28f-config\") pod \"kube-apiserver-proxy-ip-10-0-140-194.ec2.internal\" (UID: \"45b611433db72785c05b5ca89b4fe28f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.620659 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.620635 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c07507ea8671ca43efd3e919f4d2efb8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal\" (UID: \"c07507ea8671ca43efd3e919f4d2efb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.772926 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.772868 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:45.775304 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:45.775111 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal"
Apr 19 12:30:46.117992 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.117914 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 19 12:30:46.118565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.118071 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:30:46.118565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.118071 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:30:46.118565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.118080 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:30:46.194436 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.194412 2583 apiserver.go:52] "Watching apiserver"
Apr 19 12:30:46.199449 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.199422 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-18 12:25:45 +0000 UTC" deadline="2027-10-16 03:21:39.732504326 +0000 UTC"
Apr 19 12:30:46.199449 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.199447 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13070h50m53.533060293s"
Apr 19 12:30:46.204121 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.204101 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 19 12:30:46.204471 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.204448 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-vqdpz","openshift-cluster-node-tuning-operator/tuned-b8mhf","openshift-image-registry/node-ca-wtzrr","openshift-multus/multus-additional-cni-plugins-tgwkw","openshift-multus/multus-bh5jh","openshift-network-diagnostics/network-check-target-v6ph6","openshift-network-operator/iptables-alerter-mvtl8","kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9","openshift-dns/node-resolver-w56w7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal","openshift-multus/network-metrics-daemon-lmdfj","openshift-ovn-kubernetes/ovnkube-node-pvnl8"]
Apr 19 12:30:46.209439 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.209417 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vqdpz"
Apr 19 12:30:46.211248 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.211231 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hjddv\""
Apr 19 12:30:46.211485 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.211468 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 19 12:30:46.211545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.211516 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 19 12:30:46.211633 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.211616 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.211697 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.211683 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wtzrr"
Apr 19 12:30:46.213143 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.213129 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v8krd\""
Apr 19 12:30:46.213231 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.213146 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:30:46.213231 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.213218 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 19 12:30:46.213315 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.213247 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 19 12:30:46.213315 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.213268 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qpff5\""
Apr 19 12:30:46.213474 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.213460 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 19 12:30:46.213626 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.213611 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 19 12:30:46.213777 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.213760 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.215615 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.215599 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 19 12:30:46.215784 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.215768 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 19 12:30:46.215933 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.215912 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.216034 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.216008 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 19 12:30:46.216034 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.216029 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 19 12:30:46.216131 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.216036 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-97fhp\""
Apr 19 12:30:46.216131 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.216013 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 19 12:30:46.217409 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.217392 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 19 12:30:46.217482 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.217426 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-czpz7\""
Apr 19 12:30:46.217619 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.217605 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 19 12:30:46.218060 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.218048 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:30:46.218122 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.218105 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:30:46.220205 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.220187 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mvtl8"
Apr 19 12:30:46.221810 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.221789 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 19 12:30:46.221975 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.221888 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-var-lib-cni-bin\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.221975 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.221911 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:30:46.221975 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.221918 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-run-multus-certs\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.221975 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.221930 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 19 12:30:46.221975 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.221967 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " 
pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222009 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-modprobe-d\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222020 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ltct2\"" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222033 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-sysconfig\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222066 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-sysctl-conf\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222090 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-run\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222220 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:30:46.222119 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-system-cni-dir\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222141 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-cnibin\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222155 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-run-netns\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222170 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222185 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b6ad95d-ad05-4a98-a140-cd66c263961c-tmp\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222213 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-os-release\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222236 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-lib-modules\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222256 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-tuned\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222275 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddcdec9a-0940-4c7c-8298-ee39ccec754e-cni-binary-copy\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222291 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-var-lib-cni-multus\") pod 
\"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222313 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-conf-dir\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222330 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-cnibin\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222344 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-os-release\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222374 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlz6k\" (UniqueName: \"kubernetes.io/projected/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-kube-api-access-hlz6k\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222390 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ad204b3-eb17-4f25-b1ef-6950791a05cd-serviceca\") pod \"node-ca-wtzrr\" (UID: \"6ad204b3-eb17-4f25-b1ef-6950791a05cd\") " pod="openshift-image-registry/node-ca-wtzrr" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222404 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjxd\" (UniqueName: \"kubernetes.io/projected/ddcdec9a-0940-4c7c-8298-ee39ccec754e-kube-api-access-psjxd\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222419 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-cni-binary-copy\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222455 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222481 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: 
\"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222503 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-sysctl-d\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222539 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-var-lib-kubelet\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222570 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxw5r\" (UniqueName: \"kubernetes.io/projected/1b6ad95d-ad05-4a98-a140-cd66c263961c-kube-api-access-qxw5r\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222597 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-cni-dir\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222622 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-var-lib-kubelet\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222645 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-daemon-config\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222673 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ca0bbe4-ddf4-454f-8dbf-809014fd1062-iptables-alerter-script\") pod \"iptables-alerter-mvtl8\" (UID: \"4ca0bbe4-ddf4-454f-8dbf-809014fd1062\") " pod="openshift-network-operator/iptables-alerter-mvtl8" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222693 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-system-cni-dir\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222712 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dc6c48c7-3a5b-4289-8f49-b667f1badbea-agent-certs\") pod \"konnectivity-agent-vqdpz\" (UID: \"dc6c48c7-3a5b-4289-8f49-b667f1badbea\") " pod="kube-system/konnectivity-agent-vqdpz" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 
12:30:46.222734 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dc6c48c7-3a5b-4289-8f49-b667f1badbea-konnectivity-ca\") pod \"konnectivity-agent-vqdpz\" (UID: \"dc6c48c7-3a5b-4289-8f49-b667f1badbea\") " pod="kube-system/konnectivity-agent-vqdpz" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222748 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-kubernetes\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222771 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-hostroot\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222789 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-etc-kubernetes\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222806 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ca0bbe4-ddf4-454f-8dbf-809014fd1062-host-slash\") pod \"iptables-alerter-mvtl8\" (UID: \"4ca0bbe4-ddf4-454f-8dbf-809014fd1062\") " pod="openshift-network-operator/iptables-alerter-mvtl8" Apr 
19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222820 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p86vp\" (UniqueName: \"kubernetes.io/projected/4ca0bbe4-ddf4-454f-8dbf-809014fd1062-kube-api-access-p86vp\") pod \"iptables-alerter-mvtl8\" (UID: \"4ca0bbe4-ddf4-454f-8dbf-809014fd1062\") " pod="openshift-network-operator/iptables-alerter-mvtl8" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222833 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-sys\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222866 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brw5j\" (UniqueName: \"kubernetes.io/projected/6ad204b3-eb17-4f25-b1ef-6950791a05cd-kube-api-access-brw5j\") pod \"node-ca-wtzrr\" (UID: \"6ad204b3-eb17-4f25-b1ef-6950791a05cd\") " pod="openshift-image-registry/node-ca-wtzrr" Apr 19 12:30:46.222982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222880 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-socket-dir-parent\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.223415 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222894 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-run-k8s-cni-cncf-io\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.223415 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222909 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-systemd\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.223415 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222949 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-host\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.223415 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.222987 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ad204b3-eb17-4f25-b1ef-6950791a05cd-host\") pod \"node-ca-wtzrr\" (UID: \"6ad204b3-eb17-4f25-b1ef-6950791a05cd\") " pod="openshift-image-registry/node-ca-wtzrr" Apr 19 12:30:46.224537 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.224523 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" Apr 19 12:30:46.226128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.226112 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 19 12:30:46.226202 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.226183 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8lhqs\"" Apr 19 12:30:46.226202 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.226194 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 19 12:30:46.226306 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.226186 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 19 12:30:46.226595 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.226578 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w56w7" Apr 19 12:30:46.226661 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.226649 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:30:46.227136 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.227107 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97" Apr 19 12:30:46.228532 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.228511 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 19 12:30:46.228644 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.228581 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 19 12:30:46.228784 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.228762 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-kxnwd\"" Apr 19 12:30:46.229713 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.229160 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 19 12:30:46.230806 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.230792 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.232339 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.232316 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 19 12:30:46.232647 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.232633 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 19 12:30:46.232999 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.232983 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vbmv6\"" Apr 19 12:30:46.233136 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.233120 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 19 12:30:46.233193 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.233157 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 19 12:30:46.233285 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.233269 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 19 12:30:46.233370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.233355 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 19 12:30:46.251363 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.251344 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4jw7h" Apr 19 12:30:46.258364 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.258345 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-4jw7h" Apr 19 12:30:46.319455 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.319429 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323537 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-os-release\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323583 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323617 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59537546-a323-4987-9ad2-4ce6e8f679c8-ovnkube-config\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323651 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psjxd\" (UniqueName: \"kubernetes.io/projected/ddcdec9a-0940-4c7c-8298-ee39ccec754e-kube-api-access-psjxd\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323681 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-cni-binary-copy\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323711 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323739 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59537546-a323-4987-9ad2-4ce6e8f679c8-ovn-node-metrics-cert\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323784 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-sysctl-d\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323818 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-cni-dir\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323890 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-var-lib-kubelet\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323921 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-daemon-config\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323951 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ca0bbe4-ddf4-454f-8dbf-809014fd1062-iptables-alerter-script\") pod \"iptables-alerter-mvtl8\" (UID: \"4ca0bbe4-ddf4-454f-8dbf-809014fd1062\") " pod="openshift-network-operator/iptables-alerter-mvtl8"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.323976 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-system-cni-dir\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324009 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-kubelet\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324044 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-run-netns\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324075 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dc6c48c7-3a5b-4289-8f49-b667f1badbea-agent-certs\") pod \"konnectivity-agent-vqdpz\" (UID: \"dc6c48c7-3a5b-4289-8f49-b667f1badbea\") " pod="kube-system/konnectivity-agent-vqdpz"
Apr 19 12:30:46.325265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dc6c48c7-3a5b-4289-8f49-b667f1badbea-konnectivity-ca\") pod \"konnectivity-agent-vqdpz\" (UID: \"dc6c48c7-3a5b-4289-8f49-b667f1badbea\") " pod="kube-system/konnectivity-agent-vqdpz"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324138 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-kubernetes\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324161 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-hostroot\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324192 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-etc-kubernetes\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324221 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p86vp\" (UniqueName: \"kubernetes.io/projected/4ca0bbe4-ddf4-454f-8dbf-809014fd1062-kube-api-access-p86vp\") pod \"iptables-alerter-mvtl8\" (UID: \"4ca0bbe4-ddf4-454f-8dbf-809014fd1062\") " pod="openshift-network-operator/iptables-alerter-mvtl8"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324252 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-var-lib-openvswitch\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324281 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-node-log\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324308 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-sys\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324338 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-socket-dir-parent\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324370 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqkn\" (UniqueName: \"kubernetes.io/projected/5d31c3f1-1682-400f-9db4-ef1c50b1f94d-kube-api-access-4lqkn\") pod \"node-resolver-w56w7\" (UID: \"5d31c3f1-1682-400f-9db4-ef1c50b1f94d\") " pod="openshift-dns/node-resolver-w56w7"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324399 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-run-ovn\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324428 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-run-ovn-kubernetes\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324461 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324494 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-var-lib-cni-bin\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324523 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-run-multus-certs\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324557 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324588 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-sys-fs\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.326181 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324613 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-run-systemd\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324678 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-sys\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324684 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-os-release\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324738 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-socket-dir-parent\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324777 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-var-lib-cni-bin\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324836 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-run-multus-certs\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324963 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-var-lib-kubelet\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325044 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325166 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324753 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-modprobe-d\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325492 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-cni-binary-copy\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325524 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-sysctl-d\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325592 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-system-cni-dir\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.324838 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-modprobe-d\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325621 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-cni-dir\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325651 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-system-cni-dir\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325696 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-cnibin\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325725 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-hostroot\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.326983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325733 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-run-netns\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325765 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-device-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325769 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-etc-kubernetes\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325803 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-cni-bin\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325872 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dc6c48c7-3a5b-4289-8f49-b667f1badbea-konnectivity-ca\") pod \"konnectivity-agent-vqdpz\" (UID: \"dc6c48c7-3a5b-4289-8f49-b667f1badbea\") " pod="kube-system/konnectivity-agent-vqdpz"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325884 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-system-cni-dir\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325893 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-run-netns\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325892 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-daemon-config\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325934 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-kubernetes\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325954 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b6ad95d-ad05-4a98-a140-cd66c263961c-tmp\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.325977 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-cnibin\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.326472 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-tuned\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.326506 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ca0bbe4-ddf4-454f-8dbf-809014fd1062-iptables-alerter-script\") pod \"iptables-alerter-mvtl8\" (UID: \"4ca0bbe4-ddf4-454f-8dbf-809014fd1062\") " pod="openshift-network-operator/iptables-alerter-mvtl8"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.326547 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.326582 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddcdec9a-0940-4c7c-8298-ee39ccec754e-cni-binary-copy\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.326718 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59537546-a323-4987-9ad2-4ce6e8f679c8-ovnkube-script-lib\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.326755 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ad204b3-eb17-4f25-b1ef-6950791a05cd-serviceca\") pod \"node-ca-wtzrr\" (UID: \"6ad204b3-eb17-4f25-b1ef-6950791a05cd\") " pod="openshift-image-registry/node-ca-wtzrr"
Apr 19 12:30:46.327888 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.326789 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.326823 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-var-lib-kubelet\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327194 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddcdec9a-0940-4c7c-8298-ee39ccec754e-cni-binary-copy\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327305 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxw5r\" (UniqueName: \"kubernetes.io/projected/1b6ad95d-ad05-4a98-a140-cd66c263961c-kube-api-access-qxw5r\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327330 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327276 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ad204b3-eb17-4f25-b1ef-6950791a05cd-serviceca\") pod \"node-ca-wtzrr\" (UID: \"6ad204b3-eb17-4f25-b1ef-6950791a05cd\") " pod="openshift-image-registry/node-ca-wtzrr"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327445 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-var-lib-kubelet\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327385 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327501 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-run-openvswitch\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327541 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxn4d\" (UniqueName: \"kubernetes.io/projected/59537546-a323-4987-9ad2-4ce6e8f679c8-kube-api-access-rxn4d\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327575 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ca0bbe4-ddf4-454f-8dbf-809014fd1062-host-slash\") pod \"iptables-alerter-mvtl8\" (UID: \"4ca0bbe4-ddf4-454f-8dbf-809014fd1062\") " pod="openshift-network-operator/iptables-alerter-mvtl8"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327606 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-registration-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327632 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-systemd-units\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327663 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-etc-openvswitch\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327708 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-log-socket\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327724 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ca0bbe4-ddf4-454f-8dbf-809014fd1062-host-slash\") pod \"iptables-alerter-mvtl8\" (UID: \"4ca0bbe4-ddf4-454f-8dbf-809014fd1062\") " pod="openshift-network-operator/iptables-alerter-mvtl8"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327761 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brw5j\" (UniqueName: \"kubernetes.io/projected/6ad204b3-eb17-4f25-b1ef-6950791a05cd-kube-api-access-brw5j\") pod \"node-ca-wtzrr\" (UID: \"6ad204b3-eb17-4f25-b1ef-6950791a05cd\") " pod="openshift-image-registry/node-ca-wtzrr"
Apr 19 12:30:46.328689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327802 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-run-k8s-cni-cncf-io\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327859 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5d31c3f1-1682-400f-9db4-ef1c50b1f94d-hosts-file\") pod \"node-resolver-w56w7\" (UID: \"5d31c3f1-1682-400f-9db4-ef1c50b1f94d\") " pod="openshift-dns/node-resolver-w56w7"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327894 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-cni-netd\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327927 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlz6k\" (UniqueName: \"kubernetes.io/projected/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-kube-api-access-hlz6k\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327963 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-systemd\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.327996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-host\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328027 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ad204b3-eb17-4f25-b1ef-6950791a05cd-host\") pod \"node-ca-wtzrr\" (UID: \"6ad204b3-eb17-4f25-b1ef-6950791a05cd\") " pod="openshift-image-registry/node-ca-wtzrr"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328078 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ad204b3-eb17-4f25-b1ef-6950791a05cd-host\") pod \"node-ca-wtzrr\" (UID: \"6ad204b3-eb17-4f25-b1ef-6950791a05cd\") " pod="openshift-image-registry/node-ca-wtzrr"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328076 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59537546-a323-4987-9ad2-4ce6e8f679c8-env-overrides\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328129 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-sysconfig\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328183 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-run-k8s-cni-cncf-io\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328246 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-host\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328310 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-systemd\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328344 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-sysctl-conf\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328371 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-run\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328404 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-etc-selinux\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328438 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdgtx\" (UniqueName: \"kubernetes.io/projected/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-kube-api-access-hdgtx\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.329533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328469 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzmb6\" (UniqueName: \"kubernetes.io/projected/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-kube-api-access-zzmb6\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328501 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328532 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-os-release\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328571 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-socket-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328605 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5d31c3f1-1682-400f-9db4-ef1c50b1f94d-tmp-dir\") pod \"node-resolver-w56w7\" (UID:
\"5d31c3f1-1682-400f-9db4-ef1c50b1f94d\") " pod="openshift-dns/node-resolver-w56w7" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328615 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-sysconfig\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328637 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-slash\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328671 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-lib-modules\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328699 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-var-lib-cni-multus\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328729 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-conf-dir\") pod \"multus-bh5jh\" (UID: 
\"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328744 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-sysctl-conf\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328760 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-cnibin\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328801 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b6ad95d-ad05-4a98-a140-cd66c263961c-tmp\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328804 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-run\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328888 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-cnibin\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " 
pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.328966 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-host-var-lib-cni-multus\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.329001 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-os-release\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.329009 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddcdec9a-0940-4c7c-8298-ee39ccec754e-multus-conf-dir\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.330128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.329033 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1b6ad95d-ad05-4a98-a140-cd66c263961c-etc-tuned\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.330919 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.329097 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b6ad95d-ad05-4a98-a140-cd66c263961c-lib-modules\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.330919 
ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.329235 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dc6c48c7-3a5b-4289-8f49-b667f1badbea-agent-certs\") pod \"konnectivity-agent-vqdpz\" (UID: \"dc6c48c7-3a5b-4289-8f49-b667f1badbea\") " pod="kube-system/konnectivity-agent-vqdpz" Apr 19 12:30:46.330919 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.330516 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:46.330919 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.330535 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:46.330919 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.330544 2583 projected.go:194] Error preparing data for projected volume kube-api-access-xstvj for pod openshift-network-diagnostics/network-check-target-v6ph6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:46.330919 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.330608 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj podName:b48a3849-491c-4513-8617-fe3c991aa057 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:46.830582113 +0000 UTC m=+2.068232352 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xstvj" (UniqueName: "kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj") pod "network-check-target-v6ph6" (UID: "b48a3849-491c-4513-8617-fe3c991aa057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:46.332418 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.332363 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjxd\" (UniqueName: \"kubernetes.io/projected/ddcdec9a-0940-4c7c-8298-ee39ccec754e-kube-api-access-psjxd\") pod \"multus-bh5jh\" (UID: \"ddcdec9a-0940-4c7c-8298-ee39ccec754e\") " pod="openshift-multus/multus-bh5jh" Apr 19 12:30:46.332864 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.332827 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p86vp\" (UniqueName: \"kubernetes.io/projected/4ca0bbe4-ddf4-454f-8dbf-809014fd1062-kube-api-access-p86vp\") pod \"iptables-alerter-mvtl8\" (UID: \"4ca0bbe4-ddf4-454f-8dbf-809014fd1062\") " pod="openshift-network-operator/iptables-alerter-mvtl8" Apr 19 12:30:46.335105 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.335081 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlz6k\" (UniqueName: \"kubernetes.io/projected/8c38afde-25f7-4408-bdb5-22a5ea2b4c03-kube-api-access-hlz6k\") pod \"multus-additional-cni-plugins-tgwkw\" (UID: \"8c38afde-25f7-4408-bdb5-22a5ea2b4c03\") " pod="openshift-multus/multus-additional-cni-plugins-tgwkw" Apr 19 12:30:46.335105 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.335095 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxw5r\" (UniqueName: \"kubernetes.io/projected/1b6ad95d-ad05-4a98-a140-cd66c263961c-kube-api-access-qxw5r\") pod \"tuned-b8mhf\" (UID: \"1b6ad95d-ad05-4a98-a140-cd66c263961c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" Apr 19 12:30:46.335253 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.335101 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brw5j\" (UniqueName: \"kubernetes.io/projected/6ad204b3-eb17-4f25-b1ef-6950791a05cd-kube-api-access-brw5j\") pod \"node-ca-wtzrr\" (UID: \"6ad204b3-eb17-4f25-b1ef-6950791a05cd\") " pod="openshift-image-registry/node-ca-wtzrr" Apr 19 12:30:46.349582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.349555 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mvtl8" Apr 19 12:30:46.429272 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429253 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-registration-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" Apr 19 12:30:46.429352 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429280 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-systemd-units\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.429352 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429296 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-etc-openvswitch\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.429352 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429310 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-log-socket\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.429352 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429333 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5d31c3f1-1682-400f-9db4-ef1c50b1f94d-hosts-file\") pod \"node-resolver-w56w7\" (UID: \"5d31c3f1-1682-400f-9db4-ef1c50b1f94d\") " pod="openshift-dns/node-resolver-w56w7" Apr 19 12:30:46.429352 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429353 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-cni-netd\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429369 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59537546-a323-4987-9ad2-4ce6e8f679c8-env-overrides\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429373 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-log-socket\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429386 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-etc-selinux\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429372 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-registration-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429404 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-systemd-units\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429411 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-cni-netd\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429448 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5d31c3f1-1682-400f-9db4-ef1c50b1f94d-hosts-file\") pod \"node-resolver-w56w7\" (UID: \"5d31c3f1-1682-400f-9db4-ef1c50b1f94d\") " pod="openshift-dns/node-resolver-w56w7" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429372 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-etc-openvswitch\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429408 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdgtx\" (UniqueName: \"kubernetes.io/projected/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-kube-api-access-hdgtx\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429490 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-etc-selinux\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" Apr 19 12:30:46.429509 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429493 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzmb6\" (UniqueName: \"kubernetes.io/projected/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-kube-api-access-zzmb6\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429518 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pvnl8\" (UID: 
\"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429546 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429626 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-socket-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429645 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5d31c3f1-1682-400f-9db4-ef1c50b1f94d-tmp-dir\") pod \"node-resolver-w56w7\" (UID: \"5d31c3f1-1682-400f-9db4-ef1c50b1f94d\") " pod="openshift-dns/node-resolver-w56w7" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429667 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-slash\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429685 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429703 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59537546-a323-4987-9ad2-4ce6e8f679c8-ovnkube-config\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429738 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-slash\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429745 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59537546-a323-4987-9ad2-4ce6e8f679c8-ovn-node-metrics-cert\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429739 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-socket-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.429772 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: 
object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429833 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-kubelet\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.429864 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs podName:fb73e6b2-9f0a-4bcf-9371-0d399622fe97 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:46.929824082 +0000 UTC m=+2.167474324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs") pod "network-metrics-daemon-lmdfj" (UID: "fb73e6b2-9f0a-4bcf-9371-0d399622fe97") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429884 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-kubelet\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429892 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-run-netns\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:30:46.429893 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59537546-a323-4987-9ad2-4ce6e8f679c8-env-overrides\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.429957 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5d31c3f1-1682-400f-9db4-ef1c50b1f94d-tmp-dir\") pod \"node-resolver-w56w7\" (UID: \"5d31c3f1-1682-400f-9db4-ef1c50b1f94d\") " pod="openshift-dns/node-resolver-w56w7" Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430096 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-run-netns\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430142 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-var-lib-openvswitch\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430171 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-node-log\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 
12:30:46.430199 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lqkn\" (UniqueName: \"kubernetes.io/projected/5d31c3f1-1682-400f-9db4-ef1c50b1f94d-kube-api-access-4lqkn\") pod \"node-resolver-w56w7\" (UID: \"5d31c3f1-1682-400f-9db4-ef1c50b1f94d\") " pod="openshift-dns/node-resolver-w56w7" Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430218 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-run-ovn\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430229 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-var-lib-openvswitch\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430230 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-node-log\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430253 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-run-ovn-kubernetes\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: 
I0419 12:30:46.430290 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-run-ovn\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430294 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-sys-fs\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430326 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-run-systemd\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430336 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-sys-fs\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430355 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-device-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430359 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-run-ovn-kubernetes\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430375 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-run-systemd\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430410 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-cni-bin\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.430736 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430413 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-device-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.431249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430439 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-host-cni-bin\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.431249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430463 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59537546-a323-4987-9ad2-4ce6e8f679c8-ovnkube-script-lib\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.431249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430494 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.431249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430538 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-run-openvswitch\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.431249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430567 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxn4d\" (UniqueName: \"kubernetes.io/projected/59537546-a323-4987-9ad2-4ce6e8f679c8-kube-api-access-rxn4d\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.431249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430611 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.431249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430628 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59537546-a323-4987-9ad2-4ce6e8f679c8-run-openvswitch\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.431249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430788 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59537546-a323-4987-9ad2-4ce6e8f679c8-ovnkube-config\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.431249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.430913 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59537546-a323-4987-9ad2-4ce6e8f679c8-ovnkube-script-lib\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.431958 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.431940 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59537546-a323-4987-9ad2-4ce6e8f679c8-ovn-node-metrics-cert\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.434546 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.434524 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc07507ea8671ca43efd3e919f4d2efb8.slice/crio-1ef89d8f7feae09f824c754afafa99a6303b321f6c9e6cd5c1334a9e422da798 WatchSource:0}: Error finding container 1ef89d8f7feae09f824c754afafa99a6303b321f6c9e6cd5c1334a9e422da798: Status 404 returned error can't find the container with id 1ef89d8f7feae09f824c754afafa99a6303b321f6c9e6cd5c1334a9e422da798
Apr 19 12:30:46.435035 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.435020 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45b611433db72785c05b5ca89b4fe28f.slice/crio-2a2c56d64995e9cd8b56f3ef11aeeb64e0535fa34a881be03ec622490c372e63 WatchSource:0}: Error finding container 2a2c56d64995e9cd8b56f3ef11aeeb64e0535fa34a881be03ec622490c372e63: Status 404 returned error can't find the container with id 2a2c56d64995e9cd8b56f3ef11aeeb64e0535fa34a881be03ec622490c372e63
Apr 19 12:30:46.439902 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.439883 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzmb6\" (UniqueName: \"kubernetes.io/projected/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-kube-api-access-zzmb6\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:30:46.439984 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.439918 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lqkn\" (UniqueName: \"kubernetes.io/projected/5d31c3f1-1682-400f-9db4-ef1c50b1f94d-kube-api-access-4lqkn\") pod \"node-resolver-w56w7\" (UID: \"5d31c3f1-1682-400f-9db4-ef1c50b1f94d\") " pod="openshift-dns/node-resolver-w56w7"
Apr 19 12:30:46.439984 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.439928 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:30:46.440267 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.440249 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdgtx\" (UniqueName: \"kubernetes.io/projected/fa12d5b3-e8d2-411b-bd24-82d6dc52e085-kube-api-access-hdgtx\") pod \"aws-ebs-csi-driver-node-nr5g9\" (UID: \"fa12d5b3-e8d2-411b-bd24-82d6dc52e085\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.440702 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.440633 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxn4d\" (UniqueName: \"kubernetes.io/projected/59537546-a323-4987-9ad2-4ce6e8f679c8-kube-api-access-rxn4d\") pod \"ovnkube-node-pvnl8\" (UID: \"59537546-a323-4987-9ad2-4ce6e8f679c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.547133 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.547107 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vqdpz"
Apr 19 12:30:46.553137 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.553117 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6c48c7_3a5b_4289_8f49_b667f1badbea.slice/crio-14051b571bae3289a5b566a46a94ba2159a994d29164ee66ed6e27c9a8f62c0c WatchSource:0}: Error finding container 14051b571bae3289a5b566a46a94ba2159a994d29164ee66ed6e27c9a8f62c0c: Status 404 returned error can't find the container with id 14051b571bae3289a5b566a46a94ba2159a994d29164ee66ed6e27c9a8f62c0c
Apr 19 12:30:46.562927 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.562912 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b8mhf"
Apr 19 12:30:46.569096 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.569076 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6ad95d_ad05_4a98_a140_cd66c263961c.slice/crio-4229615588d05445b0057a53dd4b25c1ef706f8a645947fa9f0aee07895083d9 WatchSource:0}: Error finding container 4229615588d05445b0057a53dd4b25c1ef706f8a645947fa9f0aee07895083d9: Status 404 returned error can't find the container with id 4229615588d05445b0057a53dd4b25c1ef706f8a645947fa9f0aee07895083d9
Apr 19 12:30:46.588467 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.588449 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wtzrr"
Apr 19 12:30:46.592032 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.591934 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tgwkw"
Apr 19 12:30:46.594394 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.594370 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ad204b3_eb17_4f25_b1ef_6950791a05cd.slice/crio-2dd78f77f63cfb069d0667cf2c3587651829811ac53c1d20c1aacad2a007b3a3 WatchSource:0}: Error finding container 2dd78f77f63cfb069d0667cf2c3587651829811ac53c1d20c1aacad2a007b3a3: Status 404 returned error can't find the container with id 2dd78f77f63cfb069d0667cf2c3587651829811ac53c1d20c1aacad2a007b3a3
Apr 19 12:30:46.597959 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.597939 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c38afde_25f7_4408_bdb5_22a5ea2b4c03.slice/crio-e56cad309cf2cb6456911cbba477324de67befab2d0cd51080e08faf71d339da WatchSource:0}: Error finding container e56cad309cf2cb6456911cbba477324de67befab2d0cd51080e08faf71d339da: Status 404 returned error can't find the container with id e56cad309cf2cb6456911cbba477324de67befab2d0cd51080e08faf71d339da
Apr 19 12:30:46.621933 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.621912 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bh5jh"
Apr 19 12:30:46.627092 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.627069 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddcdec9a_0940_4c7c_8298_ee39ccec754e.slice/crio-6b6f91b725259209cafe670a9d9aaa0d948e32766dd0d8126f52316dd312c064 WatchSource:0}: Error finding container 6b6f91b725259209cafe670a9d9aaa0d948e32766dd0d8126f52316dd312c064: Status 404 returned error can't find the container with id 6b6f91b725259209cafe670a9d9aaa0d948e32766dd0d8126f52316dd312c064
Apr 19 12:30:46.657819 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.657799 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9"
Apr 19 12:30:46.661694 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.661672 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ca0bbe4_ddf4_454f_8dbf_809014fd1062.slice/crio-c1831d74c5213802e83643a8b8e8d1386372c9a2adf76105b663c06ca14901a2 WatchSource:0}: Error finding container c1831d74c5213802e83643a8b8e8d1386372c9a2adf76105b663c06ca14901a2: Status 404 returned error can't find the container with id c1831d74c5213802e83643a8b8e8d1386372c9a2adf76105b663c06ca14901a2
Apr 19 12:30:46.664703 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.664681 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa12d5b3_e8d2_411b_bd24_82d6dc52e085.slice/crio-e52f9b1a0ea2fef83bcd1f63ad7609e2cd34d91fcb01ee4f9e07a412a3b94403 WatchSource:0}: Error finding container e52f9b1a0ea2fef83bcd1f63ad7609e2cd34d91fcb01ee4f9e07a412a3b94403: Status 404 returned error can't find the container with id e52f9b1a0ea2fef83bcd1f63ad7609e2cd34d91fcb01ee4f9e07a412a3b94403
Apr 19 12:30:46.694453 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.694433 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w56w7"
Apr 19 12:30:46.699126 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.699110 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:30:46.701292 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.701272 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d31c3f1_1682_400f_9db4_ef1c50b1f94d.slice/crio-87b5198bece2d8fe05843bf2b6ee0bc85e8acdc092402de2d90fd7eff192ae16 WatchSource:0}: Error finding container 87b5198bece2d8fe05843bf2b6ee0bc85e8acdc092402de2d90fd7eff192ae16: Status 404 returned error can't find the container with id 87b5198bece2d8fe05843bf2b6ee0bc85e8acdc092402de2d90fd7eff192ae16
Apr 19 12:30:46.705245 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:30:46.705225 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59537546_a323_4987_9ad2_4ce6e8f679c8.slice/crio-34f21989f7fe67f7d6a900021dcd8ae7701ecffecb938e3bd97032c7faa38c91 WatchSource:0}: Error finding container 34f21989f7fe67f7d6a900021dcd8ae7701ecffecb938e3bd97032c7faa38c91: Status 404 returned error can't find the container with id 34f21989f7fe67f7d6a900021dcd8ae7701ecffecb938e3bd97032c7faa38c91
Apr 19 12:30:46.833953 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.833925 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:30:46.834099 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.834085 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:30:46.834142 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.834108 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:30:46.834142 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.834123 2583 projected.go:194] Error preparing data for projected volume kube-api-access-xstvj for pod openshift-network-diagnostics/network-check-target-v6ph6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:46.834201 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.834177 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj podName:b48a3849-491c-4513-8617-fe3c991aa057 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:47.834159232 +0000 UTC m=+3.071809485 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xstvj" (UniqueName: "kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj") pod "network-check-target-v6ph6" (UID: "b48a3849-491c-4513-8617-fe3c991aa057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:46.935815 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:46.935239 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:30:46.935815 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.935372 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:46.935815 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:46.935426 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs podName:fb73e6b2-9f0a-4bcf-9371-0d399622fe97 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:47.935409686 +0000 UTC m=+3.173059931 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs") pod "network-metrics-daemon-lmdfj" (UID: "fb73e6b2-9f0a-4bcf-9371-0d399622fe97") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:47.146729 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.146701 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:47.260108 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.259982 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:25:46 +0000 UTC" deadline="2027-09-24 00:46:55.307353585 +0000 UTC"
Apr 19 12:30:47.260108 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.260015 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12540h16m8.04734347s"
Apr 19 12:30:47.349115 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.349055 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w56w7" event={"ID":"5d31c3f1-1682-400f-9db4-ef1c50b1f94d","Type":"ContainerStarted","Data":"87b5198bece2d8fe05843bf2b6ee0bc85e8acdc092402de2d90fd7eff192ae16"}
Apr 19 12:30:47.366727 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.366692 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mvtl8" event={"ID":"4ca0bbe4-ddf4-454f-8dbf-809014fd1062","Type":"ContainerStarted","Data":"c1831d74c5213802e83643a8b8e8d1386372c9a2adf76105b663c06ca14901a2"}
Apr 19 12:30:47.387236 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.387201 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" event={"ID":"8c38afde-25f7-4408-bdb5-22a5ea2b4c03","Type":"ContainerStarted","Data":"e56cad309cf2cb6456911cbba477324de67befab2d0cd51080e08faf71d339da"}
Apr 19 12:30:47.417463 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.417429 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal" event={"ID":"45b611433db72785c05b5ca89b4fe28f","Type":"ContainerStarted","Data":"2a2c56d64995e9cd8b56f3ef11aeeb64e0535fa34a881be03ec622490c372e63"}
Apr 19 12:30:47.439005 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.438950 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" event={"ID":"fa12d5b3-e8d2-411b-bd24-82d6dc52e085","Type":"ContainerStarted","Data":"e52f9b1a0ea2fef83bcd1f63ad7609e2cd34d91fcb01ee4f9e07a412a3b94403"}
Apr 19 12:30:47.447785 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.447715 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bh5jh" event={"ID":"ddcdec9a-0940-4c7c-8298-ee39ccec754e","Type":"ContainerStarted","Data":"6b6f91b725259209cafe670a9d9aaa0d948e32766dd0d8126f52316dd312c064"}
Apr 19 12:30:47.481117 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.481087 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wtzrr" event={"ID":"6ad204b3-eb17-4f25-b1ef-6950791a05cd","Type":"ContainerStarted","Data":"2dd78f77f63cfb069d0667cf2c3587651829811ac53c1d20c1aacad2a007b3a3"}
Apr 19 12:30:47.496935 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.496898 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" event={"ID":"1b6ad95d-ad05-4a98-a140-cd66c263961c","Type":"ContainerStarted","Data":"4229615588d05445b0057a53dd4b25c1ef706f8a645947fa9f0aee07895083d9"}
Apr 19 12:30:47.512891 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.512810 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vqdpz" event={"ID":"dc6c48c7-3a5b-4289-8f49-b667f1badbea","Type":"ContainerStarted","Data":"14051b571bae3289a5b566a46a94ba2159a994d29164ee66ed6e27c9a8f62c0c"}
Apr 19 12:30:47.524294 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.524266 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal" event={"ID":"c07507ea8671ca43efd3e919f4d2efb8","Type":"ContainerStarted","Data":"1ef89d8f7feae09f824c754afafa99a6303b321f6c9e6cd5c1334a9e422da798"}
Apr 19 12:30:47.537441 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.537417 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerStarted","Data":"34f21989f7fe67f7d6a900021dcd8ae7701ecffecb938e3bd97032c7faa38c91"}
Apr 19 12:30:47.560907 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.560879 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:47.595280 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.594882 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:30:47.843005 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.842305 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:30:47.843005 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:47.842476 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:30:47.843005 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:47.842494 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:30:47.843005 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:47.842507 2583 projected.go:194] Error preparing data for projected volume kube-api-access-xstvj for pod openshift-network-diagnostics/network-check-target-v6ph6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:47.843005 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:47.842569 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj podName:b48a3849-491c-4513-8617-fe3c991aa057 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:49.84254991 +0000 UTC m=+5.080200155 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xstvj" (UniqueName: "kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj") pod "network-check-target-v6ph6" (UID: "b48a3849-491c-4513-8617-fe3c991aa057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:47.943806 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:47.943475 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:30:47.943806 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:47.943576 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:47.943806 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:47.943632 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs podName:fb73e6b2-9f0a-4bcf-9371-0d399622fe97 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:49.943618772 +0000 UTC m=+5.181269012 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs") pod "network-metrics-daemon-lmdfj" (UID: "fb73e6b2-9f0a-4bcf-9371-0d399622fe97") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:48.261104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:48.261016 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:25:46 +0000 UTC" deadline="2027-10-24 19:36:24.190884533 +0000 UTC"
Apr 19 12:30:48.261104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:48.261053 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13279h5m35.929834224s"
Apr 19 12:30:48.323896 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:48.323557 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:30:48.323896 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:48.323679 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:30:48.324328 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:48.324152 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:30:48.324328 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:48.324263 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97"
Apr 19 12:30:49.860051 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:49.860012 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:30:49.860493 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:49.860190 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:30:49.860493 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:49.860211 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:30:49.860493 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:49.860225 2583 projected.go:194] Error preparing data for projected volume kube-api-access-xstvj for pod openshift-network-diagnostics/network-check-target-v6ph6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:49.860493 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:49.860279 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj podName:b48a3849-491c-4513-8617-fe3c991aa057 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:53.860260846 +0000 UTC m=+9.097911091 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xstvj" (UniqueName: "kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj") pod "network-check-target-v6ph6" (UID: "b48a3849-491c-4513-8617-fe3c991aa057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:30:49.960817 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:49.960781 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:30:49.961015 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:49.960935 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:49.961015 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:49.960997 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs podName:fb73e6b2-9f0a-4bcf-9371-0d399622fe97 nodeName:}" failed. No retries permitted until 2026-04-19 12:30:53.96097865 +0000 UTC m=+9.198628904 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs") pod "network-metrics-daemon-lmdfj" (UID: "fb73e6b2-9f0a-4bcf-9371-0d399622fe97") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:30:50.323973 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:50.323897 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:30:50.324124 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:50.323918 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:30:50.324124 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:50.324029 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97"
Apr 19 12:30:50.324124 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:50.324068 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:30:52.323152 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:52.323120 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:30:52.323620 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:52.323172 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:30:52.323620 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:52.323278 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:30:52.323731 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:52.323701 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97" Apr 19 12:30:53.896198 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:53.896165 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:30:53.896598 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:53.896312 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:30:53.896598 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:53.896327 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:30:53.896598 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:53.896336 2583 projected.go:194] Error preparing data for projected volume kube-api-access-xstvj for pod openshift-network-diagnostics/network-check-target-v6ph6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:53.896598 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:53.896387 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj podName:b48a3849-491c-4513-8617-fe3c991aa057 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:01.896373674 +0000 UTC m=+17.134023914 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xstvj" (UniqueName: "kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj") pod "network-check-target-v6ph6" (UID: "b48a3849-491c-4513-8617-fe3c991aa057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:30:53.996788 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:53.996729 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:30:53.996992 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:53.996937 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:53.997059 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:53.997002 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs podName:fb73e6b2-9f0a-4bcf-9371-0d399622fe97 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:01.996982202 +0000 UTC m=+17.234632447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs") pod "network-metrics-daemon-lmdfj" (UID: "fb73e6b2-9f0a-4bcf-9371-0d399622fe97") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:30:54.323132 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:54.323061 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:30:54.323266 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:54.323186 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97" Apr 19 12:30:54.323565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:54.323547 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:30:54.323674 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:54.323650 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057" Apr 19 12:30:56.323985 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:56.323942 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:30:56.323985 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:56.323968 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:30:56.324429 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:56.324074 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97" Apr 19 12:30:56.324429 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:56.324198 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057" Apr 19 12:30:58.322998 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:58.322963 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:30:58.323432 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:30:58.322963 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:30:58.323432 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:58.323093 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057" Apr 19 12:30:58.323432 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:30:58.323137 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97" Apr 19 12:31:00.323184 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:00.323101 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:31:00.323184 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:00.323138 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:31:00.323592 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:00.323214 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97" Apr 19 12:31:00.323592 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:00.323333 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057" Apr 19 12:31:01.956378 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:01.956347 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:31:01.956884 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:01.956501 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:31:01.956884 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:01.956524 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:31:01.956884 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:01.956540 2583 projected.go:194] Error preparing data for projected volume kube-api-access-xstvj for pod openshift-network-diagnostics/network-check-target-v6ph6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:31:01.956884 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:01.956605 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj podName:b48a3849-491c-4513-8617-fe3c991aa057 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:17.956585663 +0000 UTC m=+33.194235901 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xstvj" (UniqueName: "kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj") pod "network-check-target-v6ph6" (UID: "b48a3849-491c-4513-8617-fe3c991aa057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:31:02.057389 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:02.057356 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:31:02.057599 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:02.057534 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:31:02.057599 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:02.057596 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs podName:fb73e6b2-9f0a-4bcf-9371-0d399622fe97 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:18.057575928 +0000 UTC m=+33.295226166 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs") pod "network-metrics-daemon-lmdfj" (UID: "fb73e6b2-9f0a-4bcf-9371-0d399622fe97") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:31:02.323667 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:02.323597 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:31:02.323942 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:02.323598 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:31:02.323942 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:02.323726 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97" Apr 19 12:31:02.323942 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:02.323830 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057" Apr 19 12:31:04.323443 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:04.323259 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:31:04.323819 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:04.323320 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:31:04.323819 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:04.323522 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97" Apr 19 12:31:04.323819 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:04.323584 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057" Apr 19 12:31:05.584410 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.584382 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log" Apr 19 12:31:05.585447 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.585232 2583 generic.go:358] "Generic (PLEG): container finished" podID="59537546-a323-4987-9ad2-4ce6e8f679c8" containerID="9cae1e5742ac5cb1cda9fae929cad14b900183f27e3150c0455fb7567ac4737e" exitCode=1 Apr 19 12:31:05.585447 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.585315 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerStarted","Data":"89f46d1912ea8674a05c8ed35574cb1eddeb82fc672203d760ef87e7418d15ae"} Apr 19 12:31:05.585447 ip-10-0-140-194 kubenswrapper[2583]: I0419 
12:31:05.585343 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerStarted","Data":"47e364fbd72fc74c801d3973c0faa2ecda8cf0b9c0c3f28f629a66d6ad1b6678"} Apr 19 12:31:05.585447 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.585357 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerStarted","Data":"82ec3798737bccb42d9dcf00576abbec94bf5559f72113b5b3a3b413aa9a0e6d"} Apr 19 12:31:05.585447 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.585373 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerStarted","Data":"9da9a56a28e96448b60662f25f47bf957e49bedeae6500d819daf60a033a4899"} Apr 19 12:31:05.585447 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.585387 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerDied","Data":"9cae1e5742ac5cb1cda9fae929cad14b900183f27e3150c0455fb7567ac4737e"} Apr 19 12:31:05.585447 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.585402 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerStarted","Data":"61d7ba077ded497dd6f402008f2bd6356afb9bce641419334ceba42ac45ae2d7"} Apr 19 12:31:05.600251 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.599781 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal" event={"ID":"45b611433db72785c05b5ca89b4fe28f","Type":"ContainerStarted","Data":"d07c2892c59470a627a470fb6d318ca51765b1ab5509af4195e13e6239848ce3"} Apr 19 
12:31:05.603370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.603331 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bh5jh" event={"ID":"ddcdec9a-0940-4c7c-8298-ee39ccec754e","Type":"ContainerStarted","Data":"6196f85ddd7c32468e67ac6cb0235ed63021840b7b11d3c0766fa0057922a8c3"} Apr 19 12:31:05.604913 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.604891 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" event={"ID":"1b6ad95d-ad05-4a98-a140-cd66c263961c","Type":"ContainerStarted","Data":"96536e95ce54f588405c34fe5cc9fc9db92742f417f6ed7b3a8bc0436af45e5b"} Apr 19 12:31:05.612217 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.612167 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-194.ec2.internal" podStartSLOduration=20.61215551 podStartE2EDuration="20.61215551s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:31:05.611648283 +0000 UTC m=+20.849298545" watchObservedRunningTime="2026-04-19 12:31:05.61215551 +0000 UTC m=+20.849805772" Apr 19 12:31:05.640497 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:05.640454 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-b8mhf" podStartSLOduration=2.434730988 podStartE2EDuration="20.640442126s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:30:46.570420089 +0000 UTC m=+1.808070331" lastFinishedPulling="2026-04-19 12:31:04.77613123 +0000 UTC m=+20.013781469" observedRunningTime="2026-04-19 12:31:05.625289088 +0000 UTC m=+20.862939349" watchObservedRunningTime="2026-04-19 12:31:05.640442126 +0000 UTC m=+20.878092432" Apr 19 12:31:05.640651 ip-10-0-140-194 kubenswrapper[2583]: I0419 
12:31:05.640622 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bh5jh" podStartSLOduration=2.477577085 podStartE2EDuration="20.640617758s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:30:46.628712988 +0000 UTC m=+1.866363228" lastFinishedPulling="2026-04-19 12:31:04.791753658 +0000 UTC m=+20.029403901" observedRunningTime="2026-04-19 12:31:05.639945695 +0000 UTC m=+20.877595956" watchObservedRunningTime="2026-04-19 12:31:05.640617758 +0000 UTC m=+20.878268018" Apr 19 12:31:06.323469 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.323232 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:31:06.323676 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.323261 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:31:06.323676 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:06.323580 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057" Apr 19 12:31:06.323676 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:06.323635 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97" Apr 19 12:31:06.608159 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.608131 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wtzrr" event={"ID":"6ad204b3-eb17-4f25-b1ef-6950791a05cd","Type":"ContainerStarted","Data":"9a36fd16fb40fff2a9bcd0cce137a6e5fd255d9be84afc57301b9ab245f18e03"} Apr 19 12:31:06.609656 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.609620 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vqdpz" event={"ID":"dc6c48c7-3a5b-4289-8f49-b667f1badbea","Type":"ContainerStarted","Data":"fd02ba18fff88840101ae48614c8bb6179462391a3b88dacc1d6bcc596b9ae5f"} Apr 19 12:31:06.611158 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.611137 2583 generic.go:358] "Generic (PLEG): container finished" podID="c07507ea8671ca43efd3e919f4d2efb8" containerID="6457f3c01fe1d9bb96d1f41be2a6e0b86405b7c26083d0e5aabda26887ec60cf" exitCode=0 Apr 19 12:31:06.611242 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.611207 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal" event={"ID":"c07507ea8671ca43efd3e919f4d2efb8","Type":"ContainerDied","Data":"6457f3c01fe1d9bb96d1f41be2a6e0b86405b7c26083d0e5aabda26887ec60cf"} Apr 19 12:31:06.612995 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.612967 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w56w7" event={"ID":"5d31c3f1-1682-400f-9db4-ef1c50b1f94d","Type":"ContainerStarted","Data":"f81955e64d1b5b8c568685a9472d445e8bab3e7564bf5d232f81645f5bfc31e7"} Apr 19 12:31:06.614337 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.614312 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mvtl8" 
event={"ID":"4ca0bbe4-ddf4-454f-8dbf-809014fd1062","Type":"ContainerStarted","Data":"29ae8f3fb8308f04fde53ae88ba8729c95f0dbf36c5b3b83f2a2ac45b66f7ebf"} Apr 19 12:31:06.615744 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.615717 2583 generic.go:358] "Generic (PLEG): container finished" podID="8c38afde-25f7-4408-bdb5-22a5ea2b4c03" containerID="da36061db4ff6d4b21c63462eb09ce716662865545c98ce9aa4915955b585954" exitCode=0 Apr 19 12:31:06.615827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.615796 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" event={"ID":"8c38afde-25f7-4408-bdb5-22a5ea2b4c03","Type":"ContainerDied","Data":"da36061db4ff6d4b21c63462eb09ce716662865545c98ce9aa4915955b585954"} Apr 19 12:31:06.618042 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.618020 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" event={"ID":"fa12d5b3-e8d2-411b-bd24-82d6dc52e085","Type":"ContainerStarted","Data":"6ff5dd20b6479af79be6b7f2b8a6d5b7327d11c37e40b8e582dfa7386c6a5f05"} Apr 19 12:31:06.621103 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.621061 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wtzrr" podStartSLOduration=3.456139147 podStartE2EDuration="21.621033017s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:30:46.596275839 +0000 UTC m=+1.833926078" lastFinishedPulling="2026-04-19 12:31:04.761169702 +0000 UTC m=+19.998819948" observedRunningTime="2026-04-19 12:31:06.620752769 +0000 UTC m=+21.858403032" watchObservedRunningTime="2026-04-19 12:31:06.621033017 +0000 UTC m=+21.858683280" Apr 19 12:31:06.633457 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.633413 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vqdpz" podStartSLOduration=3.4628986 
podStartE2EDuration="21.633398987s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:30:46.554614102 +0000 UTC m=+1.792264345" lastFinishedPulling="2026-04-19 12:31:04.725114493 +0000 UTC m=+19.962764732" observedRunningTime="2026-04-19 12:31:06.632731774 +0000 UTC m=+21.870382035" watchObservedRunningTime="2026-04-19 12:31:06.633398987 +0000 UTC m=+21.871049249"
Apr 19 12:31:06.664471 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.664425 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mvtl8" podStartSLOduration=3.542881069 podStartE2EDuration="21.664410688s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:30:46.66313808 +0000 UTC m=+1.900788319" lastFinishedPulling="2026-04-19 12:31:04.784667699 +0000 UTC m=+20.022317938" observedRunningTime="2026-04-19 12:31:06.664307934 +0000 UTC m=+21.901958196" watchObservedRunningTime="2026-04-19 12:31:06.664410688 +0000 UTC m=+21.902060950"
Apr 19 12:31:06.692735 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.692685 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w56w7" podStartSLOduration=3.618564186 podStartE2EDuration="21.692669499s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:30:46.702785227 +0000 UTC m=+1.940435470" lastFinishedPulling="2026-04-19 12:31:04.776890526 +0000 UTC m=+20.014540783" observedRunningTime="2026-04-19 12:31:06.692007338 +0000 UTC m=+21.929657613" watchObservedRunningTime="2026-04-19 12:31:06.692669499 +0000 UTC m=+21.930319762"
Apr 19 12:31:06.699426 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:06.699406 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 19 12:31:07.284446 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:07.284351 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-19T12:31:06.699422937Z","UUID":"d34097f5-17f8-4f57-bdcb-06768a2d545b","Handler":null,"Name":"","Endpoint":""}
Apr 19 12:31:07.286008 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:07.285986 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 19 12:31:07.286008 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:07.286012 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 19 12:31:07.623026 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:07.622815 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" event={"ID":"fa12d5b3-e8d2-411b-bd24-82d6dc52e085","Type":"ContainerStarted","Data":"6dc60db557666c8bcaafe88fd22a6efc51e9e8a064189464679c7f9923a09800"}
Apr 19 12:31:07.625226 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:07.625197 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal" event={"ID":"c07507ea8671ca43efd3e919f4d2efb8","Type":"ContainerStarted","Data":"2d4bf2f47df948c3d60a0e00853dbe26452b59e58c4d4ad0154e5fa1446d1ef0"}
Apr 19 12:31:07.629891 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:07.629837 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:31:07.630351 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:07.630328 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerStarted","Data":"36b4d9d51ac1d9fc390fe698d0973caf1860297fa5a327f4efcda9a19c07d8f2"}
Apr 19 12:31:08.323731 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:08.323698 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:31:08.323964 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:08.323707 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:31:08.323964 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:08.323828 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:31:08.323964 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:08.323930 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97"
Apr 19 12:31:08.634410 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:08.634326 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" event={"ID":"fa12d5b3-e8d2-411b-bd24-82d6dc52e085","Type":"ContainerStarted","Data":"026de30fc997ceedf61fef229cb6f92e90809ba5a491807484d207c01f8e6c76"}
Apr 19 12:31:08.650746 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:08.650689 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nr5g9" podStartSLOduration=2.793257238 podStartE2EDuration="23.650670492s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:30:46.66598411 +0000 UTC m=+1.903634349" lastFinishedPulling="2026-04-19 12:31:07.523397363 +0000 UTC m=+22.761047603" observedRunningTime="2026-04-19 12:31:08.650343855 +0000 UTC m=+23.887994126" watchObservedRunningTime="2026-04-19 12:31:08.650670492 +0000 UTC m=+23.888320755"
Apr 19 12:31:08.650919 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:08.650880 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-194.ec2.internal" podStartSLOduration=23.650873129 podStartE2EDuration="23.650873129s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:31:07.638343488 +0000 UTC m=+22.875993750" watchObservedRunningTime="2026-04-19 12:31:08.650873129 +0000 UTC m=+23.888523386"
Apr 19 12:31:09.639360 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:09.639191 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:31:10.246281 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.246247 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vqdpz"
Apr 19 12:31:10.246937 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.246912 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vqdpz"
Apr 19 12:31:10.323533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.323499 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:31:10.323676 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.323499 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:31:10.323676 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:10.323611 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:31:10.323791 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:10.323730 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97"
Apr 19 12:31:10.645465 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.645397 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:31:10.646225 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.645811 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerStarted","Data":"1684b74361a07b4261ffde505ab1525dca83e9b60eebedd007e6fef8379f22de"}
Apr 19 12:31:10.646225 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.646071 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vqdpz"
Apr 19 12:31:10.646348 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.646332 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:31:10.646391 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.646355 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:31:10.646391 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.646369 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:31:10.646473 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.646411 2583 scope.go:117] "RemoveContainer" containerID="9cae1e5742ac5cb1cda9fae929cad14b900183f27e3150c0455fb7567ac4737e"
Apr 19 12:31:10.646831 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.646813 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vqdpz"
Apr 19 12:31:10.664239 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.664218 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:31:10.664349 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:10.664282 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8"
Apr 19 12:31:11.576114 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:11.575923 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v6ph6"]
Apr 19 12:31:11.576221 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:11.576202 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:31:11.576309 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:11.576292 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:31:11.578389 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:11.578366 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lmdfj"]
Apr 19 12:31:11.578480 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:11.578448 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:31:11.578537 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:11.578520 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97"
Apr 19 12:31:11.650533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:11.650442 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:31:11.650980 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:11.650761 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" event={"ID":"59537546-a323-4987-9ad2-4ce6e8f679c8","Type":"ContainerStarted","Data":"b5d69a008bb307d21d67d4f6682ff3800f42566f3a9ac2d1568dcf8e3936796b"}
Apr 19 12:31:11.674719 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:11.674655 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" podStartSLOduration=8.570044538 podStartE2EDuration="26.674643626s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:30:46.706662457 +0000 UTC m=+1.944312696" lastFinishedPulling="2026-04-19 12:31:04.811261545 +0000 UTC m=+20.048911784" observedRunningTime="2026-04-19 12:31:11.674233375 +0000 UTC m=+26.911883657" watchObservedRunningTime="2026-04-19 12:31:11.674643626 +0000 UTC m=+26.912293881"
Apr 19 12:31:12.653685 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:12.653656 2583 generic.go:358] "Generic (PLEG): container finished" podID="8c38afde-25f7-4408-bdb5-22a5ea2b4c03" containerID="2022e1b2fb5131462ea985d19c4b4d2602291a32dfb2d6b0e9390ea8338f10a1" exitCode=0
Apr 19 12:31:12.654089 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:12.653747 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" event={"ID":"8c38afde-25f7-4408-bdb5-22a5ea2b4c03","Type":"ContainerDied","Data":"2022e1b2fb5131462ea985d19c4b4d2602291a32dfb2d6b0e9390ea8338f10a1"}
Apr 19 12:31:13.322951 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:13.322919 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:31:13.323220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:13.322919 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:31:13.323220 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:13.323053 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97"
Apr 19 12:31:13.323220 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:13.323096 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:31:14.659210 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:14.659132 2583 generic.go:358] "Generic (PLEG): container finished" podID="8c38afde-25f7-4408-bdb5-22a5ea2b4c03" containerID="55222a7038fbc9b76900c54eb630c1736b88a50363462d7c760e0761b0b063a9" exitCode=0
Apr 19 12:31:14.659210 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:14.659186 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" event={"ID":"8c38afde-25f7-4408-bdb5-22a5ea2b4c03","Type":"ContainerDied","Data":"55222a7038fbc9b76900c54eb630c1736b88a50363462d7c760e0761b0b063a9"}
Apr 19 12:31:15.323703 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:15.323664 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:31:15.323922 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:15.323750 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:31:15.323922 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:15.323867 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:31:15.324048 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:15.324007 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97"
Apr 19 12:31:16.664549 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:16.664372 2583 generic.go:358] "Generic (PLEG): container finished" podID="8c38afde-25f7-4408-bdb5-22a5ea2b4c03" containerID="9b17f89fefc8ab47794e423435979cbf3763dd4c0a5e539a1c6835f4861ca4f6" exitCode=0
Apr 19 12:31:16.664976 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:16.664462 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" event={"ID":"8c38afde-25f7-4408-bdb5-22a5ea2b4c03","Type":"ContainerDied","Data":"9b17f89fefc8ab47794e423435979cbf3763dd4c0a5e539a1c6835f4861ca4f6"}
Apr 19 12:31:17.326907 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:17.326878 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:31:17.327093 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:17.326883 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:31:17.327093 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:17.327004 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6ph6" podUID="b48a3849-491c-4513-8617-fe3c991aa057"
Apr 19 12:31:17.327093 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:17.327063 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97"
Apr 19 12:31:17.979711 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:17.979676 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:31:17.980100 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:17.979822 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:31:17.980100 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:17.979841 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:31:17.980100 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:17.979874 2583 projected.go:194] Error preparing data for projected volume kube-api-access-xstvj for pod openshift-network-diagnostics/network-check-target-v6ph6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:31:17.980100 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:17.979933 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj podName:b48a3849-491c-4513-8617-fe3c991aa057 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:49.979914716 +0000 UTC m=+65.217564959 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-xstvj" (UniqueName: "kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj") pod "network-check-target-v6ph6" (UID: "b48a3849-491c-4513-8617-fe3c991aa057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:31:18.076336 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.076306 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-194.ec2.internal" event="NodeReady"
Apr 19 12:31:18.076500 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.076444 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 19 12:31:18.080692 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.080669 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:31:18.080832 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.080813 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:31:18.080910 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.080899 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs podName:fb73e6b2-9f0a-4bcf-9371-0d399622fe97 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:50.080883377 +0000 UTC m=+65.318533620 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs") pod "network-metrics-daemon-lmdfj" (UID: "fb73e6b2-9f0a-4bcf-9371-0d399622fe97") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:31:18.123082 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.123053 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z7glk"]
Apr 19 12:31:18.148481 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.148458 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vlsjm"]
Apr 19 12:31:18.148646 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.148629 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.150905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.150888 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-62vx7\""
Apr 19 12:31:18.151008 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.150913 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 19 12:31:18.151008 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.150925 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 19 12:31:18.167950 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.167933 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vlsjm"]
Apr 19 12:31:18.167950 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.167953 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z7glk"]
Apr 19 12:31:18.168072 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.168028 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:31:18.171299 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.171257 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 19 12:31:18.171299 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.171271 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 19 12:31:18.171472 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.171356 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 19 12:31:18.171643 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.171627 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xbkh5\""
Apr 19 12:31:18.282319 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.282241 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d466489-3e13-462c-b90d-4b13b586caae-config-volume\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.282452 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.282316 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwnl\" (UniqueName: \"kubernetes.io/projected/e9a634b4-9352-4e62-926c-390e9b19d228-kube-api-access-5xwnl\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:31:18.282452 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.282378 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzcvk\" (UniqueName: \"kubernetes.io/projected/6d466489-3e13-462c-b90d-4b13b586caae-kube-api-access-bzcvk\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.282452 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.282419 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:31:18.282570 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.282454 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d466489-3e13-462c-b90d-4b13b586caae-tmp-dir\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.282570 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.282479 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.382927 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.382889 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d466489-3e13-462c-b90d-4b13b586caae-config-volume\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.383137 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.382995 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwnl\" (UniqueName: \"kubernetes.io/projected/e9a634b4-9352-4e62-926c-390e9b19d228-kube-api-access-5xwnl\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:31:18.383137 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.383023 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzcvk\" (UniqueName: \"kubernetes.io/projected/6d466489-3e13-462c-b90d-4b13b586caae-kube-api-access-bzcvk\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.383137 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.383047 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:31:18.383137 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.383080 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d466489-3e13-462c-b90d-4b13b586caae-tmp-dir\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.383137 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.383108 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.383331 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.383191 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:18.383331 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.383189 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:18.383331 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.383245 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert podName:e9a634b4-9352-4e62-926c-390e9b19d228 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:18.883224805 +0000 UTC m=+34.120875052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert") pod "ingress-canary-vlsjm" (UID: "e9a634b4-9352-4e62-926c-390e9b19d228") : secret "canary-serving-cert" not found
Apr 19 12:31:18.383331 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.383264 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls podName:6d466489-3e13-462c-b90d-4b13b586caae nodeName:}" failed. No retries permitted until 2026-04-19 12:31:18.88325544 +0000 UTC m=+34.120905685 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls") pod "dns-default-z7glk" (UID: "6d466489-3e13-462c-b90d-4b13b586caae") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:18.383529 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.383495 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d466489-3e13-462c-b90d-4b13b586caae-tmp-dir\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.383565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.383544 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d466489-3e13-462c-b90d-4b13b586caae-config-volume\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.395436 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.395408 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzcvk\" (UniqueName: \"kubernetes.io/projected/6d466489-3e13-462c-b90d-4b13b586caae-kube-api-access-bzcvk\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.395558 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.395411 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwnl\" (UniqueName: \"kubernetes.io/projected/e9a634b4-9352-4e62-926c-390e9b19d228-kube-api-access-5xwnl\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:31:18.885807 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.885754 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:18.886015 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:18.885894 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:31:18.886015 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.885970 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:18.886015 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.885999 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:18.886154 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.886051 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls podName:6d466489-3e13-462c-b90d-4b13b586caae nodeName:}" failed. No retries permitted until 2026-04-19 12:31:19.886030288 +0000 UTC m=+35.123680532 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls") pod "dns-default-z7glk" (UID: "6d466489-3e13-462c-b90d-4b13b586caae") : secret "dns-default-metrics-tls" not found
Apr 19 12:31:18.886154 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:18.886071 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert podName:e9a634b4-9352-4e62-926c-390e9b19d228 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:19.886061241 +0000 UTC m=+35.123711485 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert") pod "ingress-canary-vlsjm" (UID: "e9a634b4-9352-4e62-926c-390e9b19d228") : secret "canary-serving-cert" not found
Apr 19 12:31:19.327275 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:19.327240 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6"
Apr 19 12:31:19.327955 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:19.327242 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:31:19.329833 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:19.329807 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 19 12:31:19.329991 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:19.329837 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 19 12:31:19.329991 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:19.329861 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nbhvx\""
Apr 19 12:31:19.329991 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:19.329905 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 19 12:31:19.329991 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:19.329840 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4sh8l\""
Apr 19 12:31:19.893997 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:19.893970 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:31:19.894183 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:19.894007 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:31:19.894183 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:19.894133 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:31:19.894299 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:19.894133 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:31:19.894299 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:19.894201 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert podName:e9a634b4-9352-4e62-926c-390e9b19d228 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:21.894181844 +0000 UTC m=+37.131832083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert") pod "ingress-canary-vlsjm" (UID: "e9a634b4-9352-4e62-926c-390e9b19d228") : secret "canary-serving-cert" not found
Apr 19 12:31:19.894299 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:19.894273 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls podName:6d466489-3e13-462c-b90d-4b13b586caae nodeName:}" failed.
No retries permitted until 2026-04-19 12:31:21.894252653 +0000 UTC m=+37.131902904 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls") pod "dns-default-z7glk" (UID: "6d466489-3e13-462c-b90d-4b13b586caae") : secret "dns-default-metrics-tls" not found Apr 19 12:31:20.983921 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:20.983881 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs"] Apr 19 12:31:21.014496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.014472 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs"] Apr 19 12:31:21.014654 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.014584 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" Apr 19 12:31:21.017554 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.017529 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 19 12:31:21.017682 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.017562 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 19 12:31:21.017682 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.017588 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-2f4cg\"" Apr 19 12:31:21.017682 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.017529 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 19 12:31:21.017877 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.017837 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 19 12:31:21.105054 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.105011 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/133176e9-8b1c-4614-bf03-e320af8ca89e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bfb848955-zdmbs\" (UID: \"133176e9-8b1c-4614-bf03-e320af8ca89e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" Apr 19 12:31:21.105230 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.105140 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbsr\" (UniqueName: \"kubernetes.io/projected/133176e9-8b1c-4614-bf03-e320af8ca89e-kube-api-access-fwbsr\") pod \"managed-serviceaccount-addon-agent-7bfb848955-zdmbs\" (UID: \"133176e9-8b1c-4614-bf03-e320af8ca89e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" Apr 19 12:31:21.205955 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.205914 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbsr\" (UniqueName: \"kubernetes.io/projected/133176e9-8b1c-4614-bf03-e320af8ca89e-kube-api-access-fwbsr\") pod \"managed-serviceaccount-addon-agent-7bfb848955-zdmbs\" (UID: \"133176e9-8b1c-4614-bf03-e320af8ca89e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" Apr 19 12:31:21.206133 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.206004 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/133176e9-8b1c-4614-bf03-e320af8ca89e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bfb848955-zdmbs\" (UID: \"133176e9-8b1c-4614-bf03-e320af8ca89e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" Apr 19 12:31:21.209363 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.209334 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/133176e9-8b1c-4614-bf03-e320af8ca89e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bfb848955-zdmbs\" (UID: \"133176e9-8b1c-4614-bf03-e320af8ca89e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" Apr 19 12:31:21.216792 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.216769 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbsr\" (UniqueName: \"kubernetes.io/projected/133176e9-8b1c-4614-bf03-e320af8ca89e-kube-api-access-fwbsr\") pod \"managed-serviceaccount-addon-agent-7bfb848955-zdmbs\" (UID: \"133176e9-8b1c-4614-bf03-e320af8ca89e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" Apr 19 12:31:21.337109 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.337035 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" Apr 19 12:31:21.911840 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.911804 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm" Apr 19 12:31:21.912036 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:21.911874 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk" Apr 19 12:31:21.912036 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:21.911951 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:31:21.912036 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:21.911969 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:21.912036 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:21.912026 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert podName:e9a634b4-9352-4e62-926c-390e9b19d228 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:25.912009194 +0000 UTC m=+41.149659442 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert") pod "ingress-canary-vlsjm" (UID: "e9a634b4-9352-4e62-926c-390e9b19d228") : secret "canary-serving-cert" not found Apr 19 12:31:21.912196 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:21.912041 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls podName:6d466489-3e13-462c-b90d-4b13b586caae nodeName:}" failed. No retries permitted until 2026-04-19 12:31:25.912034771 +0000 UTC m=+41.149685010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls") pod "dns-default-z7glk" (UID: "6d466489-3e13-462c-b90d-4b13b586caae") : secret "dns-default-metrics-tls" not found Apr 19 12:31:22.465801 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:22.465630 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs"] Apr 19 12:31:22.502656 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:31:22.502616 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod133176e9_8b1c_4614_bf03_e320af8ca89e.slice/crio-46baf8ec5cc25674a9cab6ca22c91a02fada1226864fd5ec77427e944a1c807e WatchSource:0}: Error finding container 46baf8ec5cc25674a9cab6ca22c91a02fada1226864fd5ec77427e944a1c807e: Status 404 returned error can't find the container with id 46baf8ec5cc25674a9cab6ca22c91a02fada1226864fd5ec77427e944a1c807e Apr 19 12:31:22.678904 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:22.678870 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" 
event={"ID":"8c38afde-25f7-4408-bdb5-22a5ea2b4c03","Type":"ContainerStarted","Data":"a4e58630688464618b658e2394df84948569c3f4c2e545f2fdf22c274a9d4b08"} Apr 19 12:31:22.680065 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:22.680043 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" event={"ID":"133176e9-8b1c-4614-bf03-e320af8ca89e","Type":"ContainerStarted","Data":"46baf8ec5cc25674a9cab6ca22c91a02fada1226864fd5ec77427e944a1c807e"} Apr 19 12:31:23.685150 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:23.685100 2583 generic.go:358] "Generic (PLEG): container finished" podID="8c38afde-25f7-4408-bdb5-22a5ea2b4c03" containerID="a4e58630688464618b658e2394df84948569c3f4c2e545f2fdf22c274a9d4b08" exitCode=0 Apr 19 12:31:23.685150 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:23.685149 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" event={"ID":"8c38afde-25f7-4408-bdb5-22a5ea2b4c03","Type":"ContainerDied","Data":"a4e58630688464618b658e2394df84948569c3f4c2e545f2fdf22c274a9d4b08"} Apr 19 12:31:24.690616 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:24.690576 2583 generic.go:358] "Generic (PLEG): container finished" podID="8c38afde-25f7-4408-bdb5-22a5ea2b4c03" containerID="1dc3f93b74d9ea88a2bf88f36a9c22c326c4baa6bd428c777d4c35b549ec9689" exitCode=0 Apr 19 12:31:24.691031 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:24.690629 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" event={"ID":"8c38afde-25f7-4408-bdb5-22a5ea2b4c03","Type":"ContainerDied","Data":"1dc3f93b74d9ea88a2bf88f36a9c22c326c4baa6bd428c777d4c35b549ec9689"} Apr 19 12:31:25.695662 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:25.695623 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" 
event={"ID":"8c38afde-25f7-4408-bdb5-22a5ea2b4c03","Type":"ContainerStarted","Data":"a70ecf8ad1f4f8a244121f8fc2da6b41e1495d6f9be763d7ffe0e6ff79fa0e5b"} Apr 19 12:31:25.696866 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:25.696817 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" event={"ID":"133176e9-8b1c-4614-bf03-e320af8ca89e","Type":"ContainerStarted","Data":"cf81b9195b7ee03ecbbde987bd7234f55fd0d73db7ba52b39ac69fdd23ef6061"} Apr 19 12:31:25.715341 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:25.715305 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tgwkw" podStartSLOduration=4.782855463 podStartE2EDuration="40.715292205s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:30:46.599908029 +0000 UTC m=+1.837558271" lastFinishedPulling="2026-04-19 12:31:22.532344773 +0000 UTC m=+37.769995013" observedRunningTime="2026-04-19 12:31:25.714547225 +0000 UTC m=+40.952197487" watchObservedRunningTime="2026-04-19 12:31:25.715292205 +0000 UTC m=+40.952942462" Apr 19 12:31:25.728618 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:25.728584 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" podStartSLOduration=3.101992132 podStartE2EDuration="5.728572288s" podCreationTimestamp="2026-04-19 12:31:20 +0000 UTC" firstStartedPulling="2026-04-19 12:31:22.510883235 +0000 UTC m=+37.748533478" lastFinishedPulling="2026-04-19 12:31:25.137463382 +0000 UTC m=+40.375113634" observedRunningTime="2026-04-19 12:31:25.728239097 +0000 UTC m=+40.965889359" watchObservedRunningTime="2026-04-19 12:31:25.728572288 +0000 UTC m=+40.966222549" Apr 19 12:31:25.940688 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:25.940663 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm" Apr 19 12:31:25.940875 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:25.940700 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk" Apr 19 12:31:25.940875 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:25.940792 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:25.940875 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:25.940797 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:31:25.940875 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:25.940836 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls podName:6d466489-3e13-462c-b90d-4b13b586caae nodeName:}" failed. No retries permitted until 2026-04-19 12:31:33.940824059 +0000 UTC m=+49.178474298 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls") pod "dns-default-z7glk" (UID: "6d466489-3e13-462c-b90d-4b13b586caae") : secret "dns-default-metrics-tls" not found Apr 19 12:31:25.940875 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:25.940871 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert podName:e9a634b4-9352-4e62-926c-390e9b19d228 nodeName:}" failed. 
No retries permitted until 2026-04-19 12:31:33.940865216 +0000 UTC m=+49.178515455 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert") pod "ingress-canary-vlsjm" (UID: "e9a634b4-9352-4e62-926c-390e9b19d228") : secret "canary-serving-cert" not found Apr 19 12:31:33.994822 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:33.994778 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm" Apr 19 12:31:33.994822 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:33.994828 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk" Apr 19 12:31:33.995381 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:33.994940 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:33.995381 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:33.994942 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:31:33.995381 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:33.994992 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls podName:6d466489-3e13-462c-b90d-4b13b586caae nodeName:}" failed. No retries permitted until 2026-04-19 12:31:49.994977698 +0000 UTC m=+65.232627936 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls") pod "dns-default-z7glk" (UID: "6d466489-3e13-462c-b90d-4b13b586caae") : secret "dns-default-metrics-tls" not found Apr 19 12:31:33.995381 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:33.995007 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert podName:e9a634b4-9352-4e62-926c-390e9b19d228 nodeName:}" failed. No retries permitted until 2026-04-19 12:31:49.994999128 +0000 UTC m=+65.232649367 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert") pod "ingress-canary-vlsjm" (UID: "e9a634b4-9352-4e62-926c-390e9b19d228") : secret "canary-serving-cert" not found Apr 19 12:31:42.669208 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:42.669175 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pvnl8" Apr 19 12:31:49.997036 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:49.997004 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:31:49.997036 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:49.997041 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm" Apr 19 12:31:49.997448 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:49.997062 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk" Apr 19 12:31:49.997448 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:49.997144 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:31:49.997448 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:49.997155 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:31:49.997448 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:49.997187 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls podName:6d466489-3e13-462c-b90d-4b13b586caae nodeName:}" failed. No retries permitted until 2026-04-19 12:32:21.997174148 +0000 UTC m=+97.234824387 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls") pod "dns-default-z7glk" (UID: "6d466489-3e13-462c-b90d-4b13b586caae") : secret "dns-default-metrics-tls" not found Apr 19 12:31:49.997448 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:49.997255 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert podName:e9a634b4-9352-4e62-926c-390e9b19d228 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:21.997237031 +0000 UTC m=+97.234887283 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert") pod "ingress-canary-vlsjm" (UID: "e9a634b4-9352-4e62-926c-390e9b19d228") : secret "canary-serving-cert" not found Apr 19 12:31:49.999135 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:49.999116 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 19 12:31:50.009514 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:50.009498 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 19 12:31:50.021914 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:50.021887 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xstvj\" (UniqueName: \"kubernetes.io/projected/b48a3849-491c-4513-8617-fe3c991aa057-kube-api-access-xstvj\") pod \"network-check-target-v6ph6\" (UID: \"b48a3849-491c-4513-8617-fe3c991aa057\") " pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:31:50.097896 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:50.097868 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:31:50.099797 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:50.099778 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 19 12:31:50.108619 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:31:50.108601 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 19 12:31:50.108668 ip-10-0-140-194 kubenswrapper[2583]: 
E0419 12:31:50.108651 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs podName:fb73e6b2-9f0a-4bcf-9371-0d399622fe97 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:54.108637623 +0000 UTC m=+129.346287862 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs") pod "network-metrics-daemon-lmdfj" (UID: "fb73e6b2-9f0a-4bcf-9371-0d399622fe97") : secret "metrics-daemon-secret" not found Apr 19 12:31:50.241859 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:50.241830 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4sh8l\"" Apr 19 12:31:50.250539 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:50.250492 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:31:50.365478 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:50.365439 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v6ph6"] Apr 19 12:31:50.369547 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:31:50.369520 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb48a3849_491c_4513_8617_fe3c991aa057.slice/crio-61c1d78203dea37f439b912f73cfcf426df75651ddf0d9e124f189625fbf1bd8 WatchSource:0}: Error finding container 61c1d78203dea37f439b912f73cfcf426df75651ddf0d9e124f189625fbf1bd8: Status 404 returned error can't find the container with id 61c1d78203dea37f439b912f73cfcf426df75651ddf0d9e124f189625fbf1bd8 Apr 19 12:31:50.744286 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:50.744256 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-v6ph6" event={"ID":"b48a3849-491c-4513-8617-fe3c991aa057","Type":"ContainerStarted","Data":"61c1d78203dea37f439b912f73cfcf426df75651ddf0d9e124f189625fbf1bd8"} Apr 19 12:31:53.751095 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:53.751065 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-v6ph6" event={"ID":"b48a3849-491c-4513-8617-fe3c991aa057","Type":"ContainerStarted","Data":"ba4a68caf96701aef8fc979d0103b61d5eacacbc2f5adb9490e8debf528fa6b7"} Apr 19 12:31:53.751540 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:53.751201 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:31:53.764415 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:31:53.764372 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-v6ph6" podStartSLOduration=65.868649027 podStartE2EDuration="1m8.764357854s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:31:50.371998754 +0000 UTC m=+65.609648996" lastFinishedPulling="2026-04-19 12:31:53.267707581 +0000 UTC m=+68.505357823" observedRunningTime="2026-04-19 12:31:53.764058483 +0000 UTC m=+69.001708745" watchObservedRunningTime="2026-04-19 12:31:53.764357854 +0000 UTC m=+69.002008116" Apr 19 12:32:22.008205 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:22.008163 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk" Apr 19 12:32:22.008635 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:22.008250 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm" Apr 19 12:32:22.008635 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:22.008330 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:32:22.008635 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:22.008330 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:32:22.008635 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:22.008406 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls podName:6d466489-3e13-462c-b90d-4b13b586caae nodeName:}" failed. No retries permitted until 2026-04-19 12:33:26.008387947 +0000 UTC m=+161.246038186 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls") pod "dns-default-z7glk" (UID: "6d466489-3e13-462c-b90d-4b13b586caae") : secret "dns-default-metrics-tls" not found Apr 19 12:32:22.008635 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:22.008421 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert podName:e9a634b4-9352-4e62-926c-390e9b19d228 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:26.008415341 +0000 UTC m=+161.246065579 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert") pod "ingress-canary-vlsjm" (UID: "e9a634b4-9352-4e62-926c-390e9b19d228") : secret "canary-serving-cert" not found Apr 19 12:32:24.755419 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:24.755392 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-v6ph6" Apr 19 12:32:52.270084 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.270050 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-68bf8db9c6-c69fm"] Apr 19 12:32:52.272773 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.272757 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.274626 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.274599 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 19 12:32:52.274797 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.274777 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 19 12:32:52.274883 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.274796 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 19 12:32:52.274883 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.274870 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 19 12:32:52.275144 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.275128 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 19 12:32:52.275212 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.275135 
2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vgt2w\"" Apr 19 12:32:52.275265 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.275243 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 19 12:32:52.280418 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.280394 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68bf8db9c6-c69fm"] Apr 19 12:32:52.364697 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.364668 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn"] Apr 19 12:32:52.367555 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.367496 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn" Apr 19 12:32:52.369374 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.369354 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:32:52.369495 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.369379 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-454q6\"" Apr 19 12:32:52.369495 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.369385 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 19 12:32:52.375401 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.375379 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn"] Apr 19 12:32:52.405421 ip-10-0-140-194 kubenswrapper[2583]: I0419 
12:32:52.405395 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-default-certificate\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.405516 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.405425 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-stats-auth\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.405516 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.405447 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.405583 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.405532 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.405583 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.405562 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcqkm\" (UniqueName: 
\"kubernetes.io/projected/8f6bf2fb-d79a-44fa-bb49-34879196ab43-kube-api-access-kcqkm\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.471735 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.471712 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4"] Apr 19 12:32:52.474628 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.474615 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.476128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.476109 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-kxff4"] Apr 19 12:32:52.476810 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.476787 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 19 12:32:52.476948 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.476836 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 19 12:32:52.476948 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.476869 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 19 12:32:52.477117 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.477101 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:32:52.477590 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.477576 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-vl9s9\"" Apr 19 
12:32:52.478762 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.478745 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b9cb554c4-2cbcc"] Apr 19 12:32:52.478913 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.478899 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.480889 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.480866 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-6wskm\"" Apr 19 12:32:52.480981 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.480919 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 19 12:32:52.481185 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.481168 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 19 12:32:52.481238 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.481208 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 19 12:32:52.481291 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.481250 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 19 12:32:52.483656 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.483635 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.486401 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.486321 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 19 12:32:52.486401 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.486345 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 19 12:32:52.486566 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.486442 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 19 12:32:52.487267 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.487220 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v66mr\"" Apr 19 12:32:52.487470 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.487440 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4"] Apr 19 12:32:52.489119 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.489103 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 19 12:32:52.491469 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.491450 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-kxff4"] Apr 19 12:32:52.493082 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.493062 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 19 12:32:52.493716 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.493698 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b9cb554c4-2cbcc"] Apr 
19 12:32:52.506124 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.506103 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-default-certificate\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.506209 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.506130 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-stats-auth\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.506209 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.506152 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.506209 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.506194 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxw5x\" (UniqueName: \"kubernetes.io/projected/a7a09300-5b01-42d1-9c9e-64749c7103a2-kube-api-access-qxw5x\") pod \"volume-data-source-validator-7c6cbb6c87-glrsn\" (UID: \"a7a09300-5b01-42d1-9c9e-64749c7103a2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn" Apr 19 12:32:52.506330 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.506235 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.506330 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:52.506262 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:32:52.506421 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:52.506331 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:53.006310911 +0000 UTC m=+128.243961153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : secret "router-metrics-certs-default" not found Apr 19 12:32:52.506421 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.506263 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcqkm\" (UniqueName: \"kubernetes.io/projected/8f6bf2fb-d79a-44fa-bb49-34879196ab43-kube-api-access-kcqkm\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.506421 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:52.506350 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:53.00633953 +0000 UTC m=+128.243989774 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : configmap references non-existent config key: service-ca.crt Apr 19 12:32:52.508611 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.508582 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-stats-auth\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.508694 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.508641 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-default-certificate\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.521276 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.521220 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcqkm\" (UniqueName: \"kubernetes.io/projected/8f6bf2fb-d79a-44fa-bb49-34879196ab43-kube-api-access-kcqkm\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:52.607454 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607427 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7f8c09c-afa9-4393-b47b-0e8efface148-tmp\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 
12:32:52.607571 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607458 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7f8c09c-afa9-4393-b47b-0e8efface148-serving-cert\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.607571 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607476 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh2vz\" (UniqueName: \"kubernetes.io/projected/6788f440-7533-43d4-acaf-4fac75b17707-kube-api-access-jh2vz\") pod \"service-ca-operator-d6fc45fc5-lkhb4\" (UID: \"6788f440-7533-43d4-acaf-4fac75b17707\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.607571 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607494 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a7f8c09c-afa9-4393-b47b-0e8efface148-snapshots\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.607571 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607540 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxw5x\" (UniqueName: \"kubernetes.io/projected/a7a09300-5b01-42d1-9c9e-64749c7103a2-kube-api-access-qxw5x\") pod \"volume-data-source-validator-7c6cbb6c87-glrsn\" (UID: \"a7a09300-5b01-42d1-9c9e-64749c7103a2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn" Apr 19 12:32:52.607571 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607567 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-8hm2n\" (UniqueName: \"kubernetes.io/projected/a7f8c09c-afa9-4393-b47b-0e8efface148-kube-api-access-8hm2n\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.607730 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607609 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.607730 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607625 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-trusted-ca\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.607730 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607647 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-installation-pull-secrets\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.607730 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607676 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-image-registry-private-configuration\") pod 
\"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.607730 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607715 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b81e1107-0f07-4af8-8e14-097b73f57b5f-ca-trust-extracted\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.607905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607747 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-certificates\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.607905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607763 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7f8c09c-afa9-4393-b47b-0e8efface148-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.607905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607800 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-bound-sa-token\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.607905 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:32:52.607826 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6788f440-7533-43d4-acaf-4fac75b17707-config\") pod \"service-ca-operator-d6fc45fc5-lkhb4\" (UID: \"6788f440-7533-43d4-acaf-4fac75b17707\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.607905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607875 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dh42\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-kube-api-access-6dh42\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.607905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607894 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7f8c09c-afa9-4393-b47b-0e8efface148-service-ca-bundle\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.608116 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.607913 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6788f440-7533-43d4-acaf-4fac75b17707-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lkhb4\" (UID: \"6788f440-7533-43d4-acaf-4fac75b17707\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.613797 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.613780 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxw5x\" (UniqueName: 
\"kubernetes.io/projected/a7a09300-5b01-42d1-9c9e-64749c7103a2-kube-api-access-qxw5x\") pod \"volume-data-source-validator-7c6cbb6c87-glrsn\" (UID: \"a7a09300-5b01-42d1-9c9e-64749c7103a2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn" Apr 19 12:32:52.676058 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.676037 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn" Apr 19 12:32:52.709071 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709036 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh2vz\" (UniqueName: \"kubernetes.io/projected/6788f440-7533-43d4-acaf-4fac75b17707-kube-api-access-jh2vz\") pod \"service-ca-operator-d6fc45fc5-lkhb4\" (UID: \"6788f440-7533-43d4-acaf-4fac75b17707\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.709071 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709069 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a7f8c09c-afa9-4393-b47b-0e8efface148-snapshots\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.709255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709093 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hm2n\" (UniqueName: \"kubernetes.io/projected/a7f8c09c-afa9-4393-b47b-0e8efface148-kube-api-access-8hm2n\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.709255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709133 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.709255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709155 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-trusted-ca\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.709255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709183 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-installation-pull-secrets\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.709255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709206 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-image-registry-private-configuration\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.709493 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:52.709263 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:32:52.709493 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709276 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/b81e1107-0f07-4af8-8e14-097b73f57b5f-ca-trust-extracted\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.709493 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709345 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-certificates\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.709493 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:52.709284 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b9cb554c4-2cbcc: secret "image-registry-tls" not found Apr 19 12:32:52.709493 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709377 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7f8c09c-afa9-4393-b47b-0e8efface148-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.709493 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709409 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-bound-sa-token\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.709493 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:52.709443 2583 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls podName:b81e1107-0f07-4af8-8e14-097b73f57b5f nodeName:}" failed. No retries permitted until 2026-04-19 12:32:53.209416063 +0000 UTC m=+128.447066306 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls") pod "image-registry-b9cb554c4-2cbcc" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f") : secret "image-registry-tls" not found Apr 19 12:32:52.709868 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709495 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6788f440-7533-43d4-acaf-4fac75b17707-config\") pod \"service-ca-operator-d6fc45fc5-lkhb4\" (UID: \"6788f440-7533-43d4-acaf-4fac75b17707\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.709868 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709554 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dh42\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-kube-api-access-6dh42\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.709868 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709582 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7f8c09c-afa9-4393-b47b-0e8efface148-service-ca-bundle\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.709868 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709614 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6788f440-7533-43d4-acaf-4fac75b17707-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lkhb4\" (UID: \"6788f440-7533-43d4-acaf-4fac75b17707\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.709868 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709641 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7f8c09c-afa9-4393-b47b-0e8efface148-tmp\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.709868 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709673 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7f8c09c-afa9-4393-b47b-0e8efface148-serving-cert\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.709868 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.709751 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a7f8c09c-afa9-4393-b47b-0e8efface148-snapshots\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.710258 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.710235 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-certificates\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.710313 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:32:52.710261 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6788f440-7533-43d4-acaf-4fac75b17707-config\") pod \"service-ca-operator-d6fc45fc5-lkhb4\" (UID: \"6788f440-7533-43d4-acaf-4fac75b17707\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.710436 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.710416 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7f8c09c-afa9-4393-b47b-0e8efface148-service-ca-bundle\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.710683 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.710627 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7f8c09c-afa9-4393-b47b-0e8efface148-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.710683 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.710664 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7f8c09c-afa9-4393-b47b-0e8efface148-tmp\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.710683 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.710673 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b81e1107-0f07-4af8-8e14-097b73f57b5f-ca-trust-extracted\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " 
pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.711743 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.711700 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-trusted-ca\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.712371 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.712325 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7f8c09c-afa9-4393-b47b-0e8efface148-serving-cert\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.713027 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.712798 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6788f440-7533-43d4-acaf-4fac75b17707-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lkhb4\" (UID: \"6788f440-7533-43d4-acaf-4fac75b17707\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.713027 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.712900 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-image-registry-private-configuration\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.713201 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.713179 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-installation-pull-secrets\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.719950 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.719930 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-bound-sa-token\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.720110 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.720092 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dh42\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-kube-api-access-6dh42\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:52.720406 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.720389 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh2vz\" (UniqueName: \"kubernetes.io/projected/6788f440-7533-43d4-acaf-4fac75b17707-kube-api-access-jh2vz\") pod \"service-ca-operator-d6fc45fc5-lkhb4\" (UID: \"6788f440-7533-43d4-acaf-4fac75b17707\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.720447 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.720416 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hm2n\" (UniqueName: \"kubernetes.io/projected/a7f8c09c-afa9-4393-b47b-0e8efface148-kube-api-access-8hm2n\") pod \"insights-operator-585dfdc468-kxff4\" (UID: \"a7f8c09c-afa9-4393-b47b-0e8efface148\") " pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.787018 
ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.786949 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" Apr 19 12:32:52.792101 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.792075 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn"] Apr 19 12:32:52.795410 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.795377 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-kxff4" Apr 19 12:32:52.795773 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:32:52.795741 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7a09300_5b01_42d1_9c9e_64749c7103a2.slice/crio-5400deccc291a0e542fddbd7c22ea27ede41d90cbcaeb7db9df97df7a671fd48 WatchSource:0}: Error finding container 5400deccc291a0e542fddbd7c22ea27ede41d90cbcaeb7db9df97df7a671fd48: Status 404 returned error can't find the container with id 5400deccc291a0e542fddbd7c22ea27ede41d90cbcaeb7db9df97df7a671fd48 Apr 19 12:32:52.859463 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.859431 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn" event={"ID":"a7a09300-5b01-42d1-9c9e-64749c7103a2","Type":"ContainerStarted","Data":"5400deccc291a0e542fddbd7c22ea27ede41d90cbcaeb7db9df97df7a671fd48"} Apr 19 12:32:52.907323 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.907257 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4"] Apr 19 12:32:52.910484 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:32:52.910459 2583 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6788f440_7533_43d4_acaf_4fac75b17707.slice/crio-ec360bf2679b1c521fafc5b913878a2e4dc810415ca4a9b524c709a0fd1156e7 WatchSource:0}: Error finding container ec360bf2679b1c521fafc5b913878a2e4dc810415ca4a9b524c709a0fd1156e7: Status 404 returned error can't find the container with id ec360bf2679b1c521fafc5b913878a2e4dc810415ca4a9b524c709a0fd1156e7 Apr 19 12:32:52.923565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:52.923536 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-kxff4"] Apr 19 12:32:52.927035 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:32:52.927010 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f8c09c_afa9_4393_b47b_0e8efface148.slice/crio-fbf24059b340d7565d012bf451097027b60a046d7edd57428add1808ee8ad6af WatchSource:0}: Error finding container fbf24059b340d7565d012bf451097027b60a046d7edd57428add1808ee8ad6af: Status 404 returned error can't find the container with id fbf24059b340d7565d012bf451097027b60a046d7edd57428add1808ee8ad6af Apr 19 12:32:53.012150 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:53.012125 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:53.012249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:53.012176 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " 
pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:53.012295 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:53.012250 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:32:53.012332 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:53.012289 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:54.012267158 +0000 UTC m=+129.249917400 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : configmap references non-existent config key: service-ca.crt Apr 19 12:32:53.012332 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:53.012312 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:54.012304821 +0000 UTC m=+129.249955059 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : secret "router-metrics-certs-default" not found Apr 19 12:32:53.214214 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:53.214190 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:53.214355 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:53.214336 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:32:53.214393 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:53.214357 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b9cb554c4-2cbcc: secret "image-registry-tls" not found Apr 19 12:32:53.214423 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:53.214406 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls podName:b81e1107-0f07-4af8-8e14-097b73f57b5f nodeName:}" failed. No retries permitted until 2026-04-19 12:32:54.214391991 +0000 UTC m=+129.452042234 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls") pod "image-registry-b9cb554c4-2cbcc" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f") : secret "image-registry-tls" not found Apr 19 12:32:53.864236 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:53.864167 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" event={"ID":"6788f440-7533-43d4-acaf-4fac75b17707","Type":"ContainerStarted","Data":"ec360bf2679b1c521fafc5b913878a2e4dc810415ca4a9b524c709a0fd1156e7"} Apr 19 12:32:53.866189 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:53.866104 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kxff4" event={"ID":"a7f8c09c-afa9-4393-b47b-0e8efface148","Type":"ContainerStarted","Data":"fbf24059b340d7565d012bf451097027b60a046d7edd57428add1808ee8ad6af"} Apr 19 12:32:54.020630 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:54.020549 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:54.020820 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:54.020655 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:54.020820 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:54.020759 2583 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:56.020734281 +0000 UTC m=+131.258384534 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : configmap references non-existent config key: service-ca.crt Apr 19 12:32:54.020820 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:54.020809 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:32:54.021048 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:54.020976 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:32:56.02090455 +0000 UTC m=+131.258554791 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : secret "router-metrics-certs-default" not found Apr 19 12:32:54.121325 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:54.121235 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:32:54.121498 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:54.121415 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 19 12:32:54.121603 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:54.121509 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs podName:fb73e6b2-9f0a-4bcf-9371-0d399622fe97 nodeName:}" failed. No retries permitted until 2026-04-19 12:34:56.121486339 +0000 UTC m=+251.359136583 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs") pod "network-metrics-daemon-lmdfj" (UID: "fb73e6b2-9f0a-4bcf-9371-0d399622fe97") : secret "metrics-daemon-secret" not found Apr 19 12:32:54.222215 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:54.222171 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:54.222446 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:54.222431 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:32:54.222446 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:54.222447 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b9cb554c4-2cbcc: secret "image-registry-tls" not found Apr 19 12:32:54.222577 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:54.222548 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls podName:b81e1107-0f07-4af8-8e14-097b73f57b5f nodeName:}" failed. No retries permitted until 2026-04-19 12:32:56.222528656 +0000 UTC m=+131.460178896 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls") pod "image-registry-b9cb554c4-2cbcc" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f") : secret "image-registry-tls" not found Apr 19 12:32:54.870166 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:54.870127 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn" event={"ID":"a7a09300-5b01-42d1-9c9e-64749c7103a2","Type":"ContainerStarted","Data":"4efa2f992bdc87c11b1f7a8c2d23445ce2030f15afa9be35cc885134d52d6059"} Apr 19 12:32:54.885563 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:54.885509 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-glrsn" podStartSLOduration=1.364060536 podStartE2EDuration="2.88549536s" podCreationTimestamp="2026-04-19 12:32:52 +0000 UTC" firstStartedPulling="2026-04-19 12:32:52.799017 +0000 UTC m=+128.036667243" lastFinishedPulling="2026-04-19 12:32:54.320451814 +0000 UTC m=+129.558102067" observedRunningTime="2026-04-19 12:32:54.884841335 +0000 UTC m=+130.122491596" watchObservedRunningTime="2026-04-19 12:32:54.88549536 +0000 UTC m=+130.123145620" Apr 19 12:32:55.873750 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:55.873712 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kxff4" event={"ID":"a7f8c09c-afa9-4393-b47b-0e8efface148","Type":"ContainerStarted","Data":"9341f00c74f958dea3c4f8b05162674894767c379962613c74b1a86d7d5e2085"} Apr 19 12:32:55.875750 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:55.875715 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" 
event={"ID":"6788f440-7533-43d4-acaf-4fac75b17707","Type":"ContainerStarted","Data":"c78d0f6e69d3d5c804ba01f2633b8eccf2714fc273e9d8eb20eefbcbe576267d"} Apr 19 12:32:55.888063 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:55.888019 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-kxff4" podStartSLOduration=1.400098547 podStartE2EDuration="3.888006447s" podCreationTimestamp="2026-04-19 12:32:52 +0000 UTC" firstStartedPulling="2026-04-19 12:32:52.928607583 +0000 UTC m=+128.166257825" lastFinishedPulling="2026-04-19 12:32:55.416515484 +0000 UTC m=+130.654165725" observedRunningTime="2026-04-19 12:32:55.887215125 +0000 UTC m=+131.124865388" watchObservedRunningTime="2026-04-19 12:32:55.888006447 +0000 UTC m=+131.125656740" Apr 19 12:32:55.903761 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:55.903711 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" podStartSLOduration=1.396446036 podStartE2EDuration="3.903695506s" podCreationTimestamp="2026-04-19 12:32:52 +0000 UTC" firstStartedPulling="2026-04-19 12:32:52.912314619 +0000 UTC m=+128.149964859" lastFinishedPulling="2026-04-19 12:32:55.419564091 +0000 UTC m=+130.657214329" observedRunningTime="2026-04-19 12:32:55.902820359 +0000 UTC m=+131.140470618" watchObservedRunningTime="2026-04-19 12:32:55.903695506 +0000 UTC m=+131.141345768" Apr 19 12:32:56.039002 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:56.038959 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:56.039208 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:56.039063 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm" Apr 19 12:32:56.039208 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:56.039161 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:00.039137235 +0000 UTC m=+135.276787477 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : configmap references non-existent config key: service-ca.crt Apr 19 12:32:56.039423 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:56.039231 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:32:56.039423 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:56.039296 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:00.039279516 +0000 UTC m=+135.276929762 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : secret "router-metrics-certs-default" not found Apr 19 12:32:56.241444 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:56.241410 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:32:56.241607 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:56.241588 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:32:56.241661 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:56.241609 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b9cb554c4-2cbcc: secret "image-registry-tls" not found Apr 19 12:32:56.241701 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:32:56.241663 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls podName:b81e1107-0f07-4af8-8e14-097b73f57b5f nodeName:}" failed. No retries permitted until 2026-04-19 12:33:00.241648367 +0000 UTC m=+135.479298606 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls") pod "image-registry-b9cb554c4-2cbcc" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f") : secret "image-registry-tls" not found Apr 19 12:32:57.087604 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.087567 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg"] Apr 19 12:32:57.091027 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.091008 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg" Apr 19 12:32:57.092864 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.092826 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 19 12:32:57.092954 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.092902 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 19 12:32:57.093266 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.093252 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-xff9w\"" Apr 19 12:32:57.099036 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.099014 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg"] Apr 19 12:32:57.248436 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.248397 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn4mz\" (UniqueName: \"kubernetes.io/projected/917f4be8-d4ee-4f39-b77b-59c9da8491e2-kube-api-access-vn4mz\") pod \"migrator-74bb7799d9-wgbsg\" (UID: 
\"917f4be8-d4ee-4f39-b77b-59c9da8491e2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg"
Apr 19 12:32:57.349011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.348930 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn4mz\" (UniqueName: \"kubernetes.io/projected/917f4be8-d4ee-4f39-b77b-59c9da8491e2-kube-api-access-vn4mz\") pod \"migrator-74bb7799d9-wgbsg\" (UID: \"917f4be8-d4ee-4f39-b77b-59c9da8491e2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg"
Apr 19 12:32:57.356262 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.356235 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn4mz\" (UniqueName: \"kubernetes.io/projected/917f4be8-d4ee-4f39-b77b-59c9da8491e2-kube-api-access-vn4mz\") pod \"migrator-74bb7799d9-wgbsg\" (UID: \"917f4be8-d4ee-4f39-b77b-59c9da8491e2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg"
Apr 19 12:32:57.400438 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.400408 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg"
Apr 19 12:32:57.512493 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.512457 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg"]
Apr 19 12:32:57.515241 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:32:57.515213 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod917f4be8_d4ee_4f39_b77b_59c9da8491e2.slice/crio-f44785e96f011a4f3b9b6a682fcaf5cacd27aa604e09239aec022f66ba0fd701 WatchSource:0}: Error finding container f44785e96f011a4f3b9b6a682fcaf5cacd27aa604e09239aec022f66ba0fd701: Status 404 returned error can't find the container with id f44785e96f011a4f3b9b6a682fcaf5cacd27aa604e09239aec022f66ba0fd701
Apr 19 12:32:57.880412 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:57.880376 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg" event={"ID":"917f4be8-d4ee-4f39-b77b-59c9da8491e2","Type":"ContainerStarted","Data":"f44785e96f011a4f3b9b6a682fcaf5cacd27aa604e09239aec022f66ba0fd701"}
Apr 19 12:32:58.884997 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:58.884963 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg" event={"ID":"917f4be8-d4ee-4f39-b77b-59c9da8491e2","Type":"ContainerStarted","Data":"250fdf7d748f1b6108c709c71c80748946915663f42738dd2db98c4ed84d2d53"}
Apr 19 12:32:59.004796 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:59.004768 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w56w7_5d31c3f1-1682-400f-9db4-ef1c50b1f94d/dns-node-resolver/0.log"
Apr 19 12:32:59.888440 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:59.888406 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg" event={"ID":"917f4be8-d4ee-4f39-b77b-59c9da8491e2","Type":"ContainerStarted","Data":"bb828d59c89f30b02095784a9d62d652585483a4e6aa5c56ce68d7a149ba0ae5"}
Apr 19 12:32:59.902435 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:32:59.902386 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wgbsg" podStartSLOduration=1.6256603630000002 podStartE2EDuration="2.902369201s" podCreationTimestamp="2026-04-19 12:32:57 +0000 UTC" firstStartedPulling="2026-04-19 12:32:57.517274738 +0000 UTC m=+132.754924977" lastFinishedPulling="2026-04-19 12:32:58.793983573 +0000 UTC m=+134.031633815" observedRunningTime="2026-04-19 12:32:59.901618274 +0000 UTC m=+135.139268536" watchObservedRunningTime="2026-04-19 12:32:59.902369201 +0000 UTC m=+135.140019462"
Apr 19 12:33:00.073527 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:00.073488 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:00.073678 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:00.073572 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:00.073719 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:00.073692 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:08.07367468 +0000 UTC m=+143.311324923 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : configmap references non-existent config key: service-ca.crt
Apr 19 12:33:00.073760 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:00.073718 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 19 12:33:00.073802 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:00.073791 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:08.07377704 +0000 UTC m=+143.311427412 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : secret "router-metrics-certs-default" not found
Apr 19 12:33:00.205422 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:00.205397 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wtzrr_6ad204b3-eb17-4f25-b1ef-6950791a05cd/node-ca/0.log"
Apr 19 12:33:00.276075 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:00.276037 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc"
Apr 19 12:33:00.276281 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:00.276187 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:33:00.276281 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:00.276204 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b9cb554c4-2cbcc: secret "image-registry-tls" not found
Apr 19 12:33:00.276281 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:00.276267 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls podName:b81e1107-0f07-4af8-8e14-097b73f57b5f nodeName:}" failed. No retries permitted until 2026-04-19 12:33:08.276251533 +0000 UTC m=+143.513901771 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls") pod "image-registry-b9cb554c4-2cbcc" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f") : secret "image-registry-tls" not found
Apr 19 12:33:08.137033 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.137002 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:08.137447 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.137056 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:08.137447 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:08.137207 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle podName:8f6bf2fb-d79a-44fa-bb49-34879196ab43 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:24.137180717 +0000 UTC m=+159.374830974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle") pod "router-default-68bf8db9c6-c69fm" (UID: "8f6bf2fb-d79a-44fa-bb49-34879196ab43") : configmap references non-existent config key: service-ca.crt
Apr 19 12:33:08.139471 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.139445 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f6bf2fb-d79a-44fa-bb49-34879196ab43-metrics-certs\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:08.339104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.339067 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc"
Apr 19 12:33:08.341547 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.341518 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") pod \"image-registry-b9cb554c4-2cbcc\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc"
Apr 19 12:33:08.401120 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.401043 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc"
Apr 19 12:33:08.519324 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.519296 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b9cb554c4-2cbcc"]
Apr 19 12:33:08.522658 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:33:08.522627 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb81e1107_0f07_4af8_8e14_097b73f57b5f.slice/crio-9ac6f665a9c81a76a41c6e7bad90d617c40531774be2e373dcea72c85b4143af WatchSource:0}: Error finding container 9ac6f665a9c81a76a41c6e7bad90d617c40531774be2e373dcea72c85b4143af: Status 404 returned error can't find the container with id 9ac6f665a9c81a76a41c6e7bad90d617c40531774be2e373dcea72c85b4143af
Apr 19 12:33:08.908103 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.908067 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" event={"ID":"b81e1107-0f07-4af8-8e14-097b73f57b5f","Type":"ContainerStarted","Data":"caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0"}
Apr 19 12:33:08.908103 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.908110 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" event={"ID":"b81e1107-0f07-4af8-8e14-097b73f57b5f","Type":"ContainerStarted","Data":"9ac6f665a9c81a76a41c6e7bad90d617c40531774be2e373dcea72c85b4143af"}
Apr 19 12:33:08.908314 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.908142 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc"
Apr 19 12:33:08.924663 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:08.924616 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" podStartSLOduration=16.924601143 podStartE2EDuration="16.924601143s" podCreationTimestamp="2026-04-19 12:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:33:08.923463172 +0000 UTC m=+144.161113446" watchObservedRunningTime="2026-04-19 12:33:08.924601143 +0000 UTC m=+144.162251440"
Apr 19 12:33:21.157500 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:21.157468 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-z7glk" podUID="6d466489-3e13-462c-b90d-4b13b586caae"
Apr 19 12:33:21.177633 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:21.177604 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vlsjm" podUID="e9a634b4-9352-4e62-926c-390e9b19d228"
Apr 19 12:33:21.498113 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.498085 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wqsqg"]
Apr 19 12:33:21.503182 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.503161 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.505052 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.505025 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 19 12:33:21.505417 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.505397 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8lgzb\""
Apr 19 12:33:21.505503 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.505451 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 19 12:33:21.512608 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.512586 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wqsqg"]
Apr 19 12:33:21.513361 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.513342 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b9cb554c4-2cbcc"]
Apr 19 12:33:21.639370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.639339 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b159597a-1990-49de-af45-edc96be184cd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.639551 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.639392 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b159597a-1990-49de-af45-edc96be184cd-data-volume\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.639551 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.639445 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b159597a-1990-49de-af45-edc96be184cd-crio-socket\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.639551 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.639472 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b159597a-1990-49de-af45-edc96be184cd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.639551 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.639495 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62hfz\" (UniqueName: \"kubernetes.io/projected/b159597a-1990-49de-af45-edc96be184cd-kube-api-access-62hfz\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.740460 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.740425 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b159597a-1990-49de-af45-edc96be184cd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.740460 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.740460 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62hfz\" (UniqueName: \"kubernetes.io/projected/b159597a-1990-49de-af45-edc96be184cd-kube-api-access-62hfz\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.740676 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.740588 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b159597a-1990-49de-af45-edc96be184cd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.740676 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.740639 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b159597a-1990-49de-af45-edc96be184cd-data-volume\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.740751 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.740688 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b159597a-1990-49de-af45-edc96be184cd-crio-socket\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.740804 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.740770 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b159597a-1990-49de-af45-edc96be184cd-crio-socket\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.741007 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.740990 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b159597a-1990-49de-af45-edc96be184cd-data-volume\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.741122 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.741105 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b159597a-1990-49de-af45-edc96be184cd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.742909 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.742889 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b159597a-1990-49de-af45-edc96be184cd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.750425 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.750374 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62hfz\" (UniqueName: \"kubernetes.io/projected/b159597a-1990-49de-af45-edc96be184cd-kube-api-access-62hfz\") pod \"insights-runtime-extractor-wqsqg\" (UID: \"b159597a-1990-49de-af45-edc96be184cd\") " pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.813300 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.813273 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wqsqg"
Apr 19 12:33:21.931515 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.931484 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wqsqg"]
Apr 19 12:33:21.935292 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:33:21.935255 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb159597a_1990_49de_af45_edc96be184cd.slice/crio-c276b54f5f91ea3f25a0b1351d13a0e06429ad5a7ed9144a72502f0cc4b8b424 WatchSource:0}: Error finding container c276b54f5f91ea3f25a0b1351d13a0e06429ad5a7ed9144a72502f0cc4b8b424: Status 404 returned error can't find the container with id c276b54f5f91ea3f25a0b1351d13a0e06429ad5a7ed9144a72502f0cc4b8b424
Apr 19 12:33:21.936761 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:21.936743 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z7glk"
Apr 19 12:33:22.345387 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:22.345332 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-lmdfj" podUID="fb73e6b2-9f0a-4bcf-9371-0d399622fe97"
Apr 19 12:33:22.940255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:22.940221 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wqsqg" event={"ID":"b159597a-1990-49de-af45-edc96be184cd","Type":"ContainerStarted","Data":"c8bce966418cfb560ebea8bcb8ff02cbecba201148ca6448871ad983bd7dffff"}
Apr 19 12:33:22.940255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:22.940260 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wqsqg" event={"ID":"b159597a-1990-49de-af45-edc96be184cd","Type":"ContainerStarted","Data":"79d1c0f98ace78e88c3b0e5aab1ade59d27096b89da1a16f15489e15442b8cb9"}
Apr 19 12:33:22.940424 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:22.940270 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wqsqg" event={"ID":"b159597a-1990-49de-af45-edc96be184cd","Type":"ContainerStarted","Data":"c276b54f5f91ea3f25a0b1351d13a0e06429ad5a7ed9144a72502f0cc4b8b424"}
Apr 19 12:33:24.158540 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:24.158505 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:24.159249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:24.159222 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6bf2fb-d79a-44fa-bb49-34879196ab43-service-ca-bundle\") pod \"router-default-68bf8db9c6-c69fm\" (UID: \"8f6bf2fb-d79a-44fa-bb49-34879196ab43\") " pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:24.382012 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:24.381990 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:24.496897 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:24.496868 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68bf8db9c6-c69fm"]
Apr 19 12:33:24.499945 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:33:24.499919 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6bf2fb_d79a_44fa_bb49_34879196ab43.slice/crio-c6b840d08031852a552f9e27b4bb17bc7f8624161cb1972ea1ea10d3171b37e2 WatchSource:0}: Error finding container c6b840d08031852a552f9e27b4bb17bc7f8624161cb1972ea1ea10d3171b37e2: Status 404 returned error can't find the container with id c6b840d08031852a552f9e27b4bb17bc7f8624161cb1972ea1ea10d3171b37e2
Apr 19 12:33:24.946812 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:24.946781 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wqsqg" event={"ID":"b159597a-1990-49de-af45-edc96be184cd","Type":"ContainerStarted","Data":"24af340aab3cb89026195db43d64e6621d1ffc9f5fd3ba1b1a2597862d6ec8a8"}
Apr 19 12:33:24.948127 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:24.948100 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68bf8db9c6-c69fm" event={"ID":"8f6bf2fb-d79a-44fa-bb49-34879196ab43","Type":"ContainerStarted","Data":"02df121665f642b02b972b57df8619da45c50148b7d54ee81be0237d2ee1db08"}
Apr 19 12:33:24.948127 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:24.948128 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68bf8db9c6-c69fm" event={"ID":"8f6bf2fb-d79a-44fa-bb49-34879196ab43","Type":"ContainerStarted","Data":"c6b840d08031852a552f9e27b4bb17bc7f8624161cb1972ea1ea10d3171b37e2"}
Apr 19 12:33:24.962761 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:24.962721 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wqsqg" podStartSLOduration=1.6386393670000001 podStartE2EDuration="3.96270916s" podCreationTimestamp="2026-04-19 12:33:21 +0000 UTC" firstStartedPulling="2026-04-19 12:33:21.996725572 +0000 UTC m=+157.234375814" lastFinishedPulling="2026-04-19 12:33:24.320795365 +0000 UTC m=+159.558445607" observedRunningTime="2026-04-19 12:33:24.961490058 +0000 UTC m=+160.199140318" watchObservedRunningTime="2026-04-19 12:33:24.96270916 +0000 UTC m=+160.200359399"
Apr 19 12:33:24.977738 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:24.977699 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-68bf8db9c6-c69fm" podStartSLOduration=32.977687812 podStartE2EDuration="32.977687812s" podCreationTimestamp="2026-04-19 12:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:33:24.976819631 +0000 UTC m=+160.214469894" watchObservedRunningTime="2026-04-19 12:33:24.977687812 +0000 UTC m=+160.215338106"
Apr 19 12:33:25.382356 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:25.382323 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:25.385185 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:25.385164 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:25.952008 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:25.951977 2583 generic.go:358] "Generic (PLEG): container finished" podID="133176e9-8b1c-4614-bf03-e320af8ca89e" containerID="cf81b9195b7ee03ecbbde987bd7234f55fd0d73db7ba52b39ac69fdd23ef6061" exitCode=255
Apr 19 12:33:25.952170 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:25.952051 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" event={"ID":"133176e9-8b1c-4614-bf03-e320af8ca89e","Type":"ContainerDied","Data":"cf81b9195b7ee03ecbbde987bd7234f55fd0d73db7ba52b39ac69fdd23ef6061"}
Apr 19 12:33:25.952392 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:25.952371 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:25.953440 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:25.953419 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-68bf8db9c6-c69fm"
Apr 19 12:33:25.958295 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:25.958281 2583 scope.go:117] "RemoveContainer" containerID="cf81b9195b7ee03ecbbde987bd7234f55fd0d73db7ba52b39ac69fdd23ef6061"
Apr 19 12:33:26.073189 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:26.073153 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:33:26.073189 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:26.073195 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:33:26.075682 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:26.075652 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d466489-3e13-462c-b90d-4b13b586caae-metrics-tls\") pod \"dns-default-z7glk\" (UID: \"6d466489-3e13-462c-b90d-4b13b586caae\") " pod="openshift-dns/dns-default-z7glk"
Apr 19 12:33:26.075925 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:26.075902 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9a634b4-9352-4e62-926c-390e9b19d228-cert\") pod \"ingress-canary-vlsjm\" (UID: \"e9a634b4-9352-4e62-926c-390e9b19d228\") " pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:33:26.139237 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:26.139209 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-62vx7\""
Apr 19 12:33:26.147836 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:26.147812 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z7glk"
Apr 19 12:33:26.265434 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:26.265402 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z7glk"]
Apr 19 12:33:26.268581 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:33:26.268555 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d466489_3e13_462c_b90d_4b13b586caae.slice/crio-ee769aa72aa3f97fa383b64198eb9b6bbf482b277f30c0ffb994c11896cf24ea WatchSource:0}: Error finding container ee769aa72aa3f97fa383b64198eb9b6bbf482b277f30c0ffb994c11896cf24ea: Status 404 returned error can't find the container with id ee769aa72aa3f97fa383b64198eb9b6bbf482b277f30c0ffb994c11896cf24ea
Apr 19 12:33:26.956269 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:26.956227 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bfb848955-zdmbs" event={"ID":"133176e9-8b1c-4614-bf03-e320af8ca89e","Type":"ContainerStarted","Data":"8e4d6cc38f02ec913bbee714ed0b25731479d02c760758f6713387aa8403bee7"}
Apr 19 12:33:26.957559 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:26.957531 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z7glk" event={"ID":"6d466489-3e13-462c-b90d-4b13b586caae","Type":"ContainerStarted","Data":"ee769aa72aa3f97fa383b64198eb9b6bbf482b277f30c0ffb994c11896cf24ea"}
Apr 19 12:33:27.961397 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:27.961364 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z7glk" event={"ID":"6d466489-3e13-462c-b90d-4b13b586caae","Type":"ContainerStarted","Data":"973b06d3f1104cd4a793cad9a1bdd209dda20682f22a51cf8ee08d850a6f5643"}
Apr 19 12:33:27.961797 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:27.961403 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z7glk" event={"ID":"6d466489-3e13-462c-b90d-4b13b586caae","Type":"ContainerStarted","Data":"9ce64014d6674b7850f617b82398a206fa756247cb72d127d09fec62c0032fee"}
Apr 19 12:33:27.976713 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:27.976670 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z7glk" podStartSLOduration=128.815873355 podStartE2EDuration="2m9.976657695s" podCreationTimestamp="2026-04-19 12:31:18 +0000 UTC" firstStartedPulling="2026-04-19 12:33:26.270316997 +0000 UTC m=+161.507967237" lastFinishedPulling="2026-04-19 12:33:27.431101337 +0000 UTC m=+162.668751577" observedRunningTime="2026-04-19 12:33:27.976282612 +0000 UTC m=+163.213932874" watchObservedRunningTime="2026-04-19 12:33:27.976657695 +0000 UTC m=+163.214307934"
Apr 19 12:33:28.964411 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:28.964343 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-z7glk"
Apr 19 12:33:31.518705 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:31.518677 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc"
Apr 19 12:33:34.323828 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:34.323793 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:33:36.323650 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:36.323620 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:33:36.325495 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:36.325479 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xbkh5\""
Apr 19 12:33:36.334376 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:36.334357 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vlsjm"
Apr 19 12:33:36.447634 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:36.447607 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vlsjm"]
Apr 19 12:33:36.450287 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:33:36.450262 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9a634b4_9352_4e62_926c_390e9b19d228.slice/crio-2b3ba29741069feb6b1a3e48a66747c2c16381f67c3c96c7b5c7fd47935c38e4 WatchSource:0}: Error finding container 2b3ba29741069feb6b1a3e48a66747c2c16381f67c3c96c7b5c7fd47935c38e4: Status 404 returned error can't find the container with id 2b3ba29741069feb6b1a3e48a66747c2c16381f67c3c96c7b5c7fd47935c38e4
Apr 19 12:33:36.987570 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:36.987535 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vlsjm" event={"ID":"e9a634b4-9352-4e62-926c-390e9b19d228","Type":"ContainerStarted","Data":"2b3ba29741069feb6b1a3e48a66747c2c16381f67c3c96c7b5c7fd47935c38e4"}
Apr 19 12:33:38.497400 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.497365 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78fff7c594-9qqpj"]
Apr 19 12:33:38.500338 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.500321 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78fff7c594-9qqpj"
Apr 19 12:33:38.502223 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.502204 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 19 12:33:38.502667 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.502652 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8nsl6\""
Apr 19 12:33:38.502807 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.502780 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 19 12:33:38.502807 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.502800 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 19 12:33:38.502988 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.502925 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 19 12:33:38.502988 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.502949 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 19 12:33:38.503272 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.503246 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 19 12:33:38.503339 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.503296 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 19 12:33:38.507001
ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.506980 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 19 12:33:38.510468 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.510447 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78fff7c594-9qqpj"] Apr 19 12:33:38.560938 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.560913 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-service-ca\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.561039 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.560945 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-serving-cert\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.561039 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.560967 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-oauth-serving-cert\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.561039 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.560983 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-trusted-ca-bundle\") pod \"console-78fff7c594-9qqpj\" 
(UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.561039 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.560999 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-config\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.561190 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.561061 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-oauth-config\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.561190 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.561103 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66hc2\" (UniqueName: \"kubernetes.io/projected/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-kube-api-access-66hc2\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.662083 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.662055 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-oauth-config\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.662205 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.662103 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-66hc2\" (UniqueName: \"kubernetes.io/projected/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-kube-api-access-66hc2\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.662205 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.662128 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-service-ca\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.662205 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.662147 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-serving-cert\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.662205 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.662190 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-oauth-serving-cert\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.662205 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.662207 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-trusted-ca-bundle\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.662433 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.662228 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-config\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.662900 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.662842 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-oauth-serving-cert\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.662992 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.662905 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-service-ca\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.663263 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.663232 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-trusted-ca-bundle\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.663434 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.663414 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-config\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.664717 ip-10-0-140-194 kubenswrapper[2583]: I0419 
12:33:38.664697 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-serving-cert\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.664790 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.664725 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-oauth-config\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.672770 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.672747 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66hc2\" (UniqueName: \"kubernetes.io/projected/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-kube-api-access-66hc2\") pod \"console-78fff7c594-9qqpj\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.810150 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.810068 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:38.930159 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.930129 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78fff7c594-9qqpj"] Apr 19 12:33:38.933309 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:33:38.933268 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aa1a9ed_69a7_4120_8c3a_ffc824fdd4fe.slice/crio-29f5bf10c7a1a4c4d7d959f194f846b519852e00d06bb9d4bd59050aed78bbb0 WatchSource:0}: Error finding container 29f5bf10c7a1a4c4d7d959f194f846b519852e00d06bb9d4bd59050aed78bbb0: Status 404 returned error can't find the container with id 29f5bf10c7a1a4c4d7d959f194f846b519852e00d06bb9d4bd59050aed78bbb0 Apr 19 12:33:38.969073 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.969052 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z7glk" Apr 19 12:33:38.994771 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.994741 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vlsjm" event={"ID":"e9a634b4-9352-4e62-926c-390e9b19d228","Type":"ContainerStarted","Data":"4c646346ea1c273a53bb37dce01998a106b88806e7f252c0ffcaa1ddb2ee786c"} Apr 19 12:33:38.995966 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:38.995940 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78fff7c594-9qqpj" event={"ID":"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe","Type":"ContainerStarted","Data":"29f5bf10c7a1a4c4d7d959f194f846b519852e00d06bb9d4bd59050aed78bbb0"} Apr 19 12:33:39.011206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:39.011163 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vlsjm" podStartSLOduration=139.339947684 podStartE2EDuration="2m21.011149816s" 
podCreationTimestamp="2026-04-19 12:31:18 +0000 UTC" firstStartedPulling="2026-04-19 12:33:36.452183798 +0000 UTC m=+171.689834038" lastFinishedPulling="2026-04-19 12:33:38.123385931 +0000 UTC m=+173.361036170" observedRunningTime="2026-04-19 12:33:39.010739972 +0000 UTC m=+174.248390232" watchObservedRunningTime="2026-04-19 12:33:39.011149816 +0000 UTC m=+174.248800077" Apr 19 12:33:41.624151 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.624126 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8hxmd"] Apr 19 12:33:41.628213 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.628195 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.632531 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.632505 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 19 12:33:41.632617 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.632504 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 19 12:33:41.632617 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.632543 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 19 12:33:41.632994 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.632975 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 19 12:33:41.632994 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.632988 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rhpt4\"" Apr 19 12:33:41.633126 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.632995 2583 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 19 12:33:41.633126 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.633048 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 19 12:33:41.682698 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.682677 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-textfile\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.682803 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.682717 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-accelerators-collector-config\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.682803 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.682739 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/588e865e-56c0-4ff3-aebf-a1f18329ce04-metrics-client-ca\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.682803 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.682764 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.682929 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.682811 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/588e865e-56c0-4ff3-aebf-a1f18329ce04-root\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.682929 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.682837 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmmmb\" (UniqueName: \"kubernetes.io/projected/588e865e-56c0-4ff3-aebf-a1f18329ce04-kube-api-access-nmmmb\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.682929 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.682878 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-wtmp\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.682929 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.682896 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/588e865e-56c0-4ff3-aebf-a1f18329ce04-sys\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.682929 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.682911 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-tls\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784229 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784191 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/588e865e-56c0-4ff3-aebf-a1f18329ce04-root\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784385 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784238 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmmmb\" (UniqueName: \"kubernetes.io/projected/588e865e-56c0-4ff3-aebf-a1f18329ce04-kube-api-access-nmmmb\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784385 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784260 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-wtmp\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784385 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784276 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/588e865e-56c0-4ff3-aebf-a1f18329ce04-sys\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784385 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784296 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-tls\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784385 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784307 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/588e865e-56c0-4ff3-aebf-a1f18329ce04-root\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784385 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784355 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/588e865e-56c0-4ff3-aebf-a1f18329ce04-sys\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784681 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:41.784401 2583 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 19 12:33:41.784681 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:41.784467 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-tls podName:588e865e-56c0-4ff3-aebf-a1f18329ce04 nodeName:}" failed. No retries permitted until 2026-04-19 12:33:42.284447403 +0000 UTC m=+177.522097642 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-tls") pod "node-exporter-8hxmd" (UID: "588e865e-56c0-4ff3-aebf-a1f18329ce04") : secret "node-exporter-tls" not found Apr 19 12:33:41.784681 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784400 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-wtmp\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784681 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784400 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-textfile\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784681 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784532 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-accelerators-collector-config\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784681 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784555 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/588e865e-56c0-4ff3-aebf-a1f18329ce04-metrics-client-ca\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784681 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:33:41.784571 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.784681 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.784678 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-textfile\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.785041 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.785014 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-accelerators-collector-config\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.785145 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.785044 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/588e865e-56c0-4ff3-aebf-a1f18329ce04-metrics-client-ca\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.786960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.786937 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8hxmd\" 
(UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:41.793867 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:41.793824 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmmmb\" (UniqueName: \"kubernetes.io/projected/588e865e-56c0-4ff3-aebf-a1f18329ce04-kube-api-access-nmmmb\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:42.006056 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:42.006024 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78fff7c594-9qqpj" event={"ID":"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe","Type":"ContainerStarted","Data":"d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351"} Apr 19 12:33:42.022418 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:42.022371 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78fff7c594-9qqpj" podStartSLOduration=1.352475383 podStartE2EDuration="4.022359236s" podCreationTimestamp="2026-04-19 12:33:38 +0000 UTC" firstStartedPulling="2026-04-19 12:33:38.935158524 +0000 UTC m=+174.172808763" lastFinishedPulling="2026-04-19 12:33:41.605042374 +0000 UTC m=+176.842692616" observedRunningTime="2026-04-19 12:33:42.021777643 +0000 UTC m=+177.259427904" watchObservedRunningTime="2026-04-19 12:33:42.022359236 +0000 UTC m=+177.260009497" Apr 19 12:33:42.289008 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:42.288924 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-tls\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:42.291324 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:42.291305 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/588e865e-56c0-4ff3-aebf-a1f18329ce04-node-exporter-tls\") pod \"node-exporter-8hxmd\" (UID: \"588e865e-56c0-4ff3-aebf-a1f18329ce04\") " pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:42.549101 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:42.549023 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8hxmd" Apr 19 12:33:42.558102 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:33:42.558072 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod588e865e_56c0_4ff3_aebf_a1f18329ce04.slice/crio-f57b273a19dfc6174c8e281a5000533f19eb23b126826d3fbf9b88a6547033a1 WatchSource:0}: Error finding container f57b273a19dfc6174c8e281a5000533f19eb23b126826d3fbf9b88a6547033a1: Status 404 returned error can't find the container with id f57b273a19dfc6174c8e281a5000533f19eb23b126826d3fbf9b88a6547033a1 Apr 19 12:33:43.009270 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.009232 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8hxmd" event={"ID":"588e865e-56c0-4ff3-aebf-a1f18329ce04","Type":"ContainerStarted","Data":"f57b273a19dfc6174c8e281a5000533f19eb23b126826d3fbf9b88a6547033a1"} Apr 19 12:33:43.605415 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.605384 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-658448b984-6hs44"] Apr 19 12:33:43.608952 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.608931 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.611177 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.611153 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 19 12:33:43.611347 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.611327 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 19 12:33:43.611470 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.611451 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 19 12:33:43.611547 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.611462 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-1k7gg91nvkhcg\"" Apr 19 12:33:43.611608 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.611554 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 19 12:33:43.611766 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.611744 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-pqztj\"" Apr 19 12:33:43.612019 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.611883 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 19 12:33:43.618226 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.618205 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-658448b984-6hs44"] Apr 19 12:33:43.700127 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.700094 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.700248 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.700130 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45f0f7dc-3b40-4d18-843e-456c4cad83c2-metrics-client-ca\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.700248 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.700228 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-grpc-tls\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.700366 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.700283 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.700366 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.700342 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.700468 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.700378 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-tls\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.700468 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.700420 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.700551 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.700470 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5lbt\" (UniqueName: \"kubernetes.io/projected/45f0f7dc-3b40-4d18-843e-456c4cad83c2-kube-api-access-r5lbt\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.801205 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.801144 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5lbt\" (UniqueName: \"kubernetes.io/projected/45f0f7dc-3b40-4d18-843e-456c4cad83c2-kube-api-access-r5lbt\") pod 
\"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.801205 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.801184 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.801340 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.801213 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45f0f7dc-3b40-4d18-843e-456c4cad83c2-metrics-client-ca\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.801340 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.801265 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-grpc-tls\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.801340 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.801293 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.801340 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:33:43.801326 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.801542 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.801356 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-tls\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.801542 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.801386 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.802282 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.802258 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45f0f7dc-3b40-4d18-843e-456c4cad83c2-metrics-client-ca\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.803879 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.803827 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" 
(UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.804059 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.804036 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.804339 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.804320 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.804518 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.804495 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.804617 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.804602 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-grpc-tls\") pod \"thanos-querier-658448b984-6hs44\" (UID: 
\"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.804832 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.804814 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/45f0f7dc-3b40-4d18-843e-456c4cad83c2-secret-thanos-querier-tls\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.808188 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.808167 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5lbt\" (UniqueName: \"kubernetes.io/projected/45f0f7dc-3b40-4d18-843e-456c4cad83c2-kube-api-access-r5lbt\") pod \"thanos-querier-658448b984-6hs44\" (UID: \"45f0f7dc-3b40-4d18-843e-456c4cad83c2\") " pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:43.920367 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:43.920339 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:44.014244 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:44.014205 2583 generic.go:358] "Generic (PLEG): container finished" podID="588e865e-56c0-4ff3-aebf-a1f18329ce04" containerID="2509023316dca9e03d308d9ca18aa683f93170204c1bd16d126a90c4ee66b484" exitCode=0 Apr 19 12:33:44.014656 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:44.014308 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8hxmd" event={"ID":"588e865e-56c0-4ff3-aebf-a1f18329ce04","Type":"ContainerDied","Data":"2509023316dca9e03d308d9ca18aa683f93170204c1bd16d126a90c4ee66b484"} Apr 19 12:33:44.050324 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:44.050292 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-658448b984-6hs44"] Apr 19 12:33:44.054918 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:33:44.054888 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45f0f7dc_3b40_4d18_843e_456c4cad83c2.slice/crio-081eb931943cb3fd5af0f9abdc71229e8bdefd2e170473bb91f73dcf82178d3c WatchSource:0}: Error finding container 081eb931943cb3fd5af0f9abdc71229e8bdefd2e170473bb91f73dcf82178d3c: Status 404 returned error can't find the container with id 081eb931943cb3fd5af0f9abdc71229e8bdefd2e170473bb91f73dcf82178d3c Apr 19 12:33:45.018235 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:45.018190 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" event={"ID":"45f0f7dc-3b40-4d18-843e-456c4cad83c2","Type":"ContainerStarted","Data":"081eb931943cb3fd5af0f9abdc71229e8bdefd2e170473bb91f73dcf82178d3c"} Apr 19 12:33:45.020283 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:45.020256 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8hxmd" 
event={"ID":"588e865e-56c0-4ff3-aebf-a1f18329ce04","Type":"ContainerStarted","Data":"a3216519e11b2b9ea632be13dcc0e117687da2f45b03a03bccae880b0896a22f"} Apr 19 12:33:45.020396 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:45.020288 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8hxmd" event={"ID":"588e865e-56c0-4ff3-aebf-a1f18329ce04","Type":"ContainerStarted","Data":"e456dd3e9cf006aa21f7b7292e8c124d964c897d525813db3297359669d21935"} Apr 19 12:33:45.040275 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:45.040220 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8hxmd" podStartSLOduration=3.113191548 podStartE2EDuration="4.040204265s" podCreationTimestamp="2026-04-19 12:33:41 +0000 UTC" firstStartedPulling="2026-04-19 12:33:42.560427547 +0000 UTC m=+177.798077786" lastFinishedPulling="2026-04-19 12:33:43.487440248 +0000 UTC m=+178.725090503" observedRunningTime="2026-04-19 12:33:45.038658577 +0000 UTC m=+180.276308840" watchObservedRunningTime="2026-04-19 12:33:45.040204265 +0000 UTC m=+180.277854528" Apr 19 12:33:46.413033 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.412957 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d5ff5c7dc-n9zns"] Apr 19 12:33:46.416025 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.416010 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.426225 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.426204 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d5ff5c7dc-n9zns"] Apr 19 12:33:46.522096 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.522071 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-oauth-serving-cert\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.522224 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.522109 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdgt7\" (UniqueName: \"kubernetes.io/projected/6b93b671-d9eb-461a-accb-0c88ae9d22e4-kube-api-access-qdgt7\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.522224 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.522130 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-config\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.522224 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.522194 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-oauth-config\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" 
Apr 19 12:33:46.522360 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.522254 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-trusted-ca-bundle\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.522360 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.522287 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-serving-cert\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.522360 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.522309 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-service-ca\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.532898 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.532817 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" podUID="b81e1107-0f07-4af8-8e14-097b73f57b5f" containerName="registry" containerID="cri-o://caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0" gracePeriod=30 Apr 19 12:33:46.622635 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.622604 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-oauth-config\") pod \"console-6d5ff5c7dc-n9zns\" 
(UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.622761 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.622662 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-trusted-ca-bundle\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.622761 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.622694 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-serving-cert\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.622761 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.622717 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-service-ca\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.622932 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.622877 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-oauth-serving-cert\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.622983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.622948 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgt7\" (UniqueName: 
\"kubernetes.io/projected/6b93b671-d9eb-461a-accb-0c88ae9d22e4-kube-api-access-qdgt7\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.623029 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.622987 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-config\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.623652 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.623624 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-service-ca\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.623757 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.623667 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-config\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.623757 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.623671 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-oauth-serving-cert\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.623757 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.623703 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-trusted-ca-bundle\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.625785 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.625764 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-oauth-config\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.625945 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.625928 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-serving-cert\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.630186 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.630167 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdgt7\" (UniqueName: \"kubernetes.io/projected/6b93b671-d9eb-461a-accb-0c88ae9d22e4-kube-api-access-qdgt7\") pod \"console-6d5ff5c7dc-n9zns\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.724166 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.724132 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:46.754126 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.754103 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" Apr 19 12:33:46.824467 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.824436 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-certificates\") pod \"b81e1107-0f07-4af8-8e14-097b73f57b5f\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " Apr 19 12:33:46.824602 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.824523 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b81e1107-0f07-4af8-8e14-097b73f57b5f-ca-trust-extracted\") pod \"b81e1107-0f07-4af8-8e14-097b73f57b5f\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " Apr 19 12:33:46.824602 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.824557 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-bound-sa-token\") pod \"b81e1107-0f07-4af8-8e14-097b73f57b5f\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " Apr 19 12:33:46.824602 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.824587 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-installation-pull-secrets\") pod \"b81e1107-0f07-4af8-8e14-097b73f57b5f\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " Apr 19 12:33:46.824761 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.824733 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") pod \"b81e1107-0f07-4af8-8e14-097b73f57b5f\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " Apr 19 
12:33:46.824813 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.824798 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-trusted-ca\") pod \"b81e1107-0f07-4af8-8e14-097b73f57b5f\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " Apr 19 12:33:46.824902 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.824887 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dh42\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-kube-api-access-6dh42\") pod \"b81e1107-0f07-4af8-8e14-097b73f57b5f\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") " Apr 19 12:33:46.824981 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.824900 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b81e1107-0f07-4af8-8e14-097b73f57b5f" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:33:46.824981 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.824935 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-image-registry-private-configuration\") pod \"b81e1107-0f07-4af8-8e14-097b73f57b5f\" (UID: \"b81e1107-0f07-4af8-8e14-097b73f57b5f\") "
Apr 19 12:33:46.825336 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.825193 2583 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-certificates\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:33:46.825947 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.825644 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b81e1107-0f07-4af8-8e14-097b73f57b5f" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:33:46.827998 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.827971 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b81e1107-0f07-4af8-8e14-097b73f57b5f" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:33:46.828090 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.828043 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b81e1107-0f07-4af8-8e14-097b73f57b5f" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:33:46.828156 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.828131 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b81e1107-0f07-4af8-8e14-097b73f57b5f" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:33:46.828206 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.828182 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b81e1107-0f07-4af8-8e14-097b73f57b5f" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:33:46.828381 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.828357 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-kube-api-access-6dh42" (OuterVolumeSpecName: "kube-api-access-6dh42") pod "b81e1107-0f07-4af8-8e14-097b73f57b5f" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f"). InnerVolumeSpecName "kube-api-access-6dh42". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:33:46.834282 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.834259 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81e1107-0f07-4af8-8e14-097b73f57b5f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b81e1107-0f07-4af8-8e14-097b73f57b5f" (UID: "b81e1107-0f07-4af8-8e14-097b73f57b5f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:33:46.854823 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.850697 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d5ff5c7dc-n9zns"]
Apr 19 12:33:46.926707 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.926673 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dh42\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-kube-api-access-6dh42\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:33:46.926707 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.926707 2583 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-image-registry-private-configuration\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:33:46.926818 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.926724 2583 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b81e1107-0f07-4af8-8e14-097b73f57b5f-ca-trust-extracted\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:33:46.926818 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.926738 2583 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-bound-sa-token\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:33:46.926818 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.926753 2583 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b81e1107-0f07-4af8-8e14-097b73f57b5f-installation-pull-secrets\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:33:46.926818 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.926767 2583 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b81e1107-0f07-4af8-8e14-097b73f57b5f-registry-tls\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:33:46.926818 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:46.926781 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b81e1107-0f07-4af8-8e14-097b73f57b5f-trusted-ca\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:33:47.026973 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.026887 2583 generic.go:358] "Generic (PLEG): container finished" podID="b81e1107-0f07-4af8-8e14-097b73f57b5f" containerID="caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0" exitCode=0
Apr 19 12:33:47.027114 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.026988 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" event={"ID":"b81e1107-0f07-4af8-8e14-097b73f57b5f","Type":"ContainerDied","Data":"caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0"}
Apr 19 12:33:47.027114 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.027036 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc" event={"ID":"b81e1107-0f07-4af8-8e14-097b73f57b5f","Type":"ContainerDied","Data":"9ac6f665a9c81a76a41c6e7bad90d617c40531774be2e373dcea72c85b4143af"}
Apr 19 12:33:47.027114 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.027057 2583 scope.go:117] "RemoveContainer" containerID="caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0"
Apr 19 12:33:47.027114 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.026994 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b9cb554c4-2cbcc"
Apr 19 12:33:47.028488 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.028466 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5ff5c7dc-n9zns" event={"ID":"6b93b671-d9eb-461a-accb-0c88ae9d22e4","Type":"ContainerStarted","Data":"749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de"}
Apr 19 12:33:47.028591 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.028498 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5ff5c7dc-n9zns" event={"ID":"6b93b671-d9eb-461a-accb-0c88ae9d22e4","Type":"ContainerStarted","Data":"9f28347f5978e0c0803210487b82288f3b4307a8b92d7d0100134d3b1df93dee"}
Apr 19 12:33:47.030705 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.030686 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" event={"ID":"45f0f7dc-3b40-4d18-843e-456c4cad83c2","Type":"ContainerStarted","Data":"e3ab1a515335bf4ff6e05fc84b90784f52f307a39bf4e61b0b7704e5f6998b29"}
Apr 19 12:33:47.030795 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.030708 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" event={"ID":"45f0f7dc-3b40-4d18-843e-456c4cad83c2","Type":"ContainerStarted","Data":"13f31263015756587f8bfdcd7ecba0fbed0fb0ab1502f1bee81bf088d24e0d88"}
Apr 19 12:33:47.030795 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.030717 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" event={"ID":"45f0f7dc-3b40-4d18-843e-456c4cad83c2","Type":"ContainerStarted","Data":"dbaaaecc76816f92cd10bd194c31bb7196720814bad65f40e1a5af80902690c6"}
Apr 19 12:33:47.035886 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.035868 2583 scope.go:117] "RemoveContainer" containerID="caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0"
Apr 19 12:33:47.036108 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:47.036088 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0\": container with ID starting with caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0 not found: ID does not exist" containerID="caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0"
Apr 19 12:33:47.036177 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.036120 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0"} err="failed to get container status \"caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0\": rpc error: code = NotFound desc = could not find container \"caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0\": container with ID starting with caa5d6f9cd6a65790b2b2bd9c70bfaf237ac30dc2520729ef41a5219d1bca8c0 not found: ID does not exist"
Apr 19 12:33:47.045838 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.045801 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d5ff5c7dc-n9zns" podStartSLOduration=1.045789998 podStartE2EDuration="1.045789998s" podCreationTimestamp="2026-04-19 12:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:33:47.045663967 +0000 UTC m=+182.283314229" watchObservedRunningTime="2026-04-19 12:33:47.045789998 +0000 UTC m=+182.283440250"
Apr 19 12:33:47.059218 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.059199 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b9cb554c4-2cbcc"]
Apr 19 12:33:47.061362 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.061343 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-b9cb554c4-2cbcc"]
Apr 19 12:33:47.329288 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.329202 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b81e1107-0f07-4af8-8e14-097b73f57b5f" path="/var/lib/kubelet/pods/b81e1107-0f07-4af8-8e14-097b73f57b5f/volumes"
Apr 19 12:33:47.875160 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.875123 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:33:47.875530 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.875388 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b81e1107-0f07-4af8-8e14-097b73f57b5f" containerName="registry"
Apr 19 12:33:47.875530 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.875400 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81e1107-0f07-4af8-8e14-097b73f57b5f" containerName="registry"
Apr 19 12:33:47.875530 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.875459 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b81e1107-0f07-4af8-8e14-097b73f57b5f" containerName="registry"
Apr 19 12:33:47.878675 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.878660 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.880837 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.880737 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 19 12:33:47.880837 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.880777 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 19 12:33:47.880837 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.880828 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 19 12:33:47.881164 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.880785 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-dbtjou4v04k56\""
Apr 19 12:33:47.881164 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.881081 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 19 12:33:47.881380 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.881288 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 19 12:33:47.881380 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.881323 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 19 12:33:47.881380 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.881360 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 19 12:33:47.881569 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.881381 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 19 12:33:47.881569 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.881366 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 19 12:33:47.881569 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.881558 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 19 12:33:47.881724 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.881584 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 19 12:33:47.881724 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.881620 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hk7cx\""
Apr 19 12:33:47.884964 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.884942 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 19 12:33:47.887615 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.887596 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 19 12:33:47.893492 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.893470 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:33:47.935591 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935564 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.935686 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935599 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.935686 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935623 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.935799 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935702 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.935799 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935743 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.935934 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935804 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-web-config\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.935934 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935828 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config-out\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.935934 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935913 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936089 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935942 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936089 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.935968 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936089 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.936058 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936089 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.936084 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936254 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.936135 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936254 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.936161 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936254 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.936196 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936254 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.936213 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snzw7\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-kube-api-access-snzw7\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936254 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.936229 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:47.936254 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:47.936243 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.037953 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.037925 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038088 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.037968 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038088 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038027 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038088 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038054 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038243 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038087 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038243 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038111 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snzw7\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-kube-api-access-snzw7\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038243 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038133 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038243 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038153 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038243 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038180 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038243 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038204 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038243 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038238 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038271 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038300 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038342 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-web-config\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038369 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config-out\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038400 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038423 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.038565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038447 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.039094 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.038675 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.039149 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.039117 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.040147 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.040119 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.041033 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.041007 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.041517 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.041488 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.042457 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.041940 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-web-config\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.042457 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.042161 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.042457 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.042382 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" event={"ID":"45f0f7dc-3b40-4d18-843e-456c4cad83c2","Type":"ContainerStarted","Data":"4b86df368e788dd3f4f2443d09516542282c073ca77f0fce1ae7d5a37082b526"}
Apr 19 12:33:48.042457 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.042415 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" event={"ID":"45f0f7dc-3b40-4d18-843e-456c4cad83c2","Type":"ContainerStarted","Data":"1af207cf5661ab8d2ca4409d80cf8077b953ca6975f58388224fd8591310b74f"}
Apr 19 12:33:48.042457 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.042431 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" event={"ID":"45f0f7dc-3b40-4d18-843e-456c4cad83c2","Type":"ContainerStarted","Data":"8a60dc285f644d57bbd384d3823445e1986346bdba9264c331095a55fc70a45a"}
Apr 19 12:33:48.042751 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.042612 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.042891 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.042867 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.043182 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.043040 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-658448b984-6hs44"
Apr 19 12:33:48.043522 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.043467 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.043801 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.043759 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.044010 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.043978 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config-out\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.044718 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.044680 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.044931 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.044900 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.045244 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.045220 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.045643 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.045623 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.045956 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.045940 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.052462 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.052438 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snzw7\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-kube-api-access-snzw7\") pod \"prometheus-k8s-0\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:33:48.064831 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.064795 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" podStartSLOduration=1.742130889 podStartE2EDuration="5.064784752s" podCreationTimestamp="2026-04-19 12:33:43 +0000 UTC" firstStartedPulling="2026-04-19 12:33:44.056887471 +0000 UTC m=+179.294537718"
lastFinishedPulling="2026-04-19 12:33:47.379541341 +0000 UTC m=+182.617191581" observedRunningTime="2026-04-19 12:33:48.063471405 +0000 UTC m=+183.301121666" watchObservedRunningTime="2026-04-19 12:33:48.064784752 +0000 UTC m=+183.302435013" Apr 19 12:33:48.189649 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.189624 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:33:48.318283 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.318253 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 19 12:33:48.321499 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:33:48.321472 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod665d07c9_ff7e_4e0f_9bd5_a6d3cd22433f.slice/crio-914203e1f17b3990309a2af8fe23848fd2ef1d83e23510077b4fa9f8a73a0a89 WatchSource:0}: Error finding container 914203e1f17b3990309a2af8fe23848fd2ef1d83e23510077b4fa9f8a73a0a89: Status 404 returned error can't find the container with id 914203e1f17b3990309a2af8fe23848fd2ef1d83e23510077b4fa9f8a73a0a89 Apr 19 12:33:48.810693 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.810659 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:48.810874 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.810705 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:48.816427 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:48.816405 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:49.048466 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:49.048429 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerStarted","Data":"914203e1f17b3990309a2af8fe23848fd2ef1d83e23510077b4fa9f8a73a0a89"} Apr 19 12:33:49.053941 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:49.053915 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:33:50.052486 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:50.052453 2583 generic.go:358] "Generic (PLEG): container finished" podID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerID="1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc" exitCode=0 Apr 19 12:33:50.052929 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:50.052550 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerDied","Data":"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"} Apr 19 12:33:54.054305 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:54.054279 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-658448b984-6hs44" Apr 19 12:33:54.068115 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:54.068085 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerStarted","Data":"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"} Apr 19 12:33:54.068115 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:54.068116 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerStarted","Data":"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"} Apr 19 12:33:54.068310 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:54.068126 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerStarted","Data":"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"} Apr 19 12:33:54.068310 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:54.068135 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerStarted","Data":"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"} Apr 19 12:33:54.068310 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:54.068142 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerStarted","Data":"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"} Apr 19 12:33:54.068310 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:54.068151 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerStarted","Data":"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"} Apr 19 12:33:54.104231 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:54.104170 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.402434762 podStartE2EDuration="7.104152574s" podCreationTimestamp="2026-04-19 12:33:47 +0000 UTC" firstStartedPulling="2026-04-19 12:33:48.3234459 +0000 UTC m=+183.561096141" lastFinishedPulling="2026-04-19 12:33:53.025163713 +0000 UTC m=+188.262813953" observedRunningTime="2026-04-19 12:33:54.102214037 +0000 UTC m=+189.339864310" watchObservedRunningTime="2026-04-19 12:33:54.104152574 +0000 UTC m=+189.341802836" Apr 19 12:33:56.725251 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:56.725218 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:56.725618 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:56.725293 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:56.729770 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:56.729750 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:57.082689 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:57.082607 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:33:57.124091 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:57.124059 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78fff7c594-9qqpj"] Apr 19 12:33:58.190286 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:33:58.190259 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:33:58.329602 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:58.329565 2583 configmap.go:193] Couldn't get configMap openshift-monitoring/prometheus-k8s-rulefiles-0: configmap "prometheus-k8s-rulefiles-0" not found Apr 19 12:33:58.329759 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:33:58.329646 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-rulefiles-0 podName:665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f nodeName:}" failed. No retries permitted until 2026-04-19 12:33:58.829623346 +0000 UTC m=+194.067273586 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-k8s-rulefiles-0" (UniqueName: "kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-rulefiles-0") pod "prometheus-k8s-0" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f") : configmap "prometheus-k8s-rulefiles-0" not found Apr 19 12:34:22.143868 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.143768 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78fff7c594-9qqpj" podUID="9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" containerName="console" containerID="cri-o://d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351" gracePeriod=15 Apr 19 12:34:22.147280 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.147252 2583 generic.go:358] "Generic (PLEG): container finished" podID="6788f440-7533-43d4-acaf-4fac75b17707" containerID="c78d0f6e69d3d5c804ba01f2633b8eccf2714fc273e9d8eb20eefbcbe576267d" exitCode=0 Apr 19 12:34:22.147391 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.147312 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" event={"ID":"6788f440-7533-43d4-acaf-4fac75b17707","Type":"ContainerDied","Data":"c78d0f6e69d3d5c804ba01f2633b8eccf2714fc273e9d8eb20eefbcbe576267d"} Apr 19 12:34:22.147718 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.147700 2583 scope.go:117] "RemoveContainer" containerID="c78d0f6e69d3d5c804ba01f2633b8eccf2714fc273e9d8eb20eefbcbe576267d" Apr 19 12:34:22.389807 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.389785 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78fff7c594-9qqpj_9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe/console/0.log" Apr 19 12:34:22.389965 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.389878 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:34:22.524704 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.524675 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66hc2\" (UniqueName: \"kubernetes.io/projected/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-kube-api-access-66hc2\") pod \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " Apr 19 12:34:22.524917 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.524725 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-oauth-config\") pod \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " Apr 19 12:34:22.524917 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.524765 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-oauth-serving-cert\") pod \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " Apr 19 12:34:22.524917 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.524786 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-service-ca\") pod \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " Apr 19 12:34:22.524917 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.524814 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-trusted-ca-bundle\") pod \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " Apr 19 12:34:22.524917 
ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.524832 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-serving-cert\") pod \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " Apr 19 12:34:22.524917 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.524890 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-config\") pod \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\" (UID: \"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe\") " Apr 19 12:34:22.525233 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.525204 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-service-ca" (OuterVolumeSpecName: "service-ca") pod "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" (UID: "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:34:22.525445 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.525414 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-config" (OuterVolumeSpecName: "console-config") pod "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" (UID: "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:34:22.525445 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.525433 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" (UID: "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:34:22.525590 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.525455 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" (UID: "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:34:22.527287 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.527255 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" (UID: "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:34:22.527287 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.527262 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-kube-api-access-66hc2" (OuterVolumeSpecName: "kube-api-access-66hc2") pod "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" (UID: "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe"). InnerVolumeSpecName "kube-api-access-66hc2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:34:22.527450 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.527353 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" (UID: "9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:34:22.625469 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.625439 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-66hc2\" (UniqueName: \"kubernetes.io/projected/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-kube-api-access-66hc2\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:34:22.625574 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.625473 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-oauth-config\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:34:22.625574 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.625488 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-oauth-serving-cert\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:34:22.625574 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.625502 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-service-ca\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:34:22.625574 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.625516 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-trusted-ca-bundle\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:34:22.625574 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.625530 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-serving-cert\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:34:22.625574 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:22.625543 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe-console-config\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:34:23.152403 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.152373 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78fff7c594-9qqpj_9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe/console/0.log" Apr 19 12:34:23.152873 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.152411 2583 generic.go:358] "Generic (PLEG): container finished" podID="9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" containerID="d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351" exitCode=2 Apr 19 12:34:23.152873 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.152477 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78fff7c594-9qqpj" event={"ID":"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe","Type":"ContainerDied","Data":"d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351"} Apr 19 12:34:23.152873 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.152486 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78fff7c594-9qqpj" Apr 19 12:34:23.152873 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.152499 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78fff7c594-9qqpj" event={"ID":"9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe","Type":"ContainerDied","Data":"29f5bf10c7a1a4c4d7d959f194f846b519852e00d06bb9d4bd59050aed78bbb0"} Apr 19 12:34:23.152873 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.152514 2583 scope.go:117] "RemoveContainer" containerID="d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351" Apr 19 12:34:23.154745 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.154721 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lkhb4" event={"ID":"6788f440-7533-43d4-acaf-4fac75b17707","Type":"ContainerStarted","Data":"cd7010ed8dbe3b5c3e9c0e01cd58ec454c737fd48be4a5d08a62d6fc1b072389"} Apr 19 12:34:23.161831 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.161803 2583 scope.go:117] "RemoveContainer" containerID="d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351" Apr 19 12:34:23.162157 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:34:23.162119 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351\": container with ID starting with d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351 not found: ID does not exist" containerID="d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351" Apr 19 12:34:23.162218 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.162169 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351"} err="failed to get container status 
\"d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351\": rpc error: code = NotFound desc = could not find container \"d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351\": container with ID starting with d3cc3855a4b13bc3084d4f0b2c57d87e521ac283842346e585521999d19b4351 not found: ID does not exist" Apr 19 12:34:23.189662 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.189636 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78fff7c594-9qqpj"] Apr 19 12:34:23.199605 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.199554 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78fff7c594-9qqpj"] Apr 19 12:34:23.329070 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:23.329027 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" path="/var/lib/kubelet/pods/9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe/volumes" Apr 19 12:34:27.168009 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:27.167974 2583 generic.go:358] "Generic (PLEG): container finished" podID="a7f8c09c-afa9-4393-b47b-0e8efface148" containerID="9341f00c74f958dea3c4f8b05162674894767c379962613c74b1a86d7d5e2085" exitCode=0 Apr 19 12:34:27.168398 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:27.168038 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kxff4" event={"ID":"a7f8c09c-afa9-4393-b47b-0e8efface148","Type":"ContainerDied","Data":"9341f00c74f958dea3c4f8b05162674894767c379962613c74b1a86d7d5e2085"} Apr 19 12:34:27.168398 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:27.168349 2583 scope.go:117] "RemoveContainer" containerID="9341f00c74f958dea3c4f8b05162674894767c379962613c74b1a86d7d5e2085" Apr 19 12:34:28.172104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:28.172054 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-kxff4" 
event={"ID":"a7f8c09c-afa9-4393-b47b-0e8efface148","Type":"ContainerStarted","Data":"e3956950ba9764943b2d139d3aec970a6a10c14fcaf1b3a71b5e3e9fbb00e5bf"} Apr 19 12:34:48.190727 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:48.190685 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:34:48.207145 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:48.207123 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:34:48.253023 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:48.252998 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:34:56.191990 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:56.191950 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:34:56.194500 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:56.194473 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb73e6b2-9f0a-4bcf-9371-0d399622fe97-metrics-certs\") pod \"network-metrics-daemon-lmdfj\" (UID: \"fb73e6b2-9f0a-4bcf-9371-0d399622fe97\") " pod="openshift-multus/network-metrics-daemon-lmdfj" Apr 19 12:34:56.226465 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:56.226439 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nbhvx\"" Apr 19 12:34:56.234774 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:56.234756 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lmdfj"
Apr 19 12:34:56.355078 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:56.354945 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lmdfj"]
Apr 19 12:34:56.357689 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:34:56.357658 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb73e6b2_9f0a_4bcf_9371_0d399622fe97.slice/crio-01ebab6024abd0e3d160fb38cf996e29f6c65b5abbccabf94c70af4314525914 WatchSource:0}: Error finding container 01ebab6024abd0e3d160fb38cf996e29f6c65b5abbccabf94c70af4314525914: Status 404 returned error can't find the container with id 01ebab6024abd0e3d160fb38cf996e29f6c65b5abbccabf94c70af4314525914
Apr 19 12:34:57.263748 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:57.263711 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lmdfj" event={"ID":"fb73e6b2-9f0a-4bcf-9371-0d399622fe97","Type":"ContainerStarted","Data":"01ebab6024abd0e3d160fb38cf996e29f6c65b5abbccabf94c70af4314525914"}
Apr 19 12:34:58.267491 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:58.267448 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lmdfj" event={"ID":"fb73e6b2-9f0a-4bcf-9371-0d399622fe97","Type":"ContainerStarted","Data":"cf293441afc6ee9027fdc2e30ca3c410fc59861fff5656a543b2c3ed72d0ff26"}
Apr 19 12:34:58.267491 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:58.267498 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lmdfj" event={"ID":"fb73e6b2-9f0a-4bcf-9371-0d399622fe97","Type":"ContainerStarted","Data":"6ddf30b7fd464cb0492729a25d4c13b878c22e6b5f4d115930d89531cefd1918"}
Apr 19 12:34:58.282344 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:34:58.282289 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lmdfj" podStartSLOduration=252.373762106 podStartE2EDuration="4m13.282270345s" podCreationTimestamp="2026-04-19 12:30:45 +0000 UTC" firstStartedPulling="2026-04-19 12:34:56.359432945 +0000 UTC m=+251.597083188" lastFinishedPulling="2026-04-19 12:34:57.267941188 +0000 UTC m=+252.505591427" observedRunningTime="2026-04-19 12:34:58.280727038 +0000 UTC m=+253.518377300" watchObservedRunningTime="2026-04-19 12:34:58.282270345 +0000 UTC m=+253.519920607"
Apr 19 12:35:06.266013 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.265970 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:35:06.266595 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.266561 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="prometheus" containerID="cri-o://3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85" gracePeriod=600
Apr 19 12:35:06.266595 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.266574 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy" containerID="cri-o://013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468" gracePeriod=600
Apr 19 12:35:06.266749 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.266613 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy-web" containerID="cri-o://3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18" gracePeriod=600
Apr 19 12:35:06.266749 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.266642 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="config-reloader" containerID="cri-o://7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390" gracePeriod=600
Apr 19 12:35:06.266749 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.266713 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy-thanos" containerID="cri-o://43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91" gracePeriod=600
Apr 19 12:35:06.266929 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.266585 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="thanos-sidecar" containerID="cri-o://1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba" gracePeriod=600
Apr 19 12:35:06.502756 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.502732 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:35:06.577699 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577628 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-tls\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577699 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577667 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-thanos-prometheus-http-client-file\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577699 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577688 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-kube-rbac-proxy\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577705 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snzw7\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-kube-api-access-snzw7\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577727 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-grpc-tls\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577753 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-db\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577781 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-trusted-ca-bundle\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577817 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config-out\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577863 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-rulefiles-0\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577908 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-serving-certs-ca-bundle\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577932 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-web-config\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.577960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577959 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-metrics-client-ca\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.578370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.577984 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-metrics-client-certs\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.578370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.578012 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.578370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.578036 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-tls-assets\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.578370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.578101 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-kubelet-serving-ca-bundle\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.578370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.578128 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.578370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.578159 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config\") pod \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\" (UID: \"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f\") "
Apr 19 12:35:06.579365 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.579324 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:35:06.580797 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.579672 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:35:06.580797 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.579774 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:35:06.581011 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.580982 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:35:06.581102 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.581048 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:35:06.581400 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.581148 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:35:06.581400 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.581193 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config" (OuterVolumeSpecName: "config") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:35:06.581400 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.581237 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-kube-api-access-snzw7" (OuterVolumeSpecName: "kube-api-access-snzw7") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "kube-api-access-snzw7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:35:06.581589 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.581566 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:35:06.581782 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.581747 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config-out" (OuterVolumeSpecName: "config-out") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:35:06.582179 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.581775 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:35:06.582268 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.582242 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:35:06.582384 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.582342 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:35:06.582384 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.582355 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:35:06.583260 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.583238 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:35:06.583745 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.583725 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:35:06.584024 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.584007 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:35:06.593514 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.593491 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-web-config" (OuterVolumeSpecName: "web-config") pod "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" (UID: "665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:35:06.679394 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679364 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679394 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679391 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679394 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679402 2583 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679411 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679421 2583 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679430 2583 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-kube-rbac-proxy\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679439 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snzw7\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-kube-api-access-snzw7\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679447 2583 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-grpc-tls\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679456 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-db\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679465 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679474 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-config-out\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679483 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679491 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679502 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-web-config\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679511 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-configmap-metrics-client-ca\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679520 2583 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-metrics-client-certs\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679529 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:06.679582 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:06.679540 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f-tls-assets\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:35:07.295201 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295166 2583 generic.go:358] "Generic (PLEG): container finished" podID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerID="43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91" exitCode=0
Apr 19 12:35:07.295201 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295195 2583 generic.go:358] "Generic (PLEG): container finished" podID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerID="013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468" exitCode=0
Apr 19 12:35:07.295201 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295201 2583 generic.go:358] "Generic (PLEG): container finished" podID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerID="3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18" exitCode=0
Apr 19 12:35:07.295201 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295206 2583 generic.go:358] "Generic (PLEG): container finished" podID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerID="1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba" exitCode=0
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295212 2583 generic.go:358] "Generic (PLEG): container finished" podID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerID="7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390" exitCode=0
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295217 2583 generic.go:358] "Generic (PLEG): container finished" podID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerID="3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85" exitCode=0
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295249 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerDied","Data":"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"}
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295283 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerDied","Data":"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"}
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295294 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerDied","Data":"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"}
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295294 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295305 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerDied","Data":"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"}
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295317 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerDied","Data":"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"}
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295326 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerDied","Data":"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"}
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295338 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f","Type":"ContainerDied","Data":"914203e1f17b3990309a2af8fe23848fd2ef1d83e23510077b4fa9f8a73a0a89"}
Apr 19 12:35:07.295728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.295353 2583 scope.go:117] "RemoveContainer" containerID="43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"
Apr 19 12:35:07.303807 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.303664 2583 scope.go:117] "RemoveContainer" containerID="013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"
Apr 19 12:35:07.310370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.310353 2583 scope.go:117] "RemoveContainer" containerID="3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"
Apr 19 12:35:07.316964 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.316945 2583 scope.go:117] "RemoveContainer" containerID="1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"
Apr 19 12:35:07.317535 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.317512 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:35:07.321656 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.321631 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:35:07.324760 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.324745 2583 scope.go:117] "RemoveContainer" containerID="7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"
Apr 19 12:35:07.327296 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.327272 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" path="/var/lib/kubelet/pods/665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f/volumes"
Apr 19 12:35:07.332924 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.332903 2583 scope.go:117] "RemoveContainer" containerID="3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"
Apr 19 12:35:07.340500 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.340474 2583 scope.go:117] "RemoveContainer" containerID="1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"
Apr 19 12:35:07.345391 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345366 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:35:07.345750 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345736 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="config-reloader"
Apr 19 12:35:07.345791 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345754 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="config-reloader"
Apr 19 12:35:07.345791 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345769 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="thanos-sidecar"
Apr 19 12:35:07.345791 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345777 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="thanos-sidecar"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345789 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy-thanos"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345798 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy-thanos"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345809 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy-web"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345817 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy-web"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345830 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="init-config-reloader"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345838 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="init-config-reloader"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345871 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" containerName="console"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345879 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" containerName="console"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345888 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="prometheus"
Apr 19 12:35:07.345906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345896 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="prometheus"
Apr 19 12:35:07.346211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345913 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy"
Apr 19 12:35:07.346211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345922 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy"
Apr 19 12:35:07.346211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.345989 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9aa1a9ed-69a7-4120-8c3a-ffc824fdd4fe" containerName="console"
Apr 19 12:35:07.346211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.346001 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="config-reloader"
Apr 19 12:35:07.346211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.346012 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="prometheus"
Apr 19 12:35:07.346211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.346022 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="thanos-sidecar"
Apr 19 12:35:07.346211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.346032 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy-web"
Apr 19 12:35:07.346211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.346043 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy"
Apr 19 12:35:07.346211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.346051 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="665d07c9-ff7e-4e0f-9bd5-a6d3cd22433f" containerName="kube-rbac-proxy-thanos"
Apr 19 12:35:07.347922 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.347831 2583 scope.go:117] "RemoveContainer" containerID="43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"
Apr 19 12:35:07.352134 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:35:07.352108 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": container with ID starting with 43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91 not found: ID does not exist" containerID="43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"
Apr 19 12:35:07.352233 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.352139 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"} err="failed to get container status \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": rpc error: code = NotFound desc = could not find container \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": container with ID starting with 43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91 not found: ID does not exist"
Apr 19 12:35:07.352233 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.352156 2583 scope.go:117] "RemoveContainer" containerID="013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"
Apr 19 12:35:07.352413 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:35:07.352396 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": container with ID starting with 013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468 not found: ID does not exist" containerID="013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"
Apr 19 12:35:07.352461 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.352416 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"} err="failed to get container status \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": rpc error: code = NotFound desc = could not find container \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": container with ID starting with 013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468 not found: ID does not exist"
Apr 19 12:35:07.352461 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.352428 2583 scope.go:117] "RemoveContainer" containerID="3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"
Apr 19 12:35:07.352679 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:35:07.352657 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": container with ID starting with 3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18 not found: ID does not exist"
containerID="3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18" Apr 19 12:35:07.352788 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.352683 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"} err="failed to get container status \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": rpc error: code = NotFound desc = could not find container \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": container with ID starting with 3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18 not found: ID does not exist" Apr 19 12:35:07.352788 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.352705 2583 scope.go:117] "RemoveContainer" containerID="1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba" Apr 19 12:35:07.352994 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:35:07.352969 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": container with ID starting with 1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba not found: ID does not exist" containerID="1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba" Apr 19 12:35:07.353035 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.352999 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"} err="failed to get container status \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": rpc error: code = NotFound desc = could not find container \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": container with ID starting with 1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba not found: ID does not exist" Apr 19 
12:35:07.353035 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.353014 2583 scope.go:117] "RemoveContainer" containerID="7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"
Apr 19 12:35:07.353228 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:35:07.353203 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": container with ID starting with 7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390 not found: ID does not exist" containerID="7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"
Apr 19 12:35:07.353272 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.353237 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"} err="failed to get container status \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": rpc error: code = NotFound desc = could not find container \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": container with ID starting with 7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390 not found: ID does not exist"
Apr 19 12:35:07.353272 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.353256 2583 scope.go:117] "RemoveContainer" containerID="3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"
Apr 19 12:35:07.353578 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:35:07.353552 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": container with ID starting with 3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85 not found: ID does not exist" containerID="3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"
Apr 19 12:35:07.353643 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.353593 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"} err="failed to get container status \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": rpc error: code = NotFound desc = could not find container \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": container with ID starting with 3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85 not found: ID does not exist"
Apr 19 12:35:07.353643 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.353617 2583 scope.go:117] "RemoveContainer" containerID="1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"
Apr 19 12:35:07.353906 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:35:07.353881 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": container with ID starting with 1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc not found: ID does not exist" containerID="1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"
Apr 19 12:35:07.353947 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.353915 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"} err="failed to get container status \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": rpc error: code = NotFound desc = could not find container \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": container with ID starting with 1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc not found: ID does not exist"
Apr 19 12:35:07.353947 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.353939 2583 scope.go:117] "RemoveContainer" containerID="43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"
Apr 19 12:35:07.354216 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.354197 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"} err="failed to get container status \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": rpc error: code = NotFound desc = could not find container \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": container with ID starting with 43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91 not found: ID does not exist"
Apr 19 12:35:07.354216 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.354216 2583 scope.go:117] "RemoveContainer" containerID="013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"
Apr 19 12:35:07.354449 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.354431 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"} err="failed to get container status \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": rpc error: code = NotFound desc = could not find container \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": container with ID starting with 013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468 not found: ID does not exist"
Apr 19 12:35:07.354494 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.354450 2583 scope.go:117] "RemoveContainer" containerID="3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"
Apr 19 12:35:07.354712 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.354658 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"} err="failed to get container status \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": rpc error: code = NotFound desc = could not find container \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": container with ID starting with 3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18 not found: ID does not exist"
Apr 19 12:35:07.354712 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.354674 2583 scope.go:117] "RemoveContainer" containerID="1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"
Apr 19 12:35:07.354911 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.354886 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"} err="failed to get container status \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": rpc error: code = NotFound desc = could not find container \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": container with ID starting with 1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba not found: ID does not exist"
Apr 19 12:35:07.354963 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.354914 2583 scope.go:117] "RemoveContainer" containerID="7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"
Apr 19 12:35:07.354963 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.354904 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:35:07.355302 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.355153 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"} err="failed to get container status \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": rpc error: code = NotFound desc = could not find container \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": container with ID starting with 7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390 not found: ID does not exist"
Apr 19 12:35:07.355302 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.355180 2583 scope.go:117] "RemoveContainer" containerID="3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"
Apr 19 12:35:07.355609 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.355576 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"} err="failed to get container status \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": rpc error: code = NotFound desc = could not find container \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": container with ID starting with 3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85 not found: ID does not exist"
Apr 19 12:35:07.355706 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.355612 2583 scope.go:117] "RemoveContainer" containerID="1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"
Apr 19 12:35:07.356078 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356053 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"} err="failed to get container status
\"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": rpc error: code = NotFound desc = could not find container \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": container with ID starting with 1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc not found: ID does not exist"
Apr 19 12:35:07.356162 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356079 2583 scope.go:117] "RemoveContainer" containerID="43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"
Apr 19 12:35:07.356364 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356333 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"} err="failed to get container status \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": rpc error: code = NotFound desc = could not find container \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": container with ID starting with 43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91 not found: ID does not exist"
Apr 19 12:35:07.356364 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356351 2583 scope.go:117] "RemoveContainer" containerID="013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"
Apr 19 12:35:07.356599 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356580 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"} err="failed to get container status \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": rpc error: code = NotFound desc = could not find container \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": container with ID starting with 013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468 not found: ID does not exist"
Apr 19 12:35:07.356651 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356602 2583 scope.go:117] "RemoveContainer" containerID="3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"
Apr 19 12:35:07.356838 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356818 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"} err="failed to get container status \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": rpc error: code = NotFound desc = could not find container \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": container with ID starting with 3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18 not found: ID does not exist"
Apr 19 12:35:07.356838 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356839 2583 scope.go:117] "RemoveContainer" containerID="1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"
Apr 19 12:35:07.357088 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356926 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 19 12:35:07.357088 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.356926 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 19 12:35:07.357234 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357095 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"} err="failed to get container status \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": rpc error: code = NotFound desc = could not find container \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": container with ID starting with 1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba not found: ID does not exist"
Apr 19 12:35:07.357234 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357114 2583 scope.go:117] "RemoveContainer" containerID="7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"
Apr 19 12:35:07.357401 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357381 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"} err="failed to get container status \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": rpc error: code = NotFound desc = could not find container \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": container with ID starting with 7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390 not found: ID does not exist"
Apr 19 12:35:07.357401 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357402 2583 scope.go:117] "RemoveContainer" containerID="3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"
Apr 19 12:35:07.357650 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357607 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 19 12:35:07.357650 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357598 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"} err="failed to get container status \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": rpc error: code = NotFound desc = could not find container \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": container with ID starting with 3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85 not found: ID does not exist"
Apr 19 12:35:07.357650 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357630 2583 scope.go:117] "RemoveContainer" containerID="1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"
Apr 19 12:35:07.357905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357669 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-dbtjou4v04k56\""
Apr 19 12:35:07.357905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357709 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 19 12:35:07.357905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357722 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 19 12:35:07.357905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357757 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 19 12:35:07.357905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357870 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 19 12:35:07.358309 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.357918 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 19 12:35:07.358309 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358034 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 19 12:35:07.358309 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358076 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 19 12:35:07.358309 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358166 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 19 12:35:07.358309 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358214 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"} err="failed to get container status \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": rpc error: code = NotFound desc = could not find container \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": container with ID starting with 1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc not found: ID does not exist"
Apr 19 12:35:07.358309 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358236 2583 scope.go:117] "RemoveContainer" containerID="43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"
Apr 19 12:35:07.358309 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358254 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hk7cx\""
Apr 19 12:35:07.358729 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358544 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"} err="failed to get container status \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": rpc error: code = NotFound desc = could not find container \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": container with ID starting with 43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91 not found: ID does not exist"
Apr 19 12:35:07.358729 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358586 2583 scope.go:117] "RemoveContainer" containerID="013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"
Apr 19 12:35:07.358925 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358883 2583
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"} err="failed to get container status \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": rpc error: code = NotFound desc = could not find container \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": container with ID starting with 013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468 not found: ID does not exist"
Apr 19 12:35:07.358925 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.358919 2583 scope.go:117] "RemoveContainer" containerID="3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"
Apr 19 12:35:07.359341 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.359223 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"} err="failed to get container status \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": rpc error: code = NotFound desc = could not find container \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": container with ID starting with 3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18 not found: ID does not exist"
Apr 19 12:35:07.359341 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.359253 2583 scope.go:117] "RemoveContainer" containerID="1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"
Apr 19 12:35:07.359964 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.359937 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"} err="failed to get container status \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": rpc error: code = NotFound desc = could not find container \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": container with ID starting with 1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba not found: ID does not exist"
Apr 19 12:35:07.360055 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.359964 2583 scope.go:117] "RemoveContainer" containerID="7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"
Apr 19 12:35:07.360768 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.360454 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 19 12:35:07.360768 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.360656 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"} err="failed to get container status \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": rpc error: code = NotFound desc = could not find container \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": container with ID starting with 7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390 not found: ID does not exist"
Apr 19 12:35:07.360768 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.360678 2583 scope.go:117] "RemoveContainer" containerID="3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"
Apr 19 12:35:07.361192 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.361096 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"} err="failed to get container status \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": rpc error: code = NotFound desc = could not find container \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": container with ID starting with 3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85 not found: ID does not exist"
Apr 19 12:35:07.361192 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.361123 2583 scope.go:117] "RemoveContainer" containerID="1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"
Apr 19 12:35:07.361441 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.361416 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"} err="failed to get container status \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": rpc error: code = NotFound desc = could not find container \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": container with ID starting with 1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc not found: ID does not exist"
Apr 19 12:35:07.361510 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.361446 2583 scope.go:117] "RemoveContainer" containerID="43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"
Apr 19 12:35:07.361902 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.361791 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"} err="failed to get container status \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": rpc error: code = NotFound desc = could not find container \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": container with ID starting with 43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91 not found: ID does not exist"
Apr 19 12:35:07.361902 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.361815 2583 scope.go:117] "RemoveContainer" containerID="013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"
Apr 19 12:35:07.362295 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.362267 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"} err="failed to get container status \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": rpc error: code = NotFound desc = could not find container \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": container with ID starting with 013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468 not found: ID does not exist"
Apr 19 12:35:07.362295 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.362295 2583 scope.go:117] "RemoveContainer" containerID="3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"
Apr 19 12:35:07.362562 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.362538 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"} err="failed to get container status \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": rpc error: code = NotFound desc = could not find container \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": container with ID starting with 3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18 not found: ID does not exist"
Apr 19 12:35:07.362687 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.362566 2583 scope.go:117] "RemoveContainer" containerID="1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"
Apr 19 12:35:07.362877 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.362817 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"} err="failed to get container status \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": rpc error: code = NotFound desc = could not find container \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": container with ID starting with 1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba not found: ID does not exist"
Apr 19 12:35:07.362877 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.362872 2583 scope.go:117] "RemoveContainer" containerID="7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"
Apr 19 12:35:07.363010 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.362831 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:35:07.363221 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.363163 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"} err="failed to get container status \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": rpc error: code = NotFound desc = could not find container \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": container with ID starting with 7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390 not found: ID does not exist"
Apr 19 12:35:07.363221 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.363220 2583 scope.go:117] "RemoveContainer" containerID="3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"
Apr 19 12:35:07.363536 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.363504 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"} err="failed to get container status \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": rpc error: code = NotFound desc = could not find container \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": container with ID starting with 3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85 not found: ID does not exist"
Apr 19 12:35:07.363536 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.363530 2583 scope.go:117] "RemoveContainer" containerID="1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc" Apr 19 12:35:07.363895 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.363829 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"} err="failed to get container status \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": rpc error: code = NotFound desc = could not find container \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": container with ID starting with 1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc not found: ID does not exist" Apr 19 12:35:07.363895 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.363895 2583 scope.go:117] "RemoveContainer" containerID="43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91" Apr 19 12:35:07.364210 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.364186 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91"} err="failed to get container status \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": rpc error: code = NotFound desc = could not find container \"43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91\": container with ID starting with 43e0afda6dfc62b88efeec393b50a93aada6b702e9022dcd940f1fd6ec719a91 not found: ID does not exist" Apr 19 12:35:07.364289 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.364212 2583 scope.go:117] "RemoveContainer" containerID="013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468" Apr 19 12:35:07.364506 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.364477 2583 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468"} err="failed to get container status \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": rpc error: code = NotFound desc = could not find container \"013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468\": container with ID starting with 013a3c9a91b5f362adb9593981a44c6c6f42adcb0451c3ace3392e3bffb69468 not found: ID does not exist" Apr 19 12:35:07.364616 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.364507 2583 scope.go:117] "RemoveContainer" containerID="3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18" Apr 19 12:35:07.364988 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.364923 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 19 12:35:07.365306 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.365287 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18"} err="failed to get container status \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": rpc error: code = NotFound desc = could not find container \"3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18\": container with ID starting with 3d6764a0da699b7017c5e00095ff762ab4526543833c3fd3682064576b494b18 not found: ID does not exist" Apr 19 12:35:07.365306 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.365307 2583 scope.go:117] "RemoveContainer" containerID="1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba" Apr 19 12:35:07.365576 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.365551 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba"} err="failed to get container status 
\"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": rpc error: code = NotFound desc = could not find container \"1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba\": container with ID starting with 1f4b80a2c943330614201be1ad4e0abde1a80fbcce8c941a316c0e2003f2c8ba not found: ID does not exist" Apr 19 12:35:07.365638 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.365577 2583 scope.go:117] "RemoveContainer" containerID="7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390" Apr 19 12:35:07.365810 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.365790 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390"} err="failed to get container status \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": rpc error: code = NotFound desc = could not find container \"7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390\": container with ID starting with 7acda596dd8e58fa6a0cf41307e0b663ea29194fdb4012f228d3ecf89cbc6390 not found: ID does not exist" Apr 19 12:35:07.365810 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.365810 2583 scope.go:117] "RemoveContainer" containerID="3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85" Apr 19 12:35:07.366059 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.366035 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85"} err="failed to get container status \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": rpc error: code = NotFound desc = could not find container \"3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85\": container with ID starting with 3e8a75fa347574495dae5e210f895fc81e07d207a85b5ac7c7b5eac494dfee85 not found: ID does not exist" Apr 19 12:35:07.366126 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:35:07.366061 2583 scope.go:117] "RemoveContainer" containerID="1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc" Apr 19 12:35:07.366318 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.366301 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc"} err="failed to get container status \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": rpc error: code = NotFound desc = could not find container \"1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc\": container with ID starting with 1c19a61188a17e4ae0b2df6ba48d264bc7b47e3df2c021b1162a96b66813e3fc not found: ID does not exist" Apr 19 12:35:07.484330 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484297 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-config\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484330 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484330 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484569 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484350 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2d481966-39bc-49fb-9c52-0fb57e05898e-config-out\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484569 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484404 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484569 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484467 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484569 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484505 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484569 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484531 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484569 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484557 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-web-config\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484786 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484593 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484786 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484638 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2d481966-39bc-49fb-9c52-0fb57e05898e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484786 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484676 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484786 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484705 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484786 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:35:07.484728 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.484786 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484776 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.485013 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484802 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2d481966-39bc-49fb-9c52-0fb57e05898e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.485013 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484831 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.485013 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484912 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.485013 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.484931 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwrd6\" (UniqueName: \"kubernetes.io/projected/2d481966-39bc-49fb-9c52-0fb57e05898e-kube-api-access-bwrd6\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586002 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.585901 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586002 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.585967 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586002 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586000 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586259 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:35:07.586031 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2d481966-39bc-49fb-9c52-0fb57e05898e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586259 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586241 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586359 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586307 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586359 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586342 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwrd6\" (UniqueName: \"kubernetes.io/projected/2d481966-39bc-49fb-9c52-0fb57e05898e-kube-api-access-bwrd6\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586457 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586371 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-config\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586457 
ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586399 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586457 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586427 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2d481966-39bc-49fb-9c52-0fb57e05898e-config-out\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586600 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586454 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586600 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586491 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586600 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586518 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586600 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586545 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586600 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586573 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-web-config\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586830 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586604 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586830 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586634 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2d481966-39bc-49fb-9c52-0fb57e05898e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.586830 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586672 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-kube-rbac-proxy\") pod 
\"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.587026 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.586922 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.587831 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.587169 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.587983 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.587933 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.589822 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.589199 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2d481966-39bc-49fb-9c52-0fb57e05898e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.589822 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.589289 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.589822 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.589511 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.590614 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.589948 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.590614 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.590012 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-config\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.590614 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.590048 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.590614 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.590353 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2d481966-39bc-49fb-9c52-0fb57e05898e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.590614 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.590453 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-web-config\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.590894 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.590873 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.591675 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.591650 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.592134 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.592105 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.592227 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.592187 2583 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2d481966-39bc-49fb-9c52-0fb57e05898e-config-out\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.592476 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.592456 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2d481966-39bc-49fb-9c52-0fb57e05898e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.592800 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.592779 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2d481966-39bc-49fb-9c52-0fb57e05898e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.594513 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.594493 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwrd6\" (UniqueName: \"kubernetes.io/projected/2d481966-39bc-49fb-9c52-0fb57e05898e-kube-api-access-bwrd6\") pod \"prometheus-k8s-0\" (UID: \"2d481966-39bc-49fb-9c52-0fb57e05898e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.667530 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.667499 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:07.793525 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:07.793496 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 19 12:35:07.796480 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:35:07.796444 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d481966_39bc_49fb_9c52_0fb57e05898e.slice/crio-753d1d94247e81fa5476cb4e3f1a7ef0407583053e53b746d053378fe61cadc5 WatchSource:0}: Error finding container 753d1d94247e81fa5476cb4e3f1a7ef0407583053e53b746d053378fe61cadc5: Status 404 returned error can't find the container with id 753d1d94247e81fa5476cb4e3f1a7ef0407583053e53b746d053378fe61cadc5 Apr 19 12:35:08.299895 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:08.299834 2583 generic.go:358] "Generic (PLEG): container finished" podID="2d481966-39bc-49fb-9c52-0fb57e05898e" containerID="f765f0230a5547b06e050aa11f74f4f4d19f396bbaf308e6ccd5711254443178" exitCode=0 Apr 19 12:35:08.299895 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:08.299883 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2d481966-39bc-49fb-9c52-0fb57e05898e","Type":"ContainerDied","Data":"f765f0230a5547b06e050aa11f74f4f4d19f396bbaf308e6ccd5711254443178"} Apr 19 12:35:08.300316 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:08.299915 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2d481966-39bc-49fb-9c52-0fb57e05898e","Type":"ContainerStarted","Data":"753d1d94247e81fa5476cb4e3f1a7ef0407583053e53b746d053378fe61cadc5"} Apr 19 12:35:09.306776 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:09.306744 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"2d481966-39bc-49fb-9c52-0fb57e05898e","Type":"ContainerStarted","Data":"4bc3617f2eb144df88953a088685ac49b167690b885d056a1eba79f6b6f081af"} Apr 19 12:35:09.306776 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:09.306777 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2d481966-39bc-49fb-9c52-0fb57e05898e","Type":"ContainerStarted","Data":"27a453a12addf7196af8acb13016067a83efcf617ba157cbcb1096bd9c4982ba"} Apr 19 12:35:09.307170 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:09.306787 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2d481966-39bc-49fb-9c52-0fb57e05898e","Type":"ContainerStarted","Data":"136bb3113b53d5a7d1b0f1dc5e75236434f5f0fa1c997bbbc7fd62016df6fc79"} Apr 19 12:35:09.307170 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:09.306811 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2d481966-39bc-49fb-9c52-0fb57e05898e","Type":"ContainerStarted","Data":"6767a820e8c815fc3607ae40e67cc9b3938ee36af18c4d44e207970fa4d78129"} Apr 19 12:35:09.307170 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:09.306819 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2d481966-39bc-49fb-9c52-0fb57e05898e","Type":"ContainerStarted","Data":"2d8a55711adbb4a090e2884cd2dfbde0ccfc817462ba59d4d70a0c96fe74803e"} Apr 19 12:35:09.307170 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:09.306827 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2d481966-39bc-49fb-9c52-0fb57e05898e","Type":"ContainerStarted","Data":"bb2d4ac7036aeec7d474f98881282ea99dbbc9e022f8db5cfcf73674d87866bc"} Apr 19 12:35:09.331904 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:09.331828 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.331813362 podStartE2EDuration="2.331813362s" podCreationTimestamp="2026-04-19 12:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:35:09.330529898 +0000 UTC m=+264.568180160" watchObservedRunningTime="2026-04-19 12:35:09.331813362 +0000 UTC m=+264.569463624" Apr 19 12:35:12.667960 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:12.667932 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:35:21.460143 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:21.460104 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d5ff5c7dc-n9zns"] Apr 19 12:35:45.233739 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:45.233711 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log" Apr 19 12:35:45.234364 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:45.233745 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log" Apr 19 12:35:45.240992 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:45.240970 2583 kubelet.go:1628] "Image garbage collection succeeded" Apr 19 12:35:46.484968 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.484903 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d5ff5c7dc-n9zns" podUID="6b93b671-d9eb-461a-accb-0c88ae9d22e4" containerName="console" containerID="cri-o://749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de" gracePeriod=15 Apr 19 12:35:46.724117 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.724094 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-6d5ff5c7dc-n9zns_6b93b671-d9eb-461a-accb-0c88ae9d22e4/console/0.log" Apr 19 12:35:46.724226 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.724157 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:35:46.803976 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.803899 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-trusted-ca-bundle\") pod \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " Apr 19 12:35:46.803976 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.803931 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-serving-cert\") pod \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " Apr 19 12:35:46.803976 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.803966 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-config\") pod \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " Apr 19 12:35:46.804240 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.803997 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-service-ca\") pod \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " Apr 19 12:35:46.804240 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.804027 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-oauth-config\") pod \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " Apr 19 12:35:46.804240 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.804053 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-oauth-serving-cert\") pod \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " Apr 19 12:35:46.804240 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.804083 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdgt7\" (UniqueName: \"kubernetes.io/projected/6b93b671-d9eb-461a-accb-0c88ae9d22e4-kube-api-access-qdgt7\") pod \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\" (UID: \"6b93b671-d9eb-461a-accb-0c88ae9d22e4\") " Apr 19 12:35:46.804433 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.804402 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6b93b671-d9eb-461a-accb-0c88ae9d22e4" (UID: "6b93b671-d9eb-461a-accb-0c88ae9d22e4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:35:46.804479 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.804437 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-service-ca" (OuterVolumeSpecName: "service-ca") pod "6b93b671-d9eb-461a-accb-0c88ae9d22e4" (UID: "6b93b671-d9eb-461a-accb-0c88ae9d22e4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:35:46.804555 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.804526 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-config" (OuterVolumeSpecName: "console-config") pod "6b93b671-d9eb-461a-accb-0c88ae9d22e4" (UID: "6b93b671-d9eb-461a-accb-0c88ae9d22e4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:35:46.804555 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.804536 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6b93b671-d9eb-461a-accb-0c88ae9d22e4" (UID: "6b93b671-d9eb-461a-accb-0c88ae9d22e4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:35:46.806243 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.806211 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6b93b671-d9eb-461a-accb-0c88ae9d22e4" (UID: "6b93b671-d9eb-461a-accb-0c88ae9d22e4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:35:46.806355 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.806319 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6b93b671-d9eb-461a-accb-0c88ae9d22e4" (UID: "6b93b671-d9eb-461a-accb-0c88ae9d22e4"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:35:46.806355 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.806335 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b93b671-d9eb-461a-accb-0c88ae9d22e4-kube-api-access-qdgt7" (OuterVolumeSpecName: "kube-api-access-qdgt7") pod "6b93b671-d9eb-461a-accb-0c88ae9d22e4" (UID: "6b93b671-d9eb-461a-accb-0c88ae9d22e4"). InnerVolumeSpecName "kube-api-access-qdgt7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:35:46.905661 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.905619 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-trusted-ca-bundle\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:35:46.905661 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.905654 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-serving-cert\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:35:46.905661 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.905665 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-config\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:35:46.905887 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.905683 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-service-ca\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:35:46.905887 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.905691 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6b93b671-d9eb-461a-accb-0c88ae9d22e4-console-oauth-config\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:35:46.905887 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.905700 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b93b671-d9eb-461a-accb-0c88ae9d22e4-oauth-serving-cert\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:35:46.905887 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:46.905708 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdgt7\" (UniqueName: \"kubernetes.io/projected/6b93b671-d9eb-461a-accb-0c88ae9d22e4-kube-api-access-qdgt7\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:35:47.417501 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.417477 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d5ff5c7dc-n9zns_6b93b671-d9eb-461a-accb-0c88ae9d22e4/console/0.log" Apr 19 12:35:47.417657 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.417516 2583 generic.go:358] "Generic (PLEG): container finished" podID="6b93b671-d9eb-461a-accb-0c88ae9d22e4" containerID="749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de" exitCode=2 Apr 19 12:35:47.417657 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.417583 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d5ff5c7dc-n9zns" Apr 19 12:35:47.417657 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.417602 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5ff5c7dc-n9zns" event={"ID":"6b93b671-d9eb-461a-accb-0c88ae9d22e4","Type":"ContainerDied","Data":"749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de"} Apr 19 12:35:47.417657 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.417644 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5ff5c7dc-n9zns" event={"ID":"6b93b671-d9eb-461a-accb-0c88ae9d22e4","Type":"ContainerDied","Data":"9f28347f5978e0c0803210487b82288f3b4307a8b92d7d0100134d3b1df93dee"} Apr 19 12:35:47.417802 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.417663 2583 scope.go:117] "RemoveContainer" containerID="749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de" Apr 19 12:35:47.425439 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.425416 2583 scope.go:117] "RemoveContainer" containerID="749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de" Apr 19 12:35:47.425670 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:35:47.425653 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de\": container with ID starting with 749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de not found: ID does not exist" containerID="749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de" Apr 19 12:35:47.425727 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.425678 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de"} err="failed to get container status \"749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de\": rpc error: code = 
NotFound desc = could not find container \"749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de\": container with ID starting with 749cff106da792c0c96f01ab324009af0bb933a1d6cf352b0700d1d9e57124de not found: ID does not exist" Apr 19 12:35:47.432820 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.432800 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d5ff5c7dc-n9zns"] Apr 19 12:35:47.436283 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:47.436261 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d5ff5c7dc-n9zns"] Apr 19 12:35:49.327317 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:35:49.327280 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b93b671-d9eb-461a-accb-0c88ae9d22e4" path="/var/lib/kubelet/pods/6b93b671-d9eb-461a-accb-0c88ae9d22e4/volumes" Apr 19 12:36:07.668475 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:07.668431 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:36:07.684621 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:07.684594 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:36:08.492950 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:08.492910 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:36:58.860982 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.860836 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jshbm"] Apr 19 12:36:58.861462 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.861187 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b93b671-d9eb-461a-accb-0c88ae9d22e4" containerName="console" Apr 19 12:36:58.861462 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.861200 2583 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6b93b671-d9eb-461a-accb-0c88ae9d22e4" containerName="console" Apr 19 12:36:58.861462 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.861255 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b93b671-d9eb-461a-accb-0c88ae9d22e4" containerName="console" Apr 19 12:36:58.863937 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.863921 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:58.865890 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.865872 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 19 12:36:58.871732 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.871710 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jshbm"] Apr 19 12:36:58.965403 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.965358 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/264963bd-7b49-4b81-8bf7-ed2bb6b6df20-original-pull-secret\") pod \"global-pull-secret-syncer-jshbm\" (UID: \"264963bd-7b49-4b81-8bf7-ed2bb6b6df20\") " pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:58.965550 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.965419 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/264963bd-7b49-4b81-8bf7-ed2bb6b6df20-kubelet-config\") pod \"global-pull-secret-syncer-jshbm\" (UID: \"264963bd-7b49-4b81-8bf7-ed2bb6b6df20\") " pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:58.965550 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:58.965449 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" 
(UniqueName: \"kubernetes.io/host-path/264963bd-7b49-4b81-8bf7-ed2bb6b6df20-dbus\") pod \"global-pull-secret-syncer-jshbm\" (UID: \"264963bd-7b49-4b81-8bf7-ed2bb6b6df20\") " pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:59.066428 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.066403 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/264963bd-7b49-4b81-8bf7-ed2bb6b6df20-kubelet-config\") pod \"global-pull-secret-syncer-jshbm\" (UID: \"264963bd-7b49-4b81-8bf7-ed2bb6b6df20\") " pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:59.066575 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.066443 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/264963bd-7b49-4b81-8bf7-ed2bb6b6df20-dbus\") pod \"global-pull-secret-syncer-jshbm\" (UID: \"264963bd-7b49-4b81-8bf7-ed2bb6b6df20\") " pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:59.066575 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.066516 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/264963bd-7b49-4b81-8bf7-ed2bb6b6df20-kubelet-config\") pod \"global-pull-secret-syncer-jshbm\" (UID: \"264963bd-7b49-4b81-8bf7-ed2bb6b6df20\") " pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:59.066575 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.066504 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/264963bd-7b49-4b81-8bf7-ed2bb6b6df20-original-pull-secret\") pod \"global-pull-secret-syncer-jshbm\" (UID: \"264963bd-7b49-4b81-8bf7-ed2bb6b6df20\") " pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:59.066724 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.066684 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/264963bd-7b49-4b81-8bf7-ed2bb6b6df20-dbus\") pod \"global-pull-secret-syncer-jshbm\" (UID: \"264963bd-7b49-4b81-8bf7-ed2bb6b6df20\") " pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:59.068815 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.068792 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/264963bd-7b49-4b81-8bf7-ed2bb6b6df20-original-pull-secret\") pod \"global-pull-secret-syncer-jshbm\" (UID: \"264963bd-7b49-4b81-8bf7-ed2bb6b6df20\") " pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:59.173313 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.173229 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jshbm" Apr 19 12:36:59.292539 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.292507 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jshbm"] Apr 19 12:36:59.295885 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:36:59.295836 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod264963bd_7b49_4b81_8bf7_ed2bb6b6df20.slice/crio-2dace64282cf417b2637e3649b418a8dfe9755971d22c894f4df7bb56734d50e WatchSource:0}: Error finding container 2dace64282cf417b2637e3649b418a8dfe9755971d22c894f4df7bb56734d50e: Status 404 returned error can't find the container with id 2dace64282cf417b2637e3649b418a8dfe9755971d22c894f4df7bb56734d50e Apr 19 12:36:59.297442 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.297425 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:36:59.615909 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:36:59.615867 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-jshbm" event={"ID":"264963bd-7b49-4b81-8bf7-ed2bb6b6df20","Type":"ContainerStarted","Data":"2dace64282cf417b2637e3649b418a8dfe9755971d22c894f4df7bb56734d50e"} Apr 19 12:37:04.632597 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:04.632557 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jshbm" event={"ID":"264963bd-7b49-4b81-8bf7-ed2bb6b6df20","Type":"ContainerStarted","Data":"a8879a118a9dccc0ec2450271e69008c7e886895b6ca2522acdff1285ef8d26c"} Apr 19 12:37:04.646728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:04.646680 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jshbm" podStartSLOduration=2.151722551 podStartE2EDuration="6.646666483s" podCreationTimestamp="2026-04-19 12:36:58 +0000 UTC" firstStartedPulling="2026-04-19 12:36:59.297549153 +0000 UTC m=+374.535199396" lastFinishedPulling="2026-04-19 12:37:03.792493085 +0000 UTC m=+379.030143328" observedRunningTime="2026-04-19 12:37:04.645323847 +0000 UTC m=+379.882974118" watchObservedRunningTime="2026-04-19 12:37:04.646666483 +0000 UTC m=+379.884316743" Apr 19 12:37:55.287511 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.287473 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn"] Apr 19 12:37:55.291609 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.291587 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" Apr 19 12:37:55.293434 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.293417 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 19 12:37:55.293797 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.293775 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:37:55.293907 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.293780 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-kbtm5\"" Apr 19 12:37:55.296538 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.296515 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn"] Apr 19 12:37:55.384254 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.384228 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zqc6\" (UniqueName: \"kubernetes.io/projected/0607ed45-9366-4f10-8352-5a266131218d-kube-api-access-7zqc6\") pod \"openshift-lws-operator-bfc7f696d-f7skn\" (UID: \"0607ed45-9366-4f10-8352-5a266131218d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" Apr 19 12:37:55.384460 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.384279 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0607ed45-9366-4f10-8352-5a266131218d-tmp\") pod \"openshift-lws-operator-bfc7f696d-f7skn\" (UID: \"0607ed45-9366-4f10-8352-5a266131218d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" Apr 19 12:37:55.485576 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.485546 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0607ed45-9366-4f10-8352-5a266131218d-tmp\") pod \"openshift-lws-operator-bfc7f696d-f7skn\" (UID: \"0607ed45-9366-4f10-8352-5a266131218d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" Apr 19 12:37:55.485705 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.485610 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zqc6\" (UniqueName: \"kubernetes.io/projected/0607ed45-9366-4f10-8352-5a266131218d-kube-api-access-7zqc6\") pod \"openshift-lws-operator-bfc7f696d-f7skn\" (UID: \"0607ed45-9366-4f10-8352-5a266131218d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" Apr 19 12:37:55.485948 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.485928 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0607ed45-9366-4f10-8352-5a266131218d-tmp\") pod \"openshift-lws-operator-bfc7f696d-f7skn\" (UID: \"0607ed45-9366-4f10-8352-5a266131218d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" Apr 19 12:37:55.493799 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.493768 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zqc6\" (UniqueName: \"kubernetes.io/projected/0607ed45-9366-4f10-8352-5a266131218d-kube-api-access-7zqc6\") pod \"openshift-lws-operator-bfc7f696d-f7skn\" (UID: \"0607ed45-9366-4f10-8352-5a266131218d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" Apr 19 12:37:55.600918 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.600863 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" Apr 19 12:37:55.718452 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.718419 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn"] Apr 19 12:37:55.722160 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:37:55.722138 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0607ed45_9366_4f10_8352_5a266131218d.slice/crio-db90a62065d8536c17c7f3cfc00d0e821d60498ec9cee66f555c9e910299a35b WatchSource:0}: Error finding container db90a62065d8536c17c7f3cfc00d0e821d60498ec9cee66f555c9e910299a35b: Status 404 returned error can't find the container with id db90a62065d8536c17c7f3cfc00d0e821d60498ec9cee66f555c9e910299a35b Apr 19 12:37:55.777972 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:55.777942 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" event={"ID":"0607ed45-9366-4f10-8352-5a266131218d","Type":"ContainerStarted","Data":"db90a62065d8536c17c7f3cfc00d0e821d60498ec9cee66f555c9e910299a35b"} Apr 19 12:37:58.793082 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:58.793001 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" event={"ID":"0607ed45-9366-4f10-8352-5a266131218d","Type":"ContainerStarted","Data":"1b45eb800c9e4bef575f12297568137c2a4deaedda69f2c75a302147bb710214"} Apr 19 12:37:58.807512 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:37:58.807467 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-f7skn" podStartSLOduration=1.379693348 podStartE2EDuration="3.807453677s" podCreationTimestamp="2026-04-19 12:37:55 +0000 UTC" firstStartedPulling="2026-04-19 12:37:55.72351792 +0000 UTC m=+430.961168158" 
lastFinishedPulling="2026-04-19 12:37:58.151278233 +0000 UTC m=+433.388928487" observedRunningTime="2026-04-19 12:37:58.806577253 +0000 UTC m=+434.044227525" watchObservedRunningTime="2026-04-19 12:37:58.807453677 +0000 UTC m=+434.045103938" Apr 19 12:38:09.703008 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.702979 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq"] Apr 19 12:38:09.706115 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.706099 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.708738 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.708719 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 19 12:38:09.708870 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.708717 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-dtr48\"" Apr 19 12:38:09.708870 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.708755 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 19 12:38:09.708870 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.708816 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 19 12:38:09.712229 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.712206 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq"] Apr 19 12:38:09.792388 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.792363 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d39c5544-0e4d-468b-b2d9-635b760bb77b-cert\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.792489 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.792392 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d39c5544-0e4d-468b-b2d9-635b760bb77b-manager-config\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.792489 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.792466 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d39c5544-0e4d-468b-b2d9-635b760bb77b-metrics-cert\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.792489 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.792486 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5b8d\" (UniqueName: \"kubernetes.io/projected/d39c5544-0e4d-468b-b2d9-635b760bb77b-kube-api-access-h5b8d\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.893493 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.893467 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d39c5544-0e4d-468b-b2d9-635b760bb77b-metrics-cert\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " 
pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.893584 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.893496 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5b8d\" (UniqueName: \"kubernetes.io/projected/d39c5544-0e4d-468b-b2d9-635b760bb77b-kube-api-access-h5b8d\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.893584 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.893524 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d39c5544-0e4d-468b-b2d9-635b760bb77b-cert\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.893584 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.893554 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d39c5544-0e4d-468b-b2d9-635b760bb77b-manager-config\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.894283 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.894218 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d39c5544-0e4d-468b-b2d9-635b760bb77b-manager-config\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.896077 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.896058 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/d39c5544-0e4d-468b-b2d9-635b760bb77b-cert\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.896249 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.896229 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d39c5544-0e4d-468b-b2d9-635b760bb77b-metrics-cert\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:09.901054 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:09.901037 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5b8d\" (UniqueName: \"kubernetes.io/projected/d39c5544-0e4d-468b-b2d9-635b760bb77b-kube-api-access-h5b8d\") pod \"lws-controller-manager-844f57dbd6-9q4qq\" (UID: \"d39c5544-0e4d-468b-b2d9-635b760bb77b\") " pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:10.016113 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:10.016055 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:10.134591 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:10.134491 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq"] Apr 19 12:38:10.137494 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:38:10.137469 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd39c5544_0e4d_468b_b2d9_635b760bb77b.slice/crio-34d1a17b87cce55ecab8268aa465c293295320f45dbbfe5803acb65480ee6eec WatchSource:0}: Error finding container 34d1a17b87cce55ecab8268aa465c293295320f45dbbfe5803acb65480ee6eec: Status 404 returned error can't find the container with id 34d1a17b87cce55ecab8268aa465c293295320f45dbbfe5803acb65480ee6eec Apr 19 12:38:10.830098 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:10.830064 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" event={"ID":"d39c5544-0e4d-468b-b2d9-635b760bb77b","Type":"ContainerStarted","Data":"34d1a17b87cce55ecab8268aa465c293295320f45dbbfe5803acb65480ee6eec"} Apr 19 12:38:11.835372 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:11.835281 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" event={"ID":"d39c5544-0e4d-468b-b2d9-635b760bb77b","Type":"ContainerStarted","Data":"c16858a26135bfbfcef36b429b4b73c18342d59d174fdadcceaeecf057746290"} Apr 19 12:38:11.835372 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:11.835335 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:11.852745 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:11.852693 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" podStartSLOduration=1.435529664 podStartE2EDuration="2.852674831s" podCreationTimestamp="2026-04-19 12:38:09 +0000 UTC" firstStartedPulling="2026-04-19 12:38:10.139666857 +0000 UTC m=+445.377317098" lastFinishedPulling="2026-04-19 12:38:11.556812025 +0000 UTC m=+446.794462265" observedRunningTime="2026-04-19 12:38:11.851597244 +0000 UTC m=+447.089247496" watchObservedRunningTime="2026-04-19 12:38:11.852674831 +0000 UTC m=+447.090325122" Apr 19 12:38:14.579149 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.579112 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw"] Apr 19 12:38:14.582357 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.582341 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.585397 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.585375 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 19 12:38:14.585680 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.585660 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 19 12:38:14.585906 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.585881 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 19 12:38:14.586008 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.585946 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-scmp2\"" Apr 19 12:38:14.586177 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.586156 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 19 12:38:14.609577 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.609556 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw"] Apr 19 12:38:14.727604 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.727580 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c584614-1441-4ab1-a7c3-1df91d5bd84e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-vkdlw\" (UID: \"0c584614-1441-4ab1-a7c3-1df91d5bd84e\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.727710 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.727621 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zft6z\" (UniqueName: \"kubernetes.io/projected/0c584614-1441-4ab1-a7c3-1df91d5bd84e-kube-api-access-zft6z\") pod \"opendatahub-operator-controller-manager-9ff869b6b-vkdlw\" (UID: \"0c584614-1441-4ab1-a7c3-1df91d5bd84e\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.727751 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.727728 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c584614-1441-4ab1-a7c3-1df91d5bd84e-webhook-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-vkdlw\" (UID: \"0c584614-1441-4ab1-a7c3-1df91d5bd84e\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.828250 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.828224 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zft6z\" (UniqueName: 
\"kubernetes.io/projected/0c584614-1441-4ab1-a7c3-1df91d5bd84e-kube-api-access-zft6z\") pod \"opendatahub-operator-controller-manager-9ff869b6b-vkdlw\" (UID: \"0c584614-1441-4ab1-a7c3-1df91d5bd84e\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.828353 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.828283 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c584614-1441-4ab1-a7c3-1df91d5bd84e-webhook-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-vkdlw\" (UID: \"0c584614-1441-4ab1-a7c3-1df91d5bd84e\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.828353 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.828306 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c584614-1441-4ab1-a7c3-1df91d5bd84e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-vkdlw\" (UID: \"0c584614-1441-4ab1-a7c3-1df91d5bd84e\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.831047 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.830987 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c584614-1441-4ab1-a7c3-1df91d5bd84e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-vkdlw\" (UID: \"0c584614-1441-4ab1-a7c3-1df91d5bd84e\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.831047 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.830993 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c584614-1441-4ab1-a7c3-1df91d5bd84e-webhook-cert\") pod \"opendatahub-operator-controller-manager-9ff869b6b-vkdlw\" (UID: 
\"0c584614-1441-4ab1-a7c3-1df91d5bd84e\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.835365 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.835342 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zft6z\" (UniqueName: \"kubernetes.io/projected/0c584614-1441-4ab1-a7c3-1df91d5bd84e-kube-api-access-zft6z\") pod \"opendatahub-operator-controller-manager-9ff869b6b-vkdlw\" (UID: \"0c584614-1441-4ab1-a7c3-1df91d5bd84e\") " pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:14.892436 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:14.892406 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:15.016657 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:15.016629 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw"] Apr 19 12:38:15.020161 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:38:15.020111 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c584614_1441_4ab1_a7c3_1df91d5bd84e.slice/crio-03981a675f02bf393ac09ce591bc4fb92d26b5720b4738724f49a60aa48ccb9b WatchSource:0}: Error finding container 03981a675f02bf393ac09ce591bc4fb92d26b5720b4738724f49a60aa48ccb9b: Status 404 returned error can't find the container with id 03981a675f02bf393ac09ce591bc4fb92d26b5720b4738724f49a60aa48ccb9b Apr 19 12:38:15.850273 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:15.850234 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" event={"ID":"0c584614-1441-4ab1-a7c3-1df91d5bd84e","Type":"ContainerStarted","Data":"03981a675f02bf393ac09ce591bc4fb92d26b5720b4738724f49a60aa48ccb9b"} Apr 19 12:38:17.858695 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:38:17.858614 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" event={"ID":"0c584614-1441-4ab1-a7c3-1df91d5bd84e","Type":"ContainerStarted","Data":"c031cb22a5b619feb0f5eddcffab07b082790d73d5f3074f8a3c9fd3e980df1b"} Apr 19 12:38:17.859086 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:17.858747 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:38:17.887409 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:17.887371 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" podStartSLOduration=1.364024527 podStartE2EDuration="3.887357834s" podCreationTimestamp="2026-04-19 12:38:14 +0000 UTC" firstStartedPulling="2026-04-19 12:38:15.021822855 +0000 UTC m=+450.259473093" lastFinishedPulling="2026-04-19 12:38:17.545156161 +0000 UTC m=+452.782806400" observedRunningTime="2026-04-19 12:38:17.885514426 +0000 UTC m=+453.123164687" watchObservedRunningTime="2026-04-19 12:38:17.887357834 +0000 UTC m=+453.125008075" Apr 19 12:38:22.840687 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:22.840656 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-844f57dbd6-9q4qq" Apr 19 12:38:28.865583 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:38:28.865549 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-9ff869b6b-vkdlw" Apr 19 12:39:18.782571 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.782539 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"] Apr 19 12:39:18.786275 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.786257 2583 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.788250 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.788228 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-7ttq7\"" Apr 19 12:39:18.788333 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.788228 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 19 12:39:18.796323 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.796293 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"] Apr 19 12:39:18.884042 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.884020 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.884143 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.884053 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbnhr\" (UniqueName: \"kubernetes.io/projected/6550e017-0db5-4e24-bb67-58828cfb90dd-kube-api-access-zbnhr\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.884143 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.884083 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.884143 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.884109 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.884242 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.884155 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.884275 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.884241 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.884275 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.884256 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.884275 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.884271 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6550e017-0db5-4e24-bb67-58828cfb90dd-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.884366 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.884317 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985160 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985142 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985257 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985168 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985257 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985186 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6550e017-0db5-4e24-bb67-58828cfb90dd-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985257 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985206 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985257 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985246 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985422 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985278 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zbnhr\" (UniqueName: \"kubernetes.io/projected/6550e017-0db5-4e24-bb67-58828cfb90dd-kube-api-access-zbnhr\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985422 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985325 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985422 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985350 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985422 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985376 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985592 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985572 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985650 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985593 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985744 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985723 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985797 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985767 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" Apr 19 12:39:18.985971 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.985953 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6550e017-0db5-4e24-bb67-58828cfb90dd-istiod-ca-cert\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:18.987733 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.987708 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:18.987972 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.987955 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:18.996115 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.996086 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbnhr\" (UniqueName: \"kubernetes.io/projected/6550e017-0db5-4e24-bb67-58828cfb90dd-kube-api-access-zbnhr\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:18.996376 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:18.996356 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6550e017-0db5-4e24-bb67-58828cfb90dd-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffmj22\" (UID: \"6550e017-0db5-4e24-bb67-58828cfb90dd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:19.097691 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:19.097616 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:19.220383 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:19.220347 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"]
Apr 19 12:39:19.224528 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:39:19.224502 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6550e017_0db5_4e24_bb67_58828cfb90dd.slice/crio-2c75b994e8337f9f1783b8117944880e76963fa9631ca58ddceb3f9f41e8c673 WatchSource:0}: Error finding container 2c75b994e8337f9f1783b8117944880e76963fa9631ca58ddceb3f9f41e8c673: Status 404 returned error can't find the container with id 2c75b994e8337f9f1783b8117944880e76963fa9631ca58ddceb3f9f41e8c673
Apr 19 12:39:20.070213 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:20.070159 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" event={"ID":"6550e017-0db5-4e24-bb67-58828cfb90dd","Type":"ContainerStarted","Data":"2c75b994e8337f9f1783b8117944880e76963fa9631ca58ddceb3f9f41e8c673"}
Apr 19 12:39:21.698600 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:21.698558 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 19 12:39:21.698994 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:21.698631 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 19 12:39:21.698994 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:21.698657 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 19 12:39:22.077549 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:22.077461 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" event={"ID":"6550e017-0db5-4e24-bb67-58828cfb90dd","Type":"ContainerStarted","Data":"eaaa25a6b4c6c3bdecc2264deb40b968c2c28608f8762a65d7540424e8080da4"}
Apr 19 12:39:22.096886 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:22.096817 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22" podStartSLOduration=1.6249726820000001 podStartE2EDuration="4.096799508s" podCreationTimestamp="2026-04-19 12:39:18 +0000 UTC" firstStartedPulling="2026-04-19 12:39:19.22649315 +0000 UTC m=+514.464143400" lastFinishedPulling="2026-04-19 12:39:21.69831997 +0000 UTC m=+516.935970226" observedRunningTime="2026-04-19 12:39:22.094975617 +0000 UTC m=+517.332625879" watchObservedRunningTime="2026-04-19 12:39:22.096799508 +0000 UTC m=+517.334449771"
Apr 19 12:39:22.097800 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:22.097777 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:22.102497 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:22.102476 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:23.080455 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:23.080422 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:23.081370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:23.081349 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffmj22"
Apr 19 12:39:32.132963 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.132883 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8nn5m"]
Apr 19 12:39:32.135973 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.135957 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m"
Apr 19 12:39:32.137804 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.137778 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 19 12:39:32.137954 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.137829 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-lz8z7\""
Apr 19 12:39:32.138320 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.138302 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 19 12:39:32.145478 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.145457 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8nn5m"]
Apr 19 12:39:32.185166 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.185135 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6sc\" (UniqueName: \"kubernetes.io/projected/5dbbb334-0c1a-4835-ad7c-6e1c2b53339e-kube-api-access-qt6sc\") pod \"kuadrant-operator-catalog-8nn5m\" (UID: \"5dbbb334-0c1a-4835-ad7c-6e1c2b53339e\") " pod="kuadrant-system/kuadrant-operator-catalog-8nn5m"
Apr 19 12:39:32.286304 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.286269 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6sc\" (UniqueName: \"kubernetes.io/projected/5dbbb334-0c1a-4835-ad7c-6e1c2b53339e-kube-api-access-qt6sc\") pod \"kuadrant-operator-catalog-8nn5m\" (UID: \"5dbbb334-0c1a-4835-ad7c-6e1c2b53339e\") " pod="kuadrant-system/kuadrant-operator-catalog-8nn5m"
Apr 19 12:39:32.293941 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.293914 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6sc\" (UniqueName: \"kubernetes.io/projected/5dbbb334-0c1a-4835-ad7c-6e1c2b53339e-kube-api-access-qt6sc\") pod \"kuadrant-operator-catalog-8nn5m\" (UID: \"5dbbb334-0c1a-4835-ad7c-6e1c2b53339e\") " pod="kuadrant-system/kuadrant-operator-catalog-8nn5m"
Apr 19 12:39:32.445915 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.445885 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m"
Apr 19 12:39:32.498957 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.498821 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8nn5m"]
Apr 19 12:39:32.564491 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.564467 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8nn5m"]
Apr 19 12:39:32.567219 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:39:32.567191 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dbbb334_0c1a_4835_ad7c_6e1c2b53339e.slice/crio-19ff5a85d6fe78db4494b5bb1ab549250e4e408d084db80cc96966a13f883043 WatchSource:0}: Error finding container 19ff5a85d6fe78db4494b5bb1ab549250e4e408d084db80cc96966a13f883043: Status 404 returned error can't find the container with id 19ff5a85d6fe78db4494b5bb1ab549250e4e408d084db80cc96966a13f883043
Apr 19 12:39:32.705525 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.705456 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bf79f"]
Apr 19 12:39:32.709493 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.709476 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bf79f"
Apr 19 12:39:32.714756 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.714731 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bf79f"]
Apr 19 12:39:32.789646 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.789621 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx28m\" (UniqueName: \"kubernetes.io/projected/08e3e49e-bd5d-4465-a80f-4ec32e63b636-kube-api-access-zx28m\") pod \"kuadrant-operator-catalog-bf79f\" (UID: \"08e3e49e-bd5d-4465-a80f-4ec32e63b636\") " pod="kuadrant-system/kuadrant-operator-catalog-bf79f"
Apr 19 12:39:32.890637 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.890606 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx28m\" (UniqueName: \"kubernetes.io/projected/08e3e49e-bd5d-4465-a80f-4ec32e63b636-kube-api-access-zx28m\") pod \"kuadrant-operator-catalog-bf79f\" (UID: \"08e3e49e-bd5d-4465-a80f-4ec32e63b636\") " pod="kuadrant-system/kuadrant-operator-catalog-bf79f"
Apr 19 12:39:32.901592 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:32.901564 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx28m\" (UniqueName: \"kubernetes.io/projected/08e3e49e-bd5d-4465-a80f-4ec32e63b636-kube-api-access-zx28m\") pod \"kuadrant-operator-catalog-bf79f\" (UID: \"08e3e49e-bd5d-4465-a80f-4ec32e63b636\") " pod="kuadrant-system/kuadrant-operator-catalog-bf79f"
Apr 19 12:39:33.020042 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:33.019971 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bf79f"
Apr 19 12:39:33.121796 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:33.121759 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m" event={"ID":"5dbbb334-0c1a-4835-ad7c-6e1c2b53339e","Type":"ContainerStarted","Data":"19ff5a85d6fe78db4494b5bb1ab549250e4e408d084db80cc96966a13f883043"}
Apr 19 12:39:33.165256 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:33.165204 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bf79f"]
Apr 19 12:39:33.168875 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:39:33.168829 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e3e49e_bd5d_4465_a80f_4ec32e63b636.slice/crio-d295beb45bf42b2367fc34a98b84d6aba9a311918a5e6a2c1305e8dcc25473c8 WatchSource:0}: Error finding container d295beb45bf42b2367fc34a98b84d6aba9a311918a5e6a2c1305e8dcc25473c8: Status 404 returned error can't find the container with id d295beb45bf42b2367fc34a98b84d6aba9a311918a5e6a2c1305e8dcc25473c8
Apr 19 12:39:34.127491 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:34.127459 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bf79f" event={"ID":"08e3e49e-bd5d-4465-a80f-4ec32e63b636","Type":"ContainerStarted","Data":"d295beb45bf42b2367fc34a98b84d6aba9a311918a5e6a2c1305e8dcc25473c8"}
Apr 19 12:39:35.132133 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:35.132036 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m" event={"ID":"5dbbb334-0c1a-4835-ad7c-6e1c2b53339e","Type":"ContainerStarted","Data":"4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191"}
Apr 19 12:39:35.132133 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:35.132055 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m" podUID="5dbbb334-0c1a-4835-ad7c-6e1c2b53339e" containerName="registry-server" containerID="cri-o://4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191" gracePeriod=2
Apr 19 12:39:35.133395 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:35.133355 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bf79f" event={"ID":"08e3e49e-bd5d-4465-a80f-4ec32e63b636","Type":"ContainerStarted","Data":"b6a743aa67db0deb5c95f35dd42ed79a49da514acd2adb38dd10acefd248f452"}
Apr 19 12:39:35.147272 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:35.147220 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m" podStartSLOduration=0.973765441 podStartE2EDuration="3.147203846s" podCreationTimestamp="2026-04-19 12:39:32 +0000 UTC" firstStartedPulling="2026-04-19 12:39:32.568540118 +0000 UTC m=+527.806190357" lastFinishedPulling="2026-04-19 12:39:34.741978522 +0000 UTC m=+529.979628762" observedRunningTime="2026-04-19 12:39:35.144468201 +0000 UTC m=+530.382118462" watchObservedRunningTime="2026-04-19 12:39:35.147203846 +0000 UTC m=+530.384854108"
Apr 19 12:39:35.157409 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:35.157357 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-bf79f" podStartSLOduration=1.5829589849999999 podStartE2EDuration="3.157340427s" podCreationTimestamp="2026-04-19 12:39:32 +0000 UTC" firstStartedPulling="2026-04-19 12:39:33.17015825 +0000 UTC m=+528.407808489" lastFinishedPulling="2026-04-19 12:39:34.744539692 +0000 UTC m=+529.982189931" observedRunningTime="2026-04-19 12:39:35.156442738 +0000 UTC m=+530.394092999" watchObservedRunningTime="2026-04-19 12:39:35.157340427 +0000 UTC m=+530.394990689"
Apr 19 12:39:35.379374 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:35.379346 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m"
Apr 19 12:39:35.413188 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:35.413126 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6sc\" (UniqueName: \"kubernetes.io/projected/5dbbb334-0c1a-4835-ad7c-6e1c2b53339e-kube-api-access-qt6sc\") pod \"5dbbb334-0c1a-4835-ad7c-6e1c2b53339e\" (UID: \"5dbbb334-0c1a-4835-ad7c-6e1c2b53339e\") "
Apr 19 12:39:35.415377 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:35.415352 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbbb334-0c1a-4835-ad7c-6e1c2b53339e-kube-api-access-qt6sc" (OuterVolumeSpecName: "kube-api-access-qt6sc") pod "5dbbb334-0c1a-4835-ad7c-6e1c2b53339e" (UID: "5dbbb334-0c1a-4835-ad7c-6e1c2b53339e"). InnerVolumeSpecName "kube-api-access-qt6sc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:39:35.514520 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:35.514489 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qt6sc\" (UniqueName: \"kubernetes.io/projected/5dbbb334-0c1a-4835-ad7c-6e1c2b53339e-kube-api-access-qt6sc\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:39:36.138058 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:36.137970 2583 generic.go:358] "Generic (PLEG): container finished" podID="5dbbb334-0c1a-4835-ad7c-6e1c2b53339e" containerID="4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191" exitCode=0
Apr 19 12:39:36.138058 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:36.138027 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m"
Apr 19 12:39:36.138541 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:36.138057 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m" event={"ID":"5dbbb334-0c1a-4835-ad7c-6e1c2b53339e","Type":"ContainerDied","Data":"4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191"}
Apr 19 12:39:36.138541 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:36.138095 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8nn5m" event={"ID":"5dbbb334-0c1a-4835-ad7c-6e1c2b53339e","Type":"ContainerDied","Data":"19ff5a85d6fe78db4494b5bb1ab549250e4e408d084db80cc96966a13f883043"}
Apr 19 12:39:36.138541 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:36.138112 2583 scope.go:117] "RemoveContainer" containerID="4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191"
Apr 19 12:39:36.146704 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:36.146688 2583 scope.go:117] "RemoveContainer" containerID="4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191"
Apr 19 12:39:36.146992 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:39:36.146973 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191\": container with ID starting with 4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191 not found: ID does not exist" containerID="4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191"
Apr 19 12:39:36.147061 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:36.147000 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191"} err="failed to get container status \"4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191\": rpc error: code = NotFound desc = could not find container \"4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191\": container with ID starting with 4b558f67498e6d5ee2200e52e3d825b334f3a8a88b35e4a392f3fb8311975191 not found: ID does not exist"
Apr 19 12:39:36.156873 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:36.156831 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8nn5m"]
Apr 19 12:39:36.158126 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:36.158108 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8nn5m"]
Apr 19 12:39:37.328018 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:37.327984 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dbbb334-0c1a-4835-ad7c-6e1c2b53339e" path="/var/lib/kubelet/pods/5dbbb334-0c1a-4835-ad7c-6e1c2b53339e/volumes"
Apr 19 12:39:43.020136 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:43.020098 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-bf79f"
Apr 19 12:39:43.020136 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:43.020146 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-bf79f"
Apr 19 12:39:43.041799 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:43.041770 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-bf79f"
Apr 19 12:39:43.181442 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:39:43.181415 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-bf79f"
Apr 19 12:40:03.918700 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:03.918664 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb"]
Apr 19 12:40:03.919104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:03.918985 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dbbb334-0c1a-4835-ad7c-6e1c2b53339e" containerName="registry-server"
Apr 19 12:40:03.919104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:03.918996 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbbb334-0c1a-4835-ad7c-6e1c2b53339e" containerName="registry-server"
Apr 19 12:40:03.919104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:03.919059 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dbbb334-0c1a-4835-ad7c-6e1c2b53339e" containerName="registry-server"
Apr 19 12:40:03.921799 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:03.921782 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb"
Apr 19 12:40:03.923882 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:03.923839 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 19 12:40:03.923882 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:03.923878 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-r8lfs\""
Apr 19 12:40:03.933163 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:03.933140 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb"]
Apr 19 12:40:04.049386 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:04.049356 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k5x8\" (UniqueName: \"kubernetes.io/projected/937c503e-246e-490f-9c8e-970ece3e2265-kube-api-access-2k5x8\") pod \"dns-operator-controller-manager-648d5c98bc-zgvtb\" (UID: \"937c503e-246e-490f-9c8e-970ece3e2265\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb"
Apr 19 12:40:04.150338 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:04.150312 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k5x8\" (UniqueName: \"kubernetes.io/projected/937c503e-246e-490f-9c8e-970ece3e2265-kube-api-access-2k5x8\") pod \"dns-operator-controller-manager-648d5c98bc-zgvtb\" (UID: \"937c503e-246e-490f-9c8e-970ece3e2265\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb"
Apr 19 12:40:04.159924 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:04.159896 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k5x8\" (UniqueName: \"kubernetes.io/projected/937c503e-246e-490f-9c8e-970ece3e2265-kube-api-access-2k5x8\") pod \"dns-operator-controller-manager-648d5c98bc-zgvtb\" (UID: \"937c503e-246e-490f-9c8e-970ece3e2265\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb"
Apr 19 12:40:04.231898 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:04.231842 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb"
Apr 19 12:40:04.358073 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:04.358042 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb"]
Apr 19 12:40:04.361328 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:40:04.361292 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod937c503e_246e_490f_9c8e_970ece3e2265.slice/crio-1d1ebaa786af1660673b2471cb79331372c4ccd185512cb29a57caaf79f339ba WatchSource:0}: Error finding container 1d1ebaa786af1660673b2471cb79331372c4ccd185512cb29a57caaf79f339ba: Status 404 returned error can't find the container with id 1d1ebaa786af1660673b2471cb79331372c4ccd185512cb29a57caaf79f339ba
Apr 19 12:40:05.229217 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:05.229183 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb" event={"ID":"937c503e-246e-490f-9c8e-970ece3e2265","Type":"ContainerStarted","Data":"1d1ebaa786af1660673b2471cb79331372c4ccd185512cb29a57caaf79f339ba"}
Apr 19 12:40:07.237005 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:07.236970 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb" event={"ID":"937c503e-246e-490f-9c8e-970ece3e2265","Type":"ContainerStarted","Data":"ac907dd08d04456c976233cec6d33e62dde78c2580142991126cf73f9d339157"}
Apr 19 12:40:07.237371 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:07.237083 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb"
Apr 19 12:40:07.252951 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:07.252906 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb" podStartSLOduration=2.119521799 podStartE2EDuration="4.25289145s" podCreationTimestamp="2026-04-19 12:40:03 +0000 UTC" firstStartedPulling="2026-04-19 12:40:04.363274677 +0000 UTC m=+559.600924917" lastFinishedPulling="2026-04-19 12:40:06.496644329 +0000 UTC m=+561.734294568" observedRunningTime="2026-04-19 12:40:07.250644473 +0000 UTC m=+562.488294734" watchObservedRunningTime="2026-04-19 12:40:07.25289145 +0000 UTC m=+562.490541710"
Apr 19 12:40:07.750036 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:07.750001 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-6w9nl"]
Apr 19 12:40:07.753308 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:07.753285 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-6w9nl"
Apr 19 12:40:07.755258 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:07.755242 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-5nrph\""
Apr 19 12:40:07.766525 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:07.766499 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-6w9nl"]
Apr 19 12:40:07.884242 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:07.884211 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg4ks\" (UniqueName: \"kubernetes.io/projected/ef035de9-8a2d-49a1-be4a-4f9b3b926959-kube-api-access-wg4ks\") pod \"authorino-operator-657f44b778-6w9nl\" (UID: \"ef035de9-8a2d-49a1-be4a-4f9b3b926959\") " pod="kuadrant-system/authorino-operator-657f44b778-6w9nl"
Apr 19 12:40:07.985533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:07.985499 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wg4ks\" (UniqueName: \"kubernetes.io/projected/ef035de9-8a2d-49a1-be4a-4f9b3b926959-kube-api-access-wg4ks\") pod \"authorino-operator-657f44b778-6w9nl\" (UID: \"ef035de9-8a2d-49a1-be4a-4f9b3b926959\") " pod="kuadrant-system/authorino-operator-657f44b778-6w9nl"
Apr 19 12:40:08.000661 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:08.000596 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg4ks\" (UniqueName: \"kubernetes.io/projected/ef035de9-8a2d-49a1-be4a-4f9b3b926959-kube-api-access-wg4ks\") pod \"authorino-operator-657f44b778-6w9nl\" (UID: \"ef035de9-8a2d-49a1-be4a-4f9b3b926959\") " pod="kuadrant-system/authorino-operator-657f44b778-6w9nl"
Apr 19 12:40:08.063309 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:08.063271 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-6w9nl"
Apr 19 12:40:08.190789 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:08.190755 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-6w9nl"]
Apr 19 12:40:08.194952 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:40:08.194911 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef035de9_8a2d_49a1_be4a_4f9b3b926959.slice/crio-3af6b932e6912db6c61011f382eb1a14528a92cfea1e9e93b6b775a20dbba98f WatchSource:0}: Error finding container 3af6b932e6912db6c61011f382eb1a14528a92cfea1e9e93b6b775a20dbba98f: Status 404 returned error can't find the container with id 3af6b932e6912db6c61011f382eb1a14528a92cfea1e9e93b6b775a20dbba98f
Apr 19 12:40:08.240717 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:08.240687 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-6w9nl" event={"ID":"ef035de9-8a2d-49a1-be4a-4f9b3b926959","Type":"ContainerStarted","Data":"3af6b932e6912db6c61011f382eb1a14528a92cfea1e9e93b6b775a20dbba98f"}
Apr 19 12:40:11.088957 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.088919 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6677f5ccc8-jqsg4"]
Apr 19 12:40:11.092652 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.092634 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.095477 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.095245 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 19 12:40:11.095477 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.095296 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8nsl6\""
Apr 19 12:40:11.095477 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.095306 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 19 12:40:11.095477 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.095351 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 19 12:40:11.095477 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.095381 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 19 12:40:11.095477 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.095304 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 19 12:40:11.095477 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.095402 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 19 12:40:11.095986 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.095581 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 19 12:40:11.100584 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.100559 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 19 12:40:11.103477 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.103454 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6677f5ccc8-jqsg4"]
Apr 19 12:40:11.111623 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.111597 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-console-oauth-config\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.111742 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.111635 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f97bh\" (UniqueName: \"kubernetes.io/projected/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-kube-api-access-f97bh\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.111742 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.111675 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-trusted-ca-bundle\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.111742 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.111701 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-oauth-serving-cert\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.111948 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.111747 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-console-config\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.111948 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.111827 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-console-serving-cert\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.111948 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.111868 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-service-ca\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.212657 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.212622 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-console-serving-cert\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.212657 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.212656 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-service-ca\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.212894 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.212794 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-console-oauth-config\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.212894 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.212841 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f97bh\" (UniqueName: \"kubernetes.io/projected/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-kube-api-access-f97bh\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.212969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.212906 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-trusted-ca-bundle\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.212969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.212929 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-oauth-serving-cert\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.213070 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.213004 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-console-config\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.213403 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.213377 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-service-ca\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.213628 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.213612 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-console-config\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.213938 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.213919 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-oauth-serving-cert\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4"
Apr 19 12:40:11.213995 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.213975 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-trusted-ca-bundle\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") "
pod="openshift-console/console-6677f5ccc8-jqsg4" Apr 19 12:40:11.215443 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.215413 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-console-serving-cert\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4" Apr 19 12:40:11.215565 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.215545 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-console-oauth-config\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4" Apr 19 12:40:11.225069 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.225045 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f97bh\" (UniqueName: \"kubernetes.io/projected/be80a5d9-5649-4446-bf8c-c80a0a9d3e9a-kube-api-access-f97bh\") pod \"console-6677f5ccc8-jqsg4\" (UID: \"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a\") " pod="openshift-console/console-6677f5ccc8-jqsg4" Apr 19 12:40:11.252321 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.252290 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-6w9nl" event={"ID":"ef035de9-8a2d-49a1-be4a-4f9b3b926959","Type":"ContainerStarted","Data":"2a5afa93b399be3d35d5716dc61e7c84940bebafce0e9a3ab0480274986ad3cc"} Apr 19 12:40:11.252464 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.252417 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-6w9nl" Apr 19 12:40:11.277760 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.277691 2583 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kuadrant-system/authorino-operator-657f44b778-6w9nl" podStartSLOduration=2.158065566 podStartE2EDuration="4.277677223s" podCreationTimestamp="2026-04-19 12:40:07 +0000 UTC" firstStartedPulling="2026-04-19 12:40:08.197178747 +0000 UTC m=+563.434828986" lastFinishedPulling="2026-04-19 12:40:10.31679039 +0000 UTC m=+565.554440643" observedRunningTime="2026-04-19 12:40:11.275936896 +0000 UTC m=+566.513587157" watchObservedRunningTime="2026-04-19 12:40:11.277677223 +0000 UTC m=+566.515327483" Apr 19 12:40:11.404437 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.404353 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6677f5ccc8-jqsg4" Apr 19 12:40:11.536342 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:11.536317 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6677f5ccc8-jqsg4"] Apr 19 12:40:11.538842 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:40:11.538817 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe80a5d9_5649_4446_bf8c_c80a0a9d3e9a.slice/crio-0366d61cfe9ca93c57063a1772488bddfddf9b2ad9d5b7a2f60a7d563da373da WatchSource:0}: Error finding container 0366d61cfe9ca93c57063a1772488bddfddf9b2ad9d5b7a2f60a7d563da373da: Status 404 returned error can't find the container with id 0366d61cfe9ca93c57063a1772488bddfddf9b2ad9d5b7a2f60a7d563da373da Apr 19 12:40:12.257168 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:12.257127 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6677f5ccc8-jqsg4" event={"ID":"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a","Type":"ContainerStarted","Data":"68a024808ed5184e91c1cadde8fe11f9f23e70b7ddf51dfe8304fc810aa655f1"} Apr 19 12:40:12.257545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:12.257175 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6677f5ccc8-jqsg4" 
event={"ID":"be80a5d9-5649-4446-bf8c-c80a0a9d3e9a","Type":"ContainerStarted","Data":"0366d61cfe9ca93c57063a1772488bddfddf9b2ad9d5b7a2f60a7d563da373da"} Apr 19 12:40:12.273104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:12.273049 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6677f5ccc8-jqsg4" podStartSLOduration=1.273029748 podStartE2EDuration="1.273029748s" podCreationTimestamp="2026-04-19 12:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:40:12.272262942 +0000 UTC m=+567.509913203" watchObservedRunningTime="2026-04-19 12:40:12.273029748 +0000 UTC m=+567.510680011" Apr 19 12:40:15.787733 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.787703 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs"] Apr 19 12:40:15.791440 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.791424 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:15.793392 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.793371 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-wnc7s\"" Apr 19 12:40:15.803255 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.803231 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs"] Apr 19 12:40:15.849611 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.849589 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/423217d4-c256-4bf8-9850-b6bd3bc82f4f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" (UID: \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:15.849713 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.849649 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7rlc\" (UniqueName: \"kubernetes.io/projected/423217d4-c256-4bf8-9850-b6bd3bc82f4f-kube-api-access-c7rlc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" (UID: \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:15.950382 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.950349 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7rlc\" (UniqueName: \"kubernetes.io/projected/423217d4-c256-4bf8-9850-b6bd3bc82f4f-kube-api-access-c7rlc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" (UID: \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:15.950486 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.950420 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/423217d4-c256-4bf8-9850-b6bd3bc82f4f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" (UID: \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:15.950770 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.950750 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/423217d4-c256-4bf8-9850-b6bd3bc82f4f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" (UID: \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:15.957654 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:15.957632 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7rlc\" (UniqueName: \"kubernetes.io/projected/423217d4-c256-4bf8-9850-b6bd3bc82f4f-kube-api-access-c7rlc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" (UID: \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:16.101594 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:16.101543 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:16.227881 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:16.227805 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs"] Apr 19 12:40:16.232148 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:40:16.232102 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod423217d4_c256_4bf8_9850_b6bd3bc82f4f.slice/crio-d837eceb7d96c0c7e81e0d6bd9d20aaf54e21c5c5b777226add020ef6bf606e8 WatchSource:0}: Error finding container d837eceb7d96c0c7e81e0d6bd9d20aaf54e21c5c5b777226add020ef6bf606e8: Status 404 returned error can't find the container with id d837eceb7d96c0c7e81e0d6bd9d20aaf54e21c5c5b777226add020ef6bf606e8 Apr 19 12:40:16.271383 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:16.271354 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" event={"ID":"423217d4-c256-4bf8-9850-b6bd3bc82f4f","Type":"ContainerStarted","Data":"d837eceb7d96c0c7e81e0d6bd9d20aaf54e21c5c5b777226add020ef6bf606e8"} Apr 19 12:40:18.243353 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:18.243322 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zgvtb" Apr 19 12:40:21.295161 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:21.295124 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" event={"ID":"423217d4-c256-4bf8-9850-b6bd3bc82f4f","Type":"ContainerStarted","Data":"b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405"} Apr 19 12:40:21.295523 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:21.295300 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:21.311290 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:21.311255 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" podStartSLOduration=1.781881041 podStartE2EDuration="6.31124139s" podCreationTimestamp="2026-04-19 12:40:15 +0000 UTC" firstStartedPulling="2026-04-19 12:40:16.234642667 +0000 UTC m=+571.472292906" lastFinishedPulling="2026-04-19 12:40:20.76400301 +0000 UTC m=+576.001653255" observedRunningTime="2026-04-19 12:40:21.310432624 +0000 UTC m=+576.548082887" watchObservedRunningTime="2026-04-19 12:40:21.31124139 +0000 UTC m=+576.548891647" Apr 19 12:40:21.404876 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:21.404794 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6677f5ccc8-jqsg4" Apr 19 12:40:21.404876 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:21.404823 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6677f5ccc8-jqsg4" Apr 19 12:40:21.409498 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:21.409478 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6677f5ccc8-jqsg4" Apr 19 12:40:22.259186 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:22.259155 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-6w9nl" Apr 19 12:40:22.302874 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:22.302820 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6677f5ccc8-jqsg4" Apr 19 12:40:32.301279 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:32.301241 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:33.972040 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:33.972004 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs"] Apr 19 12:40:33.972476 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:33.972196 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" containerName="manager" containerID="cri-o://b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405" gracePeriod=2 Apr 19 12:40:33.979924 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:33.979893 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs"] Apr 19 12:40:33.999105 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:33.999078 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"] Apr 19 12:40:33.999639 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:33.999617 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" containerName="manager" Apr 19 12:40:33.999721 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:33.999643 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" containerName="manager" Apr 19 12:40:33.999758 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:33.999727 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" containerName="manager" Apr 19 12:40:34.002837 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.002818 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" Apr 19 12:40:34.017505 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.017482 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"] Apr 19 12:40:34.035715 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.035687 2583 status_manager.go:895] "Failed to get status for pod" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" is forbidden: User \"system:node:ip-10-0-140-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-194.ec2.internal' and this object" Apr 19 12:40:34.101398 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.101358 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zsq8t\" (UID: \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" Apr 19 12:40:34.101513 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.101493 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzpfj\" (UniqueName: \"kubernetes.io/projected/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-kube-api-access-nzpfj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zsq8t\" (UID: \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" Apr 19 12:40:34.201869 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.201834 2583 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:34.201973 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.201883 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzpfj\" (UniqueName: \"kubernetes.io/projected/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-kube-api-access-nzpfj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zsq8t\" (UID: \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" Apr 19 12:40:34.202016 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.201978 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zsq8t\" (UID: \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" Apr 19 12:40:34.202286 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.202270 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zsq8t\" (UID: \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" Apr 19 12:40:34.203939 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.203912 2583 status_manager.go:895] "Failed to get status for pod" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" is forbidden: User \"system:node:ip-10-0-140-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" 
in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-194.ec2.internal' and this object" Apr 19 12:40:34.215819 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.215781 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzpfj\" (UniqueName: \"kubernetes.io/projected/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-kube-api-access-nzpfj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zsq8t\" (UID: \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" Apr 19 12:40:34.302937 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.302876 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7rlc\" (UniqueName: \"kubernetes.io/projected/423217d4-c256-4bf8-9850-b6bd3bc82f4f-kube-api-access-c7rlc\") pod \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\" (UID: \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\") " Apr 19 12:40:34.302937 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.302931 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/423217d4-c256-4bf8-9850-b6bd3bc82f4f-extensions-socket-volume\") pod \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\" (UID: \"423217d4-c256-4bf8-9850-b6bd3bc82f4f\") " Apr 19 12:40:34.303411 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.303386 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/423217d4-c256-4bf8-9850-b6bd3bc82f4f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "423217d4-c256-4bf8-9850-b6bd3bc82f4f" (UID: "423217d4-c256-4bf8-9850-b6bd3bc82f4f"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:40:34.304995 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.304963 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423217d4-c256-4bf8-9850-b6bd3bc82f4f-kube-api-access-c7rlc" (OuterVolumeSpecName: "kube-api-access-c7rlc") pod "423217d4-c256-4bf8-9850-b6bd3bc82f4f" (UID: "423217d4-c256-4bf8-9850-b6bd3bc82f4f"). InnerVolumeSpecName "kube-api-access-c7rlc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:40:34.338884 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.338829 2583 generic.go:358] "Generic (PLEG): container finished" podID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" containerID="b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405" exitCode=0 Apr 19 12:40:34.338975 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.338893 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" Apr 19 12:40:34.338975 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.338927 2583 scope.go:117] "RemoveContainer" containerID="b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405" Apr 19 12:40:34.340827 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.340801 2583 status_manager.go:895] "Failed to get status for pod" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" is forbidden: User \"system:node:ip-10-0-140-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-194.ec2.internal' and this object" Apr 19 12:40:34.347098 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.347083 2583 scope.go:117] "RemoveContainer" 
containerID="b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405" Apr 19 12:40:34.347317 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:40:34.347299 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405\": container with ID starting with b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405 not found: ID does not exist" containerID="b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405" Apr 19 12:40:34.347369 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.347327 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405"} err="failed to get container status \"b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405\": rpc error: code = NotFound desc = could not find container \"b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405\": container with ID starting with b58f12dfacb9d2698f63c25adbb03cdac0474986779d0032243e001a6cca4405 not found: ID does not exist" Apr 19 12:40:34.349865 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.349822 2583 status_manager.go:895] "Failed to get status for pod" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" is forbidden: User \"system:node:ip-10-0-140-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-194.ec2.internal' and this object" Apr 19 12:40:34.353291 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.353274 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"
Apr 19 12:40:34.404683 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.404578 2583 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/423217d4-c256-4bf8-9850-b6bd3bc82f4f-extensions-socket-volume\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:40:34.404683 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.404607 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7rlc\" (UniqueName: \"kubernetes.io/projected/423217d4-c256-4bf8-9850-b6bd3bc82f4f-kube-api-access-c7rlc\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:40:34.482260 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:34.482234 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"]
Apr 19 12:40:34.485536 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:40:34.485505 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8efb7caf_34a7_4cb4_a8ec_044de2b177fa.slice/crio-515bc88534d40c6f99b0b85660c9d74a3ca640a09b68b5fcc9a47d75b8ed3989 WatchSource:0}: Error finding container 515bc88534d40c6f99b0b85660c9d74a3ca640a09b68b5fcc9a47d75b8ed3989: Status 404 returned error can't find the container with id 515bc88534d40c6f99b0b85660c9d74a3ca640a09b68b5fcc9a47d75b8ed3989
Apr 19 12:40:35.327637 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:35.327598 2583 status_manager.go:895] "Failed to get status for pod" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cbtxs" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-cbtxs\" is forbidden: User \"system:node:ip-10-0-140-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-194.ec2.internal' and this object"
Apr 19 12:40:35.328522 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:35.328501 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="423217d4-c256-4bf8-9850-b6bd3bc82f4f" path="/var/lib/kubelet/pods/423217d4-c256-4bf8-9850-b6bd3bc82f4f/volumes"
Apr 19 12:40:35.344207 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:35.344178 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" event={"ID":"8efb7caf-34a7-4cb4-a8ec-044de2b177fa","Type":"ContainerStarted","Data":"3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd"}
Apr 19 12:40:35.344207 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:35.344209 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" event={"ID":"8efb7caf-34a7-4cb4-a8ec-044de2b177fa","Type":"ContainerStarted","Data":"515bc88534d40c6f99b0b85660c9d74a3ca640a09b68b5fcc9a47d75b8ed3989"}
Apr 19 12:40:35.344372 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:35.344246 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"
Apr 19 12:40:35.363405 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:35.363359 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" podStartSLOduration=2.363346379 podStartE2EDuration="2.363346379s" podCreationTimestamp="2026-04-19 12:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:40:35.362192421 +0000 UTC m=+590.599842681" watchObservedRunningTime="2026-04-19 12:40:35.363346379 +0000 UTC m=+590.600996703"
Apr 19 12:40:45.259818 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:45.259778 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:40:45.260209 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:45.260128 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:40:46.350145 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:46.350118 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"
Apr 19 12:40:48.851656 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:48.851623 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"]
Apr 19 12:40:48.854096 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:48.851892 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" podUID="8efb7caf-34a7-4cb4-a8ec-044de2b177fa" containerName="manager" containerID="cri-o://3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd" gracePeriod=10
Apr 19 12:40:49.068589 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.068560 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"]
Apr 19 12:40:49.072014 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.071997 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:40:49.084696 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.084671 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"]
Apr 19 12:40:49.100075 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.100057 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"
Apr 19 12:40:49.216246 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.216210 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-extensions-socket-volume\") pod \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\" (UID: \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\") "
Apr 19 12:40:49.216370 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.216297 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzpfj\" (UniqueName: \"kubernetes.io/projected/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-kube-api-access-nzpfj\") pod \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\" (UID: \"8efb7caf-34a7-4cb4-a8ec-044de2b177fa\") "
Apr 19 12:40:49.216440 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.216382 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9pq\" (UniqueName: \"kubernetes.io/projected/3f54722c-3d20-42f6-a950-c57518b156c6-kube-api-access-nm9pq\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f58p2\" (UID: \"3f54722c-3d20-42f6-a950-c57518b156c6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:40:49.216440 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.216419 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f54722c-3d20-42f6-a950-c57518b156c6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f58p2\" (UID: \"3f54722c-3d20-42f6-a950-c57518b156c6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:40:49.216642 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.216615 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "8efb7caf-34a7-4cb4-a8ec-044de2b177fa" (UID: "8efb7caf-34a7-4cb4-a8ec-044de2b177fa"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:40:49.218438 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.218415 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-kube-api-access-nzpfj" (OuterVolumeSpecName: "kube-api-access-nzpfj") pod "8efb7caf-34a7-4cb4-a8ec-044de2b177fa" (UID: "8efb7caf-34a7-4cb4-a8ec-044de2b177fa"). InnerVolumeSpecName "kube-api-access-nzpfj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:40:49.317101 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.317077 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9pq\" (UniqueName: \"kubernetes.io/projected/3f54722c-3d20-42f6-a950-c57518b156c6-kube-api-access-nm9pq\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f58p2\" (UID: \"3f54722c-3d20-42f6-a950-c57518b156c6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:40:49.317223 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.317123 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f54722c-3d20-42f6-a950-c57518b156c6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f58p2\" (UID: \"3f54722c-3d20-42f6-a950-c57518b156c6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:40:49.317223 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.317182 2583 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-extensions-socket-volume\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:40:49.317223 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.317195 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nzpfj\" (UniqueName: \"kubernetes.io/projected/8efb7caf-34a7-4cb4-a8ec-044de2b177fa-kube-api-access-nzpfj\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:40:49.317475 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.317457 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f54722c-3d20-42f6-a950-c57518b156c6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f58p2\" (UID: \"3f54722c-3d20-42f6-a950-c57518b156c6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:40:49.328228 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.328204 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm9pq\" (UniqueName: \"kubernetes.io/projected/3f54722c-3d20-42f6-a950-c57518b156c6-kube-api-access-nm9pq\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f58p2\" (UID: \"3f54722c-3d20-42f6-a950-c57518b156c6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:40:49.395910 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.395882 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:40:49.396682 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.396662 2583 generic.go:358] "Generic (PLEG): container finished" podID="8efb7caf-34a7-4cb4-a8ec-044de2b177fa" containerID="3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd" exitCode=0
Apr 19 12:40:49.396751 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.396709 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" event={"ID":"8efb7caf-34a7-4cb4-a8ec-044de2b177fa","Type":"ContainerDied","Data":"3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd"}
Apr 19 12:40:49.396751 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.396734 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"
Apr 19 12:40:49.396890 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.396750 2583 scope.go:117] "RemoveContainer" containerID="3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd"
Apr 19 12:40:49.396890 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.396740 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t" event={"ID":"8efb7caf-34a7-4cb4-a8ec-044de2b177fa","Type":"ContainerDied","Data":"515bc88534d40c6f99b0b85660c9d74a3ca640a09b68b5fcc9a47d75b8ed3989"}
Apr 19 12:40:49.405658 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.405641 2583 scope.go:117] "RemoveContainer" containerID="3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd"
Apr 19 12:40:49.405948 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:40:49.405926 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd\": container with ID starting with 3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd not found: ID does not exist" containerID="3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd"
Apr 19 12:40:49.405999 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.405960 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd"} err="failed to get container status \"3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd\": rpc error: code = NotFound desc = could not find container \"3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd\": container with ID starting with 3bfb436b43c589e7c18585655ebe331bbfef7d678db860d9db85cd2914ad0bfd not found: ID does not exist"
Apr 19 12:40:49.413035 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.413007 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"]
Apr 19 12:40:49.417029 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.417008 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zsq8t"]
Apr 19 12:40:49.528178 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:49.528152 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"]
Apr 19 12:40:49.530825 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:40:49.530794 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f54722c_3d20_42f6_a950_c57518b156c6.slice/crio-4e5895ce24dbf98ddcb242388f07288e045d9c5c4eb423d71671a185ef026619 WatchSource:0}: Error finding container 4e5895ce24dbf98ddcb242388f07288e045d9c5c4eb423d71671a185ef026619: Status 404 returned error can't find the container with id 4e5895ce24dbf98ddcb242388f07288e045d9c5c4eb423d71671a185ef026619
Apr 19 12:40:50.404107 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:50.404075 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2" event={"ID":"3f54722c-3d20-42f6-a950-c57518b156c6","Type":"ContainerStarted","Data":"2780c832617cbf92d35b2975bd88d31901f7a5f0fa248a5438e3c0412accafc9"}
Apr 19 12:40:50.404535 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:50.404111 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2" event={"ID":"3f54722c-3d20-42f6-a950-c57518b156c6","Type":"ContainerStarted","Data":"4e5895ce24dbf98ddcb242388f07288e045d9c5c4eb423d71671a185ef026619"}
Apr 19 12:40:50.404535 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:50.404179 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:40:50.423368 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:50.423317 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2" podStartSLOduration=1.423300041 podStartE2EDuration="1.423300041s" podCreationTimestamp="2026-04-19 12:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:40:50.421409939 +0000 UTC m=+605.659060199" watchObservedRunningTime="2026-04-19 12:40:50.423300041 +0000 UTC m=+605.660950301"
Apr 19 12:40:51.328064 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:40:51.328026 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8efb7caf-34a7-4cb4-a8ec-044de2b177fa" path="/var/lib/kubelet/pods/8efb7caf-34a7-4cb4-a8ec-044de2b177fa/volumes"
Apr 19 12:41:01.409946 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:01.409866 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:41:05.011864 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.011810 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"]
Apr 19 12:41:05.012247 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.012195 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8efb7caf-34a7-4cb4-a8ec-044de2b177fa" containerName="manager"
Apr 19 12:41:05.012247 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.012208 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efb7caf-34a7-4cb4-a8ec-044de2b177fa" containerName="manager"
Apr 19 12:41:05.012331 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.012269 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="8efb7caf-34a7-4cb4-a8ec-044de2b177fa" containerName="manager"
Apr 19 12:41:05.015306 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.015279 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.017528 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.017499 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-dbzk6\""
Apr 19 12:41:05.027045 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.027020 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"]
Apr 19 12:41:05.137196 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.137167 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.137326 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.137204 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.137326 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.137255 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.137326 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.137289 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.137326 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.137313 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2drrp\" (UniqueName: \"kubernetes.io/projected/7e385182-92c1-4522-b9e6-1aba2dea9c27-kube-api-access-2drrp\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.137487 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.137344 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.137487 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.137386 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.137487 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.137403 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.137487 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.137425 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7e385182-92c1-4522-b9e6-1aba2dea9c27-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238407 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238383 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2drrp\" (UniqueName: \"kubernetes.io/projected/7e385182-92c1-4522-b9e6-1aba2dea9c27-kube-api-access-2drrp\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238577 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238418 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238577 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238450 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238577 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238465 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238577 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238481 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7e385182-92c1-4522-b9e6-1aba2dea9c27-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238577 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238515 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238577 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238538 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238958 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238594 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238958 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238616 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238958 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238875 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.238958 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.238922 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.239158 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.239140 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.239240 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.239214 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.239283 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.239268 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7e385182-92c1-4522-b9e6-1aba2dea9c27-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.240921 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.240894 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.241220 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.241200 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.246060 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.246032 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7e385182-92c1-4522-b9e6-1aba2dea9c27-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.246153 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.246109 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2drrp\" (UniqueName: \"kubernetes.io/projected/7e385182-92c1-4522-b9e6-1aba2dea9c27-kube-api-access-2drrp\") pod \"maas-default-gateway-openshift-default-58b6f876-r522j\" (UID: \"7e385182-92c1-4522-b9e6-1aba2dea9c27\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.329003 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.328948 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:05.482197 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.482168 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"]
Apr 19 12:41:05.485272 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:41:05.485244 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e385182_92c1_4522_b9e6_1aba2dea9c27.slice/crio-3929d0f71832f38c0323f190fcbedf5b1952611479122612cea9c904e29d5e12 WatchSource:0}: Error finding container 3929d0f71832f38c0323f190fcbedf5b1952611479122612cea9c904e29d5e12: Status 404 returned error can't find the container with id 3929d0f71832f38c0323f190fcbedf5b1952611479122612cea9c904e29d5e12
Apr 19 12:41:05.487128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.487099 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 19 12:41:05.487199 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.487160 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 19 12:41:05.487199 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:05.487185 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"}
Apr 19 12:41:06.459408 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:06.459369 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j" event={"ID":"7e385182-92c1-4522-b9e6-1aba2dea9c27","Type":"ContainerStarted","Data":"e3f1ebbb4e8ffa9fce54e7277221687b3c246e62930d275c579b54b43fb531f1"}
Apr 19 12:41:06.459408 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:06.459411 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j" event={"ID":"7e385182-92c1-4522-b9e6-1aba2dea9c27","Type":"ContainerStarted","Data":"3929d0f71832f38c0323f190fcbedf5b1952611479122612cea9c904e29d5e12"}
Apr 19 12:41:06.479023 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:06.478981 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j" podStartSLOduration=2.478968129 podStartE2EDuration="2.478968129s" podCreationTimestamp="2026-04-19 12:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:41:06.478552686 +0000 UTC m=+621.716202951" watchObservedRunningTime="2026-04-19 12:41:06.478968129 +0000 UTC m=+621.716618390"
Apr 19 12:41:07.329905 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:07.329872 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:07.334455 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:07.334433 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:07.462618 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:07.462592 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:07.463339 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:07.463324 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-r522j"
Apr 19 12:41:19.379302 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:19.379269 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-42nk4"]
Apr 19 12:41:19.382784 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:19.382767 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-42nk4"
Apr 19 12:41:19.384490 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:19.384466 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-svsgq\""
Apr 19 12:41:19.387882 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:19.387842 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-42nk4"]
Apr 19 12:41:19.451162 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:19.451139 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmhsg\" (UniqueName: \"kubernetes.io/projected/714f2b8d-6891-4e6e-a3e6-817fbd0a9a26-kube-api-access-hmhsg\") pod \"authorino-7498df8756-42nk4\" (UID: \"714f2b8d-6891-4e6e-a3e6-817fbd0a9a26\") " pod="kuadrant-system/authorino-7498df8756-42nk4"
Apr 19 12:41:19.552516 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:19.552490 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmhsg\" (UniqueName: \"kubernetes.io/projected/714f2b8d-6891-4e6e-a3e6-817fbd0a9a26-kube-api-access-hmhsg\") pod \"authorino-7498df8756-42nk4\" (UID: \"714f2b8d-6891-4e6e-a3e6-817fbd0a9a26\") " pod="kuadrant-system/authorino-7498df8756-42nk4"
Apr 19 12:41:19.562222 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:19.562197 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmhsg\" (UniqueName: \"kubernetes.io/projected/714f2b8d-6891-4e6e-a3e6-817fbd0a9a26-kube-api-access-hmhsg\") pod \"authorino-7498df8756-42nk4\" (UID: \"714f2b8d-6891-4e6e-a3e6-817fbd0a9a26\") " pod="kuadrant-system/authorino-7498df8756-42nk4"
Apr 19 12:41:19.693219 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:19.693197 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-42nk4"
Apr 19 12:41:19.812488 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:19.812380 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-42nk4"]
Apr 19 12:41:19.815393 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:41:19.815369 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod714f2b8d_6891_4e6e_a3e6_817fbd0a9a26.slice/crio-c11a107b9747c2f0017ea3acbbe4769014542415ebe71fd92c3e0775c5bcfa7b WatchSource:0}: Error finding container c11a107b9747c2f0017ea3acbbe4769014542415ebe71fd92c3e0775c5bcfa7b: Status 404 returned error can't find the container with id c11a107b9747c2f0017ea3acbbe4769014542415ebe71fd92c3e0775c5bcfa7b
Apr 19 12:41:20.517402 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:20.517334 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-42nk4" event={"ID":"714f2b8d-6891-4e6e-a3e6-817fbd0a9a26","Type":"ContainerStarted","Data":"c11a107b9747c2f0017ea3acbbe4769014542415ebe71fd92c3e0775c5bcfa7b"}
Apr 19 12:41:23.529376 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:23.529336 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-42nk4" event={"ID":"714f2b8d-6891-4e6e-a3e6-817fbd0a9a26","Type":"ContainerStarted","Data":"414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45"}
Apr 19 12:41:23.543016 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:23.542970 2583 pod_startup_latency_tracker.go:104] "Observed pod startup
duration" pod="kuadrant-system/authorino-7498df8756-42nk4" podStartSLOduration=1.443057818 podStartE2EDuration="4.542955941s" podCreationTimestamp="2026-04-19 12:41:19 +0000 UTC" firstStartedPulling="2026-04-19 12:41:19.817074389 +0000 UTC m=+635.054724628" lastFinishedPulling="2026-04-19 12:41:22.916972508 +0000 UTC m=+638.154622751" observedRunningTime="2026-04-19 12:41:23.541317361 +0000 UTC m=+638.778967615" watchObservedRunningTime="2026-04-19 12:41:23.542955941 +0000 UTC m=+638.780606203" Apr 19 12:41:49.998287 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:49.998239 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-42nk4"] Apr 19 12:41:49.998728 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:49.998521 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-42nk4" podUID="714f2b8d-6891-4e6e-a3e6-817fbd0a9a26" containerName="authorino" containerID="cri-o://414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45" gracePeriod=30 Apr 19 12:41:50.237788 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.237763 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-42nk4" Apr 19 12:41:50.297512 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.297443 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmhsg\" (UniqueName: \"kubernetes.io/projected/714f2b8d-6891-4e6e-a3e6-817fbd0a9a26-kube-api-access-hmhsg\") pod \"714f2b8d-6891-4e6e-a3e6-817fbd0a9a26\" (UID: \"714f2b8d-6891-4e6e-a3e6-817fbd0a9a26\") " Apr 19 12:41:50.299663 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.299637 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714f2b8d-6891-4e6e-a3e6-817fbd0a9a26-kube-api-access-hmhsg" (OuterVolumeSpecName: "kube-api-access-hmhsg") pod "714f2b8d-6891-4e6e-a3e6-817fbd0a9a26" (UID: "714f2b8d-6891-4e6e-a3e6-817fbd0a9a26"). InnerVolumeSpecName "kube-api-access-hmhsg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:41:50.399000 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.398967 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hmhsg\" (UniqueName: \"kubernetes.io/projected/714f2b8d-6891-4e6e-a3e6-817fbd0a9a26-kube-api-access-hmhsg\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:41:50.628716 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.628642 2583 generic.go:358] "Generic (PLEG): container finished" podID="714f2b8d-6891-4e6e-a3e6-817fbd0a9a26" containerID="414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45" exitCode=0 Apr 19 12:41:50.628716 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.628690 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-42nk4" Apr 19 12:41:50.628907 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.628724 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-42nk4" event={"ID":"714f2b8d-6891-4e6e-a3e6-817fbd0a9a26","Type":"ContainerDied","Data":"414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45"} Apr 19 12:41:50.628907 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.628758 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-42nk4" event={"ID":"714f2b8d-6891-4e6e-a3e6-817fbd0a9a26","Type":"ContainerDied","Data":"c11a107b9747c2f0017ea3acbbe4769014542415ebe71fd92c3e0775c5bcfa7b"} Apr 19 12:41:50.628907 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.628772 2583 scope.go:117] "RemoveContainer" containerID="414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45" Apr 19 12:41:50.637642 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.637626 2583 scope.go:117] "RemoveContainer" containerID="414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45" Apr 19 12:41:50.637904 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:41:50.637882 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45\": container with ID starting with 414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45 not found: ID does not exist" containerID="414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45" Apr 19 12:41:50.637959 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.637913 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45"} err="failed to get container status \"414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45\": rpc error: code = 
NotFound desc = could not find container \"414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45\": container with ID starting with 414e3919a103ecff038285d2b93bb882e026bceb6ad230f04ec6c328bfc60f45 not found: ID does not exist" Apr 19 12:41:50.653020 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.652994 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-42nk4"] Apr 19 12:41:50.656696 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:50.656676 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-42nk4"] Apr 19 12:41:51.328309 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:41:51.328276 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="714f2b8d-6891-4e6e-a3e6-817fbd0a9a26" path="/var/lib/kubelet/pods/714f2b8d-6891-4e6e-a3e6-817fbd0a9a26/volumes" Apr 19 12:42:06.414526 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.414490 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7f7fd5f68b-7b26h"] Apr 19 12:42:06.414899 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.414887 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="714f2b8d-6891-4e6e-a3e6-817fbd0a9a26" containerName="authorino" Apr 19 12:42:06.414956 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.414902 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="714f2b8d-6891-4e6e-a3e6-817fbd0a9a26" containerName="authorino" Apr 19 12:42:06.414990 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.414972 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="714f2b8d-6891-4e6e-a3e6-817fbd0a9a26" containerName="authorino" Apr 19 12:42:06.418632 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.418615 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" Apr 19 12:42:06.420337 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.420315 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-6fc62\"" Apr 19 12:42:06.423234 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.423209 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f7fd5f68b-7b26h"] Apr 19 12:42:06.542553 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.542525 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rmw\" (UniqueName: \"kubernetes.io/projected/613e4691-4af8-4ae1-905e-8a79a12f849c-kube-api-access-p6rmw\") pod \"maas-controller-7f7fd5f68b-7b26h\" (UID: \"613e4691-4af8-4ae1-905e-8a79a12f849c\") " pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" Apr 19 12:42:06.643692 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.643663 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rmw\" (UniqueName: \"kubernetes.io/projected/613e4691-4af8-4ae1-905e-8a79a12f849c-kube-api-access-p6rmw\") pod \"maas-controller-7f7fd5f68b-7b26h\" (UID: \"613e4691-4af8-4ae1-905e-8a79a12f849c\") " pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" Apr 19 12:42:06.650996 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.650974 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rmw\" (UniqueName: \"kubernetes.io/projected/613e4691-4af8-4ae1-905e-8a79a12f849c-kube-api-access-p6rmw\") pod \"maas-controller-7f7fd5f68b-7b26h\" (UID: \"613e4691-4af8-4ae1-905e-8a79a12f849c\") " pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" Apr 19 12:42:06.731082 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.731057 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" Apr 19 12:42:06.852306 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.852270 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f7fd5f68b-7b26h"] Apr 19 12:42:06.856044 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:42:06.856007 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod613e4691_4af8_4ae1_905e_8a79a12f849c.slice/crio-6e2c68a14f55743ff48e16e253df6a4d342d8aeb955cbf5c018a729b4a60e66c WatchSource:0}: Error finding container 6e2c68a14f55743ff48e16e253df6a4d342d8aeb955cbf5c018a729b4a60e66c: Status 404 returned error can't find the container with id 6e2c68a14f55743ff48e16e253df6a4d342d8aeb955cbf5c018a729b4a60e66c Apr 19 12:42:06.857136 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:06.857116 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:42:07.692378 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:07.692338 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" event={"ID":"613e4691-4af8-4ae1-905e-8a79a12f849c","Type":"ContainerStarted","Data":"6e2c68a14f55743ff48e16e253df6a4d342d8aeb955cbf5c018a729b4a60e66c"} Apr 19 12:42:09.700687 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:09.700654 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" event={"ID":"613e4691-4af8-4ae1-905e-8a79a12f849c","Type":"ContainerStarted","Data":"18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f"} Apr 19 12:42:09.701062 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:09.700701 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" Apr 19 12:42:09.716777 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:09.716735 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" podStartSLOduration=1.435224624 podStartE2EDuration="3.716721291s" podCreationTimestamp="2026-04-19 12:42:06 +0000 UTC" firstStartedPulling="2026-04-19 12:42:06.857274661 +0000 UTC m=+682.094924900" lastFinishedPulling="2026-04-19 12:42:09.138771324 +0000 UTC m=+684.376421567" observedRunningTime="2026-04-19 12:42:09.7144959 +0000 UTC m=+684.952146186" watchObservedRunningTime="2026-04-19 12:42:09.716721291 +0000 UTC m=+684.954371552" Apr 19 12:42:20.710211 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:20.710176 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" Apr 19 12:42:58.004986 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.004946 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm"] Apr 19 12:42:58.012047 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.012015 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.013799 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.013770 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-v7ph6\"" Apr 19 12:42:58.013799 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.013776 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 19 12:42:58.014347 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.014324 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 19 12:42:58.014557 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.014419 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 19 12:42:58.018128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.018104 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm"] Apr 19 12:42:58.177018 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.176990 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.177154 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.177029 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" 
(UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.177154 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.177055 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae3159b-5f65-49cd-8408-7527930ef42d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.177154 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.177134 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.177262 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.177177 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.177262 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.177200 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqdk5\" (UniqueName: \"kubernetes.io/projected/6ae3159b-5f65-49cd-8408-7527930ef42d-kube-api-access-dqdk5\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.277778 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.277715 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.277778 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.277753 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqdk5\" (UniqueName: \"kubernetes.io/projected/6ae3159b-5f65-49cd-8408-7527930ef42d-kube-api-access-dqdk5\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.278005 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.277802 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.278005 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.277839 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.278005 ip-10-0-140-194 
kubenswrapper[2583]: I0419 12:42:58.277904 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae3159b-5f65-49cd-8408-7527930ef42d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.278155 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.278071 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.278247 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.278224 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.278299 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.278263 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.278367 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.278350 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.280219 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.280197 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6ae3159b-5f65-49cd-8408-7527930ef42d-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.280496 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.280460 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ae3159b-5f65-49cd-8408-7527930ef42d-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.285325 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.285301 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqdk5\" (UniqueName: \"kubernetes.io/projected/6ae3159b-5f65-49cd-8408-7527930ef42d-kube-api-access-dqdk5\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm\" (UID: \"6ae3159b-5f65-49cd-8408-7527930ef42d\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.323026 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.322987 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" Apr 19 12:42:58.450564 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.450540 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm"] Apr 19 12:42:58.453221 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:42:58.453193 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ae3159b_5f65_49cd_8408_7527930ef42d.slice/crio-fb6323e25ce0b3d8a14611a1a73902bc9a6b3bceac081712d67771f633a872a0 WatchSource:0}: Error finding container fb6323e25ce0b3d8a14611a1a73902bc9a6b3bceac081712d67771f633a872a0: Status 404 returned error can't find the container with id fb6323e25ce0b3d8a14611a1a73902bc9a6b3bceac081712d67771f633a872a0 Apr 19 12:42:58.874152 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:58.874102 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" event={"ID":"6ae3159b-5f65-49cd-8408-7527930ef42d","Type":"ContainerStarted","Data":"fb6323e25ce0b3d8a14611a1a73902bc9a6b3bceac081712d67771f633a872a0"} Apr 19 12:42:59.211744 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.211706 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"] Apr 19 12:42:59.215858 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.215820 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6" Apr 19 12:42:59.218093 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.218047 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 19 12:42:59.221586 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.221562 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"] Apr 19 12:42:59.387970 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.387638 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6" Apr 19 12:42:59.387970 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.387708 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6" Apr 19 12:42:59.387970 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.387760 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6" Apr 19 12:42:59.387970 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.387806 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfpzt\" (UniqueName: \"kubernetes.io/projected/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-kube-api-access-tfpzt\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.387970 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.387834 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.387970 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.387943 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.489104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.489001 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.489104 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.489066 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.489456 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.489106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfpzt\" (UniqueName: \"kubernetes.io/projected/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-kube-api-access-tfpzt\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.489456 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.489134 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.489456 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.489194 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.489456 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.489376 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.490037 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.489735 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.490037 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.489979 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.490208 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.490125 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.491966 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.491927 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.494256 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.494220 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.497653 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.497516 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfpzt\" (UniqueName: \"kubernetes.io/projected/982f194d-da77-4b6b-a4fe-9f03dc6bfc42-kube-api-access-tfpzt\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6\" (UID: \"982f194d-da77-4b6b-a4fe-9f03dc6bfc42\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.529943 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.529881 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:42:59.679529 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.679495 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"]
Apr 19 12:42:59.684477 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:42:59.684407 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982f194d_da77_4b6b_a4fe_9f03dc6bfc42.slice/crio-de3250507e0cc6fa782a0c6df14c70217c9343b718c3eb6c69c179abaee97eb9 WatchSource:0}: Error finding container de3250507e0cc6fa782a0c6df14c70217c9343b718c3eb6c69c179abaee97eb9: Status 404 returned error can't find the container with id de3250507e0cc6fa782a0c6df14c70217c9343b718c3eb6c69c179abaee97eb9
Apr 19 12:42:59.879293 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:42:59.879206 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6" event={"ID":"982f194d-da77-4b6b-a4fe-9f03dc6bfc42","Type":"ContainerStarted","Data":"de3250507e0cc6fa782a0c6df14c70217c9343b718c3eb6c69c179abaee97eb9"}
Apr 19 12:43:04.900537 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:04.900499 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" event={"ID":"6ae3159b-5f65-49cd-8408-7527930ef42d","Type":"ContainerStarted","Data":"6f00e88de30f6fc4d028da0bc53c466dd0a83fcf074caa8eef9732d9df18a567"}
Apr 19 12:43:04.906076 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:04.906041 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6" event={"ID":"982f194d-da77-4b6b-a4fe-9f03dc6bfc42","Type":"ContainerStarted","Data":"a84eb33e03b7cf3de11ab51763a4cd30ac71f33a44ab72c9881949bd834945b5"}
Apr 19 12:43:09.926050 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:09.926014 2583 generic.go:358] "Generic (PLEG): container finished" podID="6ae3159b-5f65-49cd-8408-7527930ef42d" containerID="6f00e88de30f6fc4d028da0bc53c466dd0a83fcf074caa8eef9732d9df18a567" exitCode=0
Apr 19 12:43:09.926545 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:09.926095 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" event={"ID":"6ae3159b-5f65-49cd-8408-7527930ef42d","Type":"ContainerDied","Data":"6f00e88de30f6fc4d028da0bc53c466dd0a83fcf074caa8eef9732d9df18a567"}
Apr 19 12:43:09.927650 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:09.927415 2583 generic.go:358] "Generic (PLEG): container finished" podID="982f194d-da77-4b6b-a4fe-9f03dc6bfc42" containerID="a84eb33e03b7cf3de11ab51763a4cd30ac71f33a44ab72c9881949bd834945b5" exitCode=0
Apr 19 12:43:09.927650 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:09.927463 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6" event={"ID":"982f194d-da77-4b6b-a4fe-9f03dc6bfc42","Type":"ContainerDied","Data":"a84eb33e03b7cf3de11ab51763a4cd30ac71f33a44ab72c9881949bd834945b5"}
Apr 19 12:43:11.941135 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:11.941094 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6" event={"ID":"982f194d-da77-4b6b-a4fe-9f03dc6bfc42","Type":"ContainerStarted","Data":"b3c4466bb0eb1bd2b029b1fa3593174fabd13a77568aab8eebb632f25db87e34"}
Apr 19 12:43:11.941533 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:11.941338 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:43:11.942725 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:11.942703 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" event={"ID":"6ae3159b-5f65-49cd-8408-7527930ef42d","Type":"ContainerStarted","Data":"62cbe7630f31b09c3eef6bb90b37c5b0ec308788d7dffd823fe90cc421514429"}
Apr 19 12:43:11.942915 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:11.942901 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm"
Apr 19 12:43:11.956812 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:11.956775 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6" podStartSLOduration=1.561744321 podStartE2EDuration="12.956761942s" podCreationTimestamp="2026-04-19 12:42:59 +0000 UTC" firstStartedPulling="2026-04-19 12:42:59.686586109 +0000 UTC m=+734.924236357" lastFinishedPulling="2026-04-19 12:43:11.081603735 +0000 UTC m=+746.319253978" observedRunningTime="2026-04-19 12:43:11.95591047 +0000 UTC m=+747.193560731" watchObservedRunningTime="2026-04-19 12:43:11.956761942 +0000 UTC m=+747.194412204"
Apr 19 12:43:11.972327 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:11.972292 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm" podStartSLOduration=2.341980201 podStartE2EDuration="14.972282005s" podCreationTimestamp="2026-04-19 12:42:57 +0000 UTC" firstStartedPulling="2026-04-19 12:42:58.45523342 +0000 UTC m=+733.692883659" lastFinishedPulling="2026-04-19 12:43:11.085535223 +0000 UTC m=+746.323185463" observedRunningTime="2026-04-19 12:43:11.971372306 +0000 UTC m=+747.209022567" watchObservedRunningTime="2026-04-19 12:43:11.972282005 +0000 UTC m=+747.209932266"
Apr 19 12:43:22.960686 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:22.960657 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6"
Apr 19 12:43:22.961669 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:43:22.961651 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm"
Apr 19 12:45:00.139788 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:00.139743 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29610045-q8b9l"]
Apr 19 12:45:00.143333 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:00.143308 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l"
Apr 19 12:45:00.145201 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:00.145167 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-bbg64\""
Apr 19 12:45:00.148455 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:00.148421 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610045-q8b9l"]
Apr 19 12:45:00.246287 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:00.246246 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bplf6\" (UniqueName: \"kubernetes.io/projected/235ae275-5cb5-45d2-9105-a093d2d18328-kube-api-access-bplf6\") pod \"maas-api-key-cleanup-29610045-q8b9l\" (UID: \"235ae275-5cb5-45d2-9105-a093d2d18328\") " pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l"
Apr 19 12:45:00.347432 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:00.347385 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bplf6\" (UniqueName: \"kubernetes.io/projected/235ae275-5cb5-45d2-9105-a093d2d18328-kube-api-access-bplf6\") pod \"maas-api-key-cleanup-29610045-q8b9l\" (UID: \"235ae275-5cb5-45d2-9105-a093d2d18328\") " pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l"
Apr 19 12:45:00.355923 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:00.355884 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bplf6\" (UniqueName: \"kubernetes.io/projected/235ae275-5cb5-45d2-9105-a093d2d18328-kube-api-access-bplf6\") pod \"maas-api-key-cleanup-29610045-q8b9l\" (UID: \"235ae275-5cb5-45d2-9105-a093d2d18328\") " pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l"
Apr 19 12:45:00.458744 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:00.458699 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l"
Apr 19 12:45:00.598623 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:00.598588 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610045-q8b9l"]
Apr 19 12:45:00.601416 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:45:00.601371 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod235ae275_5cb5_45d2_9105_a093d2d18328.slice/crio-4a3cb641947090f2f65e382dd2c62946d847b38d958511edd26d5ca31dcf0fd3 WatchSource:0}: Error finding container 4a3cb641947090f2f65e382dd2c62946d847b38d958511edd26d5ca31dcf0fd3: Status 404 returned error can't find the container with id 4a3cb641947090f2f65e382dd2c62946d847b38d958511edd26d5ca31dcf0fd3
Apr 19 12:45:01.329057 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:01.329008 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" event={"ID":"235ae275-5cb5-45d2-9105-a093d2d18328","Type":"ContainerStarted","Data":"4a3cb641947090f2f65e382dd2c62946d847b38d958511edd26d5ca31dcf0fd3"}
Apr 19 12:45:02.328572 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:02.328535 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" event={"ID":"235ae275-5cb5-45d2-9105-a093d2d18328","Type":"ContainerStarted","Data":"deefae0beec8a8256eaea254ae137a83ec33b71a41292c490be6a0208514330e"}
Apr 19 12:45:02.342765 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:02.342712 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" podStartSLOduration=1.815287836 podStartE2EDuration="2.342692204s" podCreationTimestamp="2026-04-19 12:45:00 +0000 UTC" firstStartedPulling="2026-04-19 12:45:00.603625897 +0000 UTC m=+855.841276135" lastFinishedPulling="2026-04-19 12:45:01.131030262 +0000 UTC m=+856.368680503" observedRunningTime="2026-04-19 12:45:02.34175066 +0000 UTC m=+857.579400921" watchObservedRunningTime="2026-04-19 12:45:02.342692204 +0000 UTC m=+857.580342465"
Apr 19 12:45:22.401867 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:22.401811 2583 generic.go:358] "Generic (PLEG): container finished" podID="235ae275-5cb5-45d2-9105-a093d2d18328" containerID="deefae0beec8a8256eaea254ae137a83ec33b71a41292c490be6a0208514330e" exitCode=6
Apr 19 12:45:22.402236 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:22.401915 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" event={"ID":"235ae275-5cb5-45d2-9105-a093d2d18328","Type":"ContainerDied","Data":"deefae0beec8a8256eaea254ae137a83ec33b71a41292c490be6a0208514330e"}
Apr 19 12:45:22.402438 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:22.402418 2583 scope.go:117] "RemoveContainer" containerID="deefae0beec8a8256eaea254ae137a83ec33b71a41292c490be6a0208514330e"
Apr 19 12:45:23.406578 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:23.406550 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" event={"ID":"235ae275-5cb5-45d2-9105-a093d2d18328","Type":"ContainerStarted","Data":"7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643"}
Apr 19 12:45:39.954066 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:39.953967 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7f7fd5f68b-7b26h"]
Apr 19 12:45:39.954593 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:39.954327 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" podUID="613e4691-4af8-4ae1-905e-8a79a12f849c" containerName="manager" containerID="cri-o://18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f" gracePeriod=10
Apr 19 12:45:40.210447 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.210392 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h"
Apr 19 12:45:40.292601 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.292573 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6rmw\" (UniqueName: \"kubernetes.io/projected/613e4691-4af8-4ae1-905e-8a79a12f849c-kube-api-access-p6rmw\") pod \"613e4691-4af8-4ae1-905e-8a79a12f849c\" (UID: \"613e4691-4af8-4ae1-905e-8a79a12f849c\") "
Apr 19 12:45:40.295025 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.295001 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/613e4691-4af8-4ae1-905e-8a79a12f849c-kube-api-access-p6rmw" (OuterVolumeSpecName: "kube-api-access-p6rmw") pod "613e4691-4af8-4ae1-905e-8a79a12f849c" (UID: "613e4691-4af8-4ae1-905e-8a79a12f849c"). InnerVolumeSpecName "kube-api-access-p6rmw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:45:40.393978 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.393949 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p6rmw\" (UniqueName: \"kubernetes.io/projected/613e4691-4af8-4ae1-905e-8a79a12f849c-kube-api-access-p6rmw\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:45:40.464934 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.464877 2583 generic.go:358] "Generic (PLEG): container finished" podID="613e4691-4af8-4ae1-905e-8a79a12f849c" containerID="18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f" exitCode=0
Apr 19 12:45:40.465013 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.464968 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" event={"ID":"613e4691-4af8-4ae1-905e-8a79a12f849c","Type":"ContainerDied","Data":"18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f"}
Apr 19 12:45:40.465013 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.464995 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h" event={"ID":"613e4691-4af8-4ae1-905e-8a79a12f849c","Type":"ContainerDied","Data":"6e2c68a14f55743ff48e16e253df6a4d342d8aeb955cbf5c018a729b4a60e66c"}
Apr 19 12:45:40.465013 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.464997 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7f7fd5f68b-7b26h"
Apr 19 12:45:40.465114 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.465010 2583 scope.go:117] "RemoveContainer" containerID="18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f"
Apr 19 12:45:40.473378 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.473360 2583 scope.go:117] "RemoveContainer" containerID="18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f"
Apr 19 12:45:40.473609 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:45:40.473592 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f\": container with ID starting with 18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f not found: ID does not exist" containerID="18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f"
Apr 19 12:45:40.473668 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.473617 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f"} err="failed to get container status \"18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f\": rpc error: code = NotFound desc = could not find container \"18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f\": container with ID starting with 18653994e8245ab73ac7e260cf0e407aec25521bb309c53287ecd8cc123d8e3f not found: ID does not exist"
Apr 19 12:45:40.484403 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.484378 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7f7fd5f68b-7b26h"]
Apr 19 12:45:40.489130 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:40.489108 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7f7fd5f68b-7b26h"]
Apr 19 12:45:41.155446 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.155412 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7f7fd5f68b-pmkvb"]
Apr 19 12:45:41.155804 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.155791 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="613e4691-4af8-4ae1-905e-8a79a12f849c" containerName="manager"
Apr 19 12:45:41.155804 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.155805 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="613e4691-4af8-4ae1-905e-8a79a12f849c" containerName="manager"
Apr 19 12:45:41.155911 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.155901 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="613e4691-4af8-4ae1-905e-8a79a12f849c" containerName="manager"
Apr 19 12:45:41.160224 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.160207 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb"
Apr 19 12:45:41.162022 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.162004 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-6fc62\""
Apr 19 12:45:41.165414 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.165394 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f7fd5f68b-pmkvb"]
Apr 19 12:45:41.202129 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.202101 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j52gw\" (UniqueName: \"kubernetes.io/projected/cde66671-7a74-43c9-b2c3-a9c15c3f3af5-kube-api-access-j52gw\") pod \"maas-controller-7f7fd5f68b-pmkvb\" (UID: \"cde66671-7a74-43c9-b2c3-a9c15c3f3af5\") " pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb"
Apr 19 12:45:41.303154 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.303129 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j52gw\" (UniqueName: \"kubernetes.io/projected/cde66671-7a74-43c9-b2c3-a9c15c3f3af5-kube-api-access-j52gw\") pod \"maas-controller-7f7fd5f68b-pmkvb\" (UID: \"cde66671-7a74-43c9-b2c3-a9c15c3f3af5\") " pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb"
Apr 19 12:45:41.310570 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.310548 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j52gw\" (UniqueName: \"kubernetes.io/projected/cde66671-7a74-43c9-b2c3-a9c15c3f3af5-kube-api-access-j52gw\") pod \"maas-controller-7f7fd5f68b-pmkvb\" (UID: \"cde66671-7a74-43c9-b2c3-a9c15c3f3af5\") " pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb"
Apr 19 12:45:41.327879 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.327831 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="613e4691-4af8-4ae1-905e-8a79a12f849c" path="/var/lib/kubelet/pods/613e4691-4af8-4ae1-905e-8a79a12f849c/volumes"
Apr 19 12:45:41.471460 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.471435 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb"
Apr 19 12:45:41.592940 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:41.592912 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f7fd5f68b-pmkvb"]
Apr 19 12:45:41.595740 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:45:41.595711 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcde66671_7a74_43c9_b2c3_a9c15c3f3af5.slice/crio-89021361fb9d1489c6a25f2e6324399e94a07c71d4d220c3246c311900377164 WatchSource:0}: Error finding container 89021361fb9d1489c6a25f2e6324399e94a07c71d4d220c3246c311900377164: Status 404 returned error can't find the container with id 89021361fb9d1489c6a25f2e6324399e94a07c71d4d220c3246c311900377164
Apr 19 12:45:42.473801 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:42.473757 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb" event={"ID":"cde66671-7a74-43c9-b2c3-a9c15c3f3af5","Type":"ContainerStarted","Data":"7bad28495d5295eeaf1ea0ba37a4ff03227c067700c1f6b140b9d5282611dea8"}
Apr 19 12:45:42.473801 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:42.473796 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb" event={"ID":"cde66671-7a74-43c9-b2c3-a9c15c3f3af5","Type":"ContainerStarted","Data":"89021361fb9d1489c6a25f2e6324399e94a07c71d4d220c3246c311900377164"}
Apr 19 12:45:42.474283 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:42.473869 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb"
Apr 19 12:45:42.489885 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:42.489821 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb" podStartSLOduration=1.024776465 podStartE2EDuration="1.489809329s" podCreationTimestamp="2026-04-19 12:45:41 +0000 UTC" firstStartedPulling="2026-04-19 12:45:41.597424769 +0000 UTC m=+896.835075011" lastFinishedPulling="2026-04-19 12:45:42.062457636 +0000 UTC m=+897.300107875" observedRunningTime="2026-04-19 12:45:42.488462366 +0000 UTC m=+897.726112626" watchObservedRunningTime="2026-04-19 12:45:42.489809329 +0000 UTC m=+897.727459590"
Apr 19 12:45:43.478872 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:43.478826 2583 generic.go:358] "Generic (PLEG): container finished" podID="235ae275-5cb5-45d2-9105-a093d2d18328" containerID="7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643" exitCode=6
Apr 19 12:45:43.479308 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:43.478892 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" event={"ID":"235ae275-5cb5-45d2-9105-a093d2d18328","Type":"ContainerDied","Data":"7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643"}
Apr 19 12:45:43.479308 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:43.478940 2583 scope.go:117] "RemoveContainer" containerID="deefae0beec8a8256eaea254ae137a83ec33b71a41292c490be6a0208514330e"
Apr 19 12:45:43.479567 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:43.479321 2583 scope.go:117] "RemoveContainer" containerID="7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643"
Apr 19 12:45:43.479567 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:45:43.479539 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29610045-q8b9l_opendatahub(235ae275-5cb5-45d2-9105-a093d2d18328)\"" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" podUID="235ae275-5cb5-45d2-9105-a093d2d18328"
Apr 19 12:45:45.291302 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:45.291277 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:45:45.291939 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:45.291920 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:45:53.485554 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:53.485522 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7f7fd5f68b-pmkvb"
Apr 19 12:45:58.324073 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:58.324036 2583 scope.go:117] "RemoveContainer" containerID="7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643"
Apr 19 12:45:59.545223 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:45:59.545179 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" event={"ID":"235ae275-5cb5-45d2-9105-a093d2d18328","Type":"ContainerStarted","Data":"80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c"}
Apr 19 12:46:00.009800 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:00.009753 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610045-q8b9l"]
Apr 19 12:46:00.548913 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:00.548838 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" containerID="cri-o://80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c" gracePeriod=30
Apr 19 12:46:19.206487 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.206452 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l"
Apr 19 12:46:19.254737 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.254640 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bplf6\" (UniqueName: \"kubernetes.io/projected/235ae275-5cb5-45d2-9105-a093d2d18328-kube-api-access-bplf6\") pod \"235ae275-5cb5-45d2-9105-a093d2d18328\" (UID: \"235ae275-5cb5-45d2-9105-a093d2d18328\") "
Apr 19 12:46:19.257123 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.257077 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235ae275-5cb5-45d2-9105-a093d2d18328-kube-api-access-bplf6" (OuterVolumeSpecName: "kube-api-access-bplf6") pod "235ae275-5cb5-45d2-9105-a093d2d18328" (UID: "235ae275-5cb5-45d2-9105-a093d2d18328"). InnerVolumeSpecName "kube-api-access-bplf6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:46:19.355577 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.355530 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bplf6\" (UniqueName: \"kubernetes.io/projected/235ae275-5cb5-45d2-9105-a093d2d18328-kube-api-access-bplf6\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\""
Apr 19 12:46:19.623285 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.623175 2583 generic.go:358] "Generic (PLEG): container finished" podID="235ae275-5cb5-45d2-9105-a093d2d18328" containerID="80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c" exitCode=6
Apr 19 12:46:19.623285 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.623263 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l"
Apr 19 12:46:19.623499 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.623263 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" event={"ID":"235ae275-5cb5-45d2-9105-a093d2d18328","Type":"ContainerDied","Data":"80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c"}
Apr 19 12:46:19.623499 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.623375 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610045-q8b9l" event={"ID":"235ae275-5cb5-45d2-9105-a093d2d18328","Type":"ContainerDied","Data":"4a3cb641947090f2f65e382dd2c62946d847b38d958511edd26d5ca31dcf0fd3"}
Apr 19 12:46:19.623499 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.623396 2583 scope.go:117] "RemoveContainer" containerID="80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c"
Apr 19 12:46:19.632754 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.632728 2583 scope.go:117] "RemoveContainer" containerID="7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643"
Apr 19 12:46:19.637439 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.637353 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610045-q8b9l"]
Apr 19 12:46:19.639373 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.639346 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610045-q8b9l"]
Apr 19 12:46:19.642537 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.642516 2583 scope.go:117] "RemoveContainer" containerID="80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c"
Apr 19 12:46:19.642840 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:46:19.642809 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c\": container with ID starting with 80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c not found: ID does not exist" containerID="80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c"
Apr 19 12:46:19.642967 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.642861 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c"} err="failed to get container status \"80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c\": rpc error: code = NotFound desc = could not find container \"80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c\": container with ID starting with 80555f7c2587607856c7a1770b96c927c4d69174661a7b4a4d0bdd96bc456e6c not found: ID does not exist"
Apr 19 12:46:19.642967 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.642893 2583 scope.go:117] "RemoveContainer" containerID="7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643"
Apr 19 12:46:19.643183 ip-10-0-140-194 kubenswrapper[2583]: E0419 12:46:19.643162 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643\": container with ID starting with 7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643 not found: ID does not exist" containerID="7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643"
Apr 19 12:46:19.643228 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:19.643189 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643"} err="failed to get container status \"7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643\": rpc error: code = NotFound desc = could not find container \"7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643\": container with ID starting with 7eab769d9a233f6f40897ec76f8d1d865b1b771544cc134bacd2798dc1a91643 not found: ID does not exist"
Apr 19 12:46:21.329106 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:46:21.329067 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" path="/var/lib/kubelet/pods/235ae275-5cb5-45d2-9105-a093d2d18328/volumes"
Apr 19 12:50:45.326032 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:50:45.325998 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:50:45.328680 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:50:45.328653 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:55:45.351574 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:45.351546 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:55:45.354822 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:45.354802 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log"
Apr 19 12:55:54.578176 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:54.578138 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"]
Apr 19 12:55:54.578607 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:54.578391 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2" podUID="3f54722c-3d20-42f6-a950-c57518b156c6" containerName="manager" containerID="cri-o://2780c832617cbf92d35b2975bd88d31901f7a5f0fa248a5438e3c0412accafc9" gracePeriod=10
Apr 19 12:55:54.727543 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:54.727507 2583 generic.go:358] "Generic (PLEG): container finished" podID="3f54722c-3d20-42f6-a950-c57518b156c6" containerID="2780c832617cbf92d35b2975bd88d31901f7a5f0fa248a5438e3c0412accafc9" exitCode=0
Apr 19 12:55:54.727677 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:54.727594 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2" event={"ID":"3f54722c-3d20-42f6-a950-c57518b156c6","Type":"ContainerDied","Data":"2780c832617cbf92d35b2975bd88d31901f7a5f0fa248a5438e3c0412accafc9"}
Apr 19 12:55:54.829270 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:54.829214 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"
Apr 19 12:55:54.979204 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:54.979174 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm9pq\" (UniqueName: \"kubernetes.io/projected/3f54722c-3d20-42f6-a950-c57518b156c6-kube-api-access-nm9pq\") pod \"3f54722c-3d20-42f6-a950-c57518b156c6\" (UID: \"3f54722c-3d20-42f6-a950-c57518b156c6\") "
Apr 19 12:55:54.979351 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:54.979221 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f54722c-3d20-42f6-a950-c57518b156c6-extensions-socket-volume\") pod \"3f54722c-3d20-42f6-a950-c57518b156c6\" (UID: \"3f54722c-3d20-42f6-a950-c57518b156c6\") "
Apr 19 12:55:54.979585 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:54.979556 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/3f54722c-3d20-42f6-a950-c57518b156c6-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "3f54722c-3d20-42f6-a950-c57518b156c6" (UID: "3f54722c-3d20-42f6-a950-c57518b156c6"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:55:54.981402 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:54.981374 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f54722c-3d20-42f6-a950-c57518b156c6-kube-api-access-nm9pq" (OuterVolumeSpecName: "kube-api-access-nm9pq") pod "3f54722c-3d20-42f6-a950-c57518b156c6" (UID: "3f54722c-3d20-42f6-a950-c57518b156c6"). InnerVolumeSpecName "kube-api-access-nm9pq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:55:55.079934 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:55.079878 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nm9pq\" (UniqueName: \"kubernetes.io/projected/3f54722c-3d20-42f6-a950-c57518b156c6-kube-api-access-nm9pq\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:55:55.079934 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:55.079899 2583 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3f54722c-3d20-42f6-a950-c57518b156c6-extensions-socket-volume\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 12:55:55.731764 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:55.731731 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2" event={"ID":"3f54722c-3d20-42f6-a950-c57518b156c6","Type":"ContainerDied","Data":"4e5895ce24dbf98ddcb242388f07288e045d9c5c4eb423d71671a185ef026619"} Apr 19 12:55:55.732197 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:55.731775 2583 scope.go:117] "RemoveContainer" 
containerID="2780c832617cbf92d35b2975bd88d31901f7a5f0fa248a5438e3c0412accafc9" Apr 19 12:55:55.732197 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:55.731745 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2" Apr 19 12:55:55.747319 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:55.747292 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"] Apr 19 12:55:55.750519 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:55.750498 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f58p2"] Apr 19 12:55:57.328322 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:55:57.328287 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f54722c-3d20-42f6-a950-c57518b156c6" path="/var/lib/kubelet/pods/3f54722c-3d20-42f6-a950-c57518b156c6/volumes" Apr 19 12:57:00.619923 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.619885 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd"] Apr 19 12:57:00.620325 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620228 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" Apr 19 12:57:00.620325 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620238 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" Apr 19 12:57:00.620325 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620254 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" Apr 19 12:57:00.620325 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620260 2583 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" Apr 19 12:57:00.620325 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620275 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f54722c-3d20-42f6-a950-c57518b156c6" containerName="manager" Apr 19 12:57:00.620325 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620280 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f54722c-3d20-42f6-a950-c57518b156c6" containerName="manager" Apr 19 12:57:00.620547 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620332 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f54722c-3d20-42f6-a950-c57518b156c6" containerName="manager" Apr 19 12:57:00.620547 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620343 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" Apr 19 12:57:00.620547 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620352 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" Apr 19 12:57:00.620547 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.620358 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" Apr 19 12:57:00.623722 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.623705 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 12:57:00.626027 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.625999 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-wnc7s\"" Apr 19 12:57:00.632436 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.632415 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd"] Apr 19 12:57:00.699176 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.699145 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-67pzd\" (UID: \"9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 12:57:00.699332 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.699200 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bd9b\" (UniqueName: \"kubernetes.io/projected/9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58-kube-api-access-4bd9b\") pod \"kuadrant-operator-controller-manager-55c7f4c975-67pzd\" (UID: \"9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 12:57:00.799836 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.799809 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-67pzd\" (UID: \"9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 12:57:00.799974 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.799906 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bd9b\" (UniqueName: \"kubernetes.io/projected/9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58-kube-api-access-4bd9b\") pod \"kuadrant-operator-controller-manager-55c7f4c975-67pzd\" (UID: \"9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 12:57:00.800156 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.800137 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-67pzd\" (UID: \"9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 12:57:00.810990 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.810966 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bd9b\" (UniqueName: \"kubernetes.io/projected/9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58-kube-api-access-4bd9b\") pod \"kuadrant-operator-controller-manager-55c7f4c975-67pzd\" (UID: \"9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 12:57:00.935608 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:00.935585 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 12:57:01.272120 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:01.272081 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd"] Apr 19 12:57:01.274675 ip-10-0-140-194 kubenswrapper[2583]: W0419 12:57:01.274645 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ad4b9ce_100f_4b9c_bfdb_c8646f38ef58.slice/crio-7e30d73dfbb2ed7da2ce93072bb201c72b5a5102b2868c7b9e7ec2fcaabfe8b0 WatchSource:0}: Error finding container 7e30d73dfbb2ed7da2ce93072bb201c72b5a5102b2868c7b9e7ec2fcaabfe8b0: Status 404 returned error can't find the container with id 7e30d73dfbb2ed7da2ce93072bb201c72b5a5102b2868c7b9e7ec2fcaabfe8b0 Apr 19 12:57:01.277128 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:01.277105 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:57:01.956476 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:01.956433 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" event={"ID":"9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58","Type":"ContainerStarted","Data":"812b44a65726ceb71f1366ad7d7142dc97d7427cb95463831a2adb558343d1d2"} Apr 19 12:57:01.956476 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:01.956472 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" event={"ID":"9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58","Type":"ContainerStarted","Data":"7e30d73dfbb2ed7da2ce93072bb201c72b5a5102b2868c7b9e7ec2fcaabfe8b0"} Apr 19 12:57:01.956969 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:01.956517 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 12:57:01.974871 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:01.974814 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" podStartSLOduration=1.9747978320000001 podStartE2EDuration="1.974797832s" podCreationTimestamp="2026-04-19 12:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:57:01.971972465 +0000 UTC m=+1577.209622727" watchObservedRunningTime="2026-04-19 12:57:01.974797832 +0000 UTC m=+1577.212448093" Apr 19 12:57:12.962311 ip-10-0-140-194 kubenswrapper[2583]: I0419 12:57:12.962283 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-67pzd" Apr 19 13:00:00.131267 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.131228 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29610060-mk7xb"] Apr 19 13:00:00.133788 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.131767 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" Apr 19 13:00:00.133788 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.131784 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="235ae275-5cb5-45d2-9105-a093d2d18328" containerName="cleanup" Apr 19 13:00:00.134734 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.134717 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" Apr 19 13:00:00.136594 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.136576 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-bbg64\"" Apr 19 13:00:00.144471 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.144445 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znlkv\" (UniqueName: \"kubernetes.io/projected/97f07452-84a3-4fa1-bafe-92fc02b9485b-kube-api-access-znlkv\") pod \"maas-api-key-cleanup-29610060-mk7xb\" (UID: \"97f07452-84a3-4fa1-bafe-92fc02b9485b\") " pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" Apr 19 13:00:00.149552 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.149530 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610060-mk7xb"] Apr 19 13:00:00.245346 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.245303 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znlkv\" (UniqueName: \"kubernetes.io/projected/97f07452-84a3-4fa1-bafe-92fc02b9485b-kube-api-access-znlkv\") pod \"maas-api-key-cleanup-29610060-mk7xb\" (UID: \"97f07452-84a3-4fa1-bafe-92fc02b9485b\") " pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" Apr 19 13:00:00.255200 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.255165 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znlkv\" (UniqueName: \"kubernetes.io/projected/97f07452-84a3-4fa1-bafe-92fc02b9485b-kube-api-access-znlkv\") pod \"maas-api-key-cleanup-29610060-mk7xb\" (UID: \"97f07452-84a3-4fa1-bafe-92fc02b9485b\") " pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" Apr 19 13:00:00.444522 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.444493 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" Apr 19 13:00:00.568323 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.568297 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610060-mk7xb"] Apr 19 13:00:00.570374 ip-10-0-140-194 kubenswrapper[2583]: W0419 13:00:00.570346 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f07452_84a3_4fa1_bafe_92fc02b9485b.slice/crio-67a8932034f13e96f3b5cd44f788380e6fc7f9c51c72b7986a416b2906869722 WatchSource:0}: Error finding container 67a8932034f13e96f3b5cd44f788380e6fc7f9c51c72b7986a416b2906869722: Status 404 returned error can't find the container with id 67a8932034f13e96f3b5cd44f788380e6fc7f9c51c72b7986a416b2906869722 Apr 19 13:00:00.596563 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:00.596537 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" event={"ID":"97f07452-84a3-4fa1-bafe-92fc02b9485b","Type":"ContainerStarted","Data":"67a8932034f13e96f3b5cd44f788380e6fc7f9c51c72b7986a416b2906869722"} Apr 19 13:00:01.601223 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:01.601186 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" event={"ID":"97f07452-84a3-4fa1-bafe-92fc02b9485b","Type":"ContainerStarted","Data":"21a4ac9b836c39a2affb0362aa2c9a8726d4c330dba3e82f6878af06b71c3986"} Apr 19 13:00:01.615418 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:01.615371 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" podStartSLOduration=1.6153559849999999 podStartE2EDuration="1.615355985s" podCreationTimestamp="2026-04-19 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 
13:00:01.614463069 +0000 UTC m=+1756.852113334" watchObservedRunningTime="2026-04-19 13:00:01.615355985 +0000 UTC m=+1756.853006293" Apr 19 13:00:21.668307 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:21.668221 2583 generic.go:358] "Generic (PLEG): container finished" podID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerID="21a4ac9b836c39a2affb0362aa2c9a8726d4c330dba3e82f6878af06b71c3986" exitCode=6 Apr 19 13:00:21.668307 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:21.668291 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" event={"ID":"97f07452-84a3-4fa1-bafe-92fc02b9485b","Type":"ContainerDied","Data":"21a4ac9b836c39a2affb0362aa2c9a8726d4c330dba3e82f6878af06b71c3986"} Apr 19 13:00:21.668740 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:21.668624 2583 scope.go:117] "RemoveContainer" containerID="21a4ac9b836c39a2affb0362aa2c9a8726d4c330dba3e82f6878af06b71c3986" Apr 19 13:00:22.673313 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:22.673268 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" event={"ID":"97f07452-84a3-4fa1-bafe-92fc02b9485b","Type":"ContainerStarted","Data":"eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb"} Apr 19 13:00:42.744345 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:42.744312 2583 generic.go:358] "Generic (PLEG): container finished" podID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerID="eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb" exitCode=6 Apr 19 13:00:42.744769 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:42.744386 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" event={"ID":"97f07452-84a3-4fa1-bafe-92fc02b9485b","Type":"ContainerDied","Data":"eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb"} Apr 19 13:00:42.744769 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:42.744432 2583 
scope.go:117] "RemoveContainer" containerID="21a4ac9b836c39a2affb0362aa2c9a8726d4c330dba3e82f6878af06b71c3986" Apr 19 13:00:42.744769 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:42.744717 2583 scope.go:117] "RemoveContainer" containerID="eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb" Apr 19 13:00:42.744974 ip-10-0-140-194 kubenswrapper[2583]: E0419 13:00:42.744954 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29610060-mk7xb_opendatahub(97f07452-84a3-4fa1-bafe-92fc02b9485b)\"" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" Apr 19 13:00:45.377797 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:45.377710 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log" Apr 19 13:00:45.382996 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:45.382975 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log" Apr 19 13:00:56.323286 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:56.323254 2583 scope.go:117] "RemoveContainer" containerID="eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb" Apr 19 13:00:56.797603 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:56.797571 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" event={"ID":"97f07452-84a3-4fa1-bafe-92fc02b9485b","Type":"ContainerStarted","Data":"6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b"} Apr 19 13:00:57.346737 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:57.346706 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610060-mk7xb"] 
Apr 19 13:00:57.801159 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:00:57.801116 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerName="cleanup" containerID="cri-o://6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b" gracePeriod=30 Apr 19 13:01:17.047706 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.047682 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" Apr 19 13:01:17.092388 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.092348 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znlkv\" (UniqueName: \"kubernetes.io/projected/97f07452-84a3-4fa1-bafe-92fc02b9485b-kube-api-access-znlkv\") pod \"97f07452-84a3-4fa1-bafe-92fc02b9485b\" (UID: \"97f07452-84a3-4fa1-bafe-92fc02b9485b\") " Apr 19 13:01:17.094923 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.094888 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f07452-84a3-4fa1-bafe-92fc02b9485b-kube-api-access-znlkv" (OuterVolumeSpecName: "kube-api-access-znlkv") pod "97f07452-84a3-4fa1-bafe-92fc02b9485b" (UID: "97f07452-84a3-4fa1-bafe-92fc02b9485b"). InnerVolumeSpecName "kube-api-access-znlkv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 13:01:17.194017 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.193962 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znlkv\" (UniqueName: \"kubernetes.io/projected/97f07452-84a3-4fa1-bafe-92fc02b9485b-kube-api-access-znlkv\") on node \"ip-10-0-140-194.ec2.internal\" DevicePath \"\"" Apr 19 13:01:17.872995 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.872953 2583 generic.go:358] "Generic (PLEG): container finished" podID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerID="6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b" exitCode=6 Apr 19 13:01:17.873164 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.873019 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" Apr 19 13:01:17.873164 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.873039 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" event={"ID":"97f07452-84a3-4fa1-bafe-92fc02b9485b","Type":"ContainerDied","Data":"6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b"} Apr 19 13:01:17.873164 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.873079 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610060-mk7xb" event={"ID":"97f07452-84a3-4fa1-bafe-92fc02b9485b","Type":"ContainerDied","Data":"67a8932034f13e96f3b5cd44f788380e6fc7f9c51c72b7986a416b2906869722"} Apr 19 13:01:17.873164 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.873097 2583 scope.go:117] "RemoveContainer" containerID="6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b" Apr 19 13:01:17.881633 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.881616 2583 scope.go:117] "RemoveContainer" containerID="eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb" Apr 19 13:01:17.887258 ip-10-0-140-194 
kubenswrapper[2583]: I0419 13:01:17.887232 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610060-mk7xb"] Apr 19 13:01:17.889921 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.889899 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610060-mk7xb"] Apr 19 13:01:17.893477 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.893453 2583 scope.go:117] "RemoveContainer" containerID="6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b" Apr 19 13:01:17.893995 ip-10-0-140-194 kubenswrapper[2583]: E0419 13:01:17.893976 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b\": container with ID starting with 6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b not found: ID does not exist" containerID="6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b" Apr 19 13:01:17.894086 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.894004 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b"} err="failed to get container status \"6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b\": rpc error: code = NotFound desc = could not find container \"6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b\": container with ID starting with 6b92f96d27b5405891d178066b53316eb07bbe5c493a5d6f24f357231f90603b not found: ID does not exist" Apr 19 13:01:17.894086 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.894039 2583 scope.go:117] "RemoveContainer" containerID="eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb" Apr 19 13:01:17.894308 ip-10-0-140-194 kubenswrapper[2583]: E0419 13:01:17.894291 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb\": container with ID starting with eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb not found: ID does not exist" containerID="eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb" Apr 19 13:01:17.894354 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:17.894313 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb"} err="failed to get container status \"eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb\": rpc error: code = NotFound desc = could not find container \"eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb\": container with ID starting with eb6656eb9a66f7b6f61b68268eef818d82e477fd5ded01cb1665e99b1cc3c6cb not found: ID does not exist" Apr 19 13:01:19.327796 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:01:19.327763 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" path="/var/lib/kubelet/pods/97f07452-84a3-4fa1-bafe-92fc02b9485b/volumes" Apr 19 13:05:45.407484 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:05:45.407450 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log" Apr 19 13:05:45.412626 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:05:45.412606 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log" Apr 19 13:06:34.944552 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:34.944468 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7f7fd5f68b-pmkvb_cde66671-7a74-43c9-b2c3-a9c15c3f3af5/manager/0.log" Apr 19 13:06:35.264673 ip-10-0-140-194 
kubenswrapper[2583]: I0419 13:06:35.264597 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9ff869b6b-vkdlw_0c584614-1441-4ab1-a7c3-1df91d5bd84e/manager/0.log" Apr 19 13:06:36.792752 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:36.792717 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-6w9nl_ef035de9-8a2d-49a1-be4a-4f9b3b926959/manager/0.log" Apr 19 13:06:36.906617 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:36.906586 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-zgvtb_937c503e-246e-490f-9c8e-970ece3e2265/manager/0.log" Apr 19 13:06:37.126232 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:37.126164 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-bf79f_08e3e49e-bd5d-4465-a80f-4ec32e63b636/registry-server/0.log" Apr 19 13:06:37.253295 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:37.253264 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-67pzd_9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58/manager/0.log" Apr 19 13:06:37.833044 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:37.833014 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cffmj22_6550e017-0db5-4e24-bb67-58828cfb90dd/istio-proxy/0.log" Apr 19 13:06:38.261271 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:38.261239 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-r522j_7e385182-92c1-4522-b9e6-1aba2dea9c27/istio-proxy/0.log" Apr 19 13:06:38.365987 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:38.365960 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-68bf8db9c6-c69fm_8f6bf2fb-d79a-44fa-bb49-34879196ab43/router/0.log" Apr 19 13:06:38.680235 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:38.680195 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6_982f194d-da77-4b6b-a4fe-9f03dc6bfc42/storage-initializer/0.log" Apr 19 13:06:38.686979 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:38.686952 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-lbth6_982f194d-da77-4b6b-a4fe-9f03dc6bfc42/main/0.log" Apr 19 13:06:39.018000 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:39.017921 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm_6ae3159b-5f65-49cd-8408-7527930ef42d/storage-initializer/0.log" Apr 19 13:06:39.027339 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:39.027311 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc4pznm_6ae3159b-5f65-49cd-8408-7527930ef42d/main/0.log" Apr 19 13:06:45.933737 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:45.933710 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jshbm_264963bd-7b49-4b81-8bf7-ed2bb6b6df20/global-pull-secret-syncer/0.log" Apr 19 13:06:46.071738 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:46.071709 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vqdpz_dc6c48c7-3a5b-4289-8f49-b667f1badbea/konnectivity-agent/0.log" Apr 19 13:06:46.114763 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:46.114737 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-194.ec2.internal_45b611433db72785c05b5ca89b4fe28f/haproxy/0.log" Apr 19 13:06:50.358629 ip-10-0-140-194 kubenswrapper[2583]: 
I0419 13:06:50.358552 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-6w9nl_ef035de9-8a2d-49a1-be4a-4f9b3b926959/manager/0.log" Apr 19 13:06:50.386160 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:50.386131 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-zgvtb_937c503e-246e-490f-9c8e-970ece3e2265/manager/0.log" Apr 19 13:06:50.446387 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:50.446363 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-bf79f_08e3e49e-bd5d-4465-a80f-4ec32e63b636/registry-server/0.log" Apr 19 13:06:50.507631 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:50.507599 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-67pzd_9ad4b9ce-100f-4b9c-bfdb-c8646f38ef58/manager/0.log" Apr 19 13:06:52.321628 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.321601 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8hxmd_588e865e-56c0-4ff3-aebf-a1f18329ce04/node-exporter/0.log" Apr 19 13:06:52.342199 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.342179 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8hxmd_588e865e-56c0-4ff3-aebf-a1f18329ce04/kube-rbac-proxy/0.log" Apr 19 13:06:52.363019 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.363001 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8hxmd_588e865e-56c0-4ff3-aebf-a1f18329ce04/init-textfile/0.log" Apr 19 13:06:52.617186 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.617092 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2d481966-39bc-49fb-9c52-0fb57e05898e/prometheus/0.log" Apr 19 13:06:52.637641 ip-10-0-140-194 
kubenswrapper[2583]: I0419 13:06:52.637618 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2d481966-39bc-49fb-9c52-0fb57e05898e/config-reloader/0.log" Apr 19 13:06:52.658615 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.658571 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2d481966-39bc-49fb-9c52-0fb57e05898e/thanos-sidecar/0.log" Apr 19 13:06:52.679629 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.679606 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2d481966-39bc-49fb-9c52-0fb57e05898e/kube-rbac-proxy-web/0.log" Apr 19 13:06:52.701114 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.701090 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2d481966-39bc-49fb-9c52-0fb57e05898e/kube-rbac-proxy/0.log" Apr 19 13:06:52.723048 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.723030 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2d481966-39bc-49fb-9c52-0fb57e05898e/kube-rbac-proxy-thanos/0.log" Apr 19 13:06:52.744231 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.744210 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2d481966-39bc-49fb-9c52-0fb57e05898e/init-config-reloader/0.log" Apr 19 13:06:52.934831 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.934799 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-658448b984-6hs44_45f0f7dc-3b40-4d18-843e-456c4cad83c2/thanos-query/0.log" Apr 19 13:06:52.955359 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.955339 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-658448b984-6hs44_45f0f7dc-3b40-4d18-843e-456c4cad83c2/kube-rbac-proxy-web/0.log" Apr 19 13:06:52.975294 ip-10-0-140-194 
kubenswrapper[2583]: I0419 13:06:52.975273 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-658448b984-6hs44_45f0f7dc-3b40-4d18-843e-456c4cad83c2/kube-rbac-proxy/0.log" Apr 19 13:06:52.995871 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:52.995835 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-658448b984-6hs44_45f0f7dc-3b40-4d18-843e-456c4cad83c2/prom-label-proxy/0.log" Apr 19 13:06:53.016382 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:53.016360 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-658448b984-6hs44_45f0f7dc-3b40-4d18-843e-456c4cad83c2/kube-rbac-proxy-rules/0.log" Apr 19 13:06:53.036490 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:53.036467 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-658448b984-6hs44_45f0f7dc-3b40-4d18-843e-456c4cad83c2/kube-rbac-proxy-metrics/0.log" Apr 19 13:06:54.743675 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.743568 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5849/perf-node-gather-daemonset-blffd"] Apr 19 13:06:54.744175 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.744149 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerName="cleanup" Apr 19 13:06:54.744175 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.744169 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerName="cleanup" Apr 19 13:06:54.744300 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.744189 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerName="cleanup" Apr 19 13:06:54.744300 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.744198 2583 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerName="cleanup" Apr 19 13:06:54.744300 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.744207 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerName="cleanup" Apr 19 13:06:54.744300 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.744216 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerName="cleanup" Apr 19 13:06:54.744543 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.744307 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerName="cleanup" Apr 19 13:06:54.744543 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.744322 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="97f07452-84a3-4fa1-bafe-92fc02b9485b" containerName="cleanup" Apr 19 13:06:54.747512 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.747493 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.749898 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.749707 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5849\"/\"openshift-service-ca.crt\"" Apr 19 13:06:54.750185 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.750164 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f5849\"/\"default-dockercfg-fxnnq\"" Apr 19 13:06:54.750282 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.750180 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5849\"/\"kube-root-ca.crt\"" Apr 19 13:06:54.751929 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.751909 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5849/perf-node-gather-daemonset-blffd"] Apr 19 13:06:54.860328 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.860293 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqsz8\" (UniqueName: \"kubernetes.io/projected/17f934b2-351e-4d2a-8485-6da000d7d41f-kube-api-access-pqsz8\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.860546 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.860337 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-lib-modules\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.860546 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.860447 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-podres\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.860546 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.860517 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-proc\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.860692 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.860557 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-sys\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.961573 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.961535 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-podres\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.961736 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.961585 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-proc\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " 
pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.961736 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.961610 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-sys\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.961736 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.961646 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqsz8\" (UniqueName: \"kubernetes.io/projected/17f934b2-351e-4d2a-8485-6da000d7d41f-kube-api-access-pqsz8\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.961736 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.961671 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-lib-modules\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.961736 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.961688 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-podres\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.961736 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.961729 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-proc\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.962029 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.961740 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-sys\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.962029 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.961842 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17f934b2-351e-4d2a-8485-6da000d7d41f-lib-modules\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:54.968558 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:54.968534 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqsz8\" (UniqueName: \"kubernetes.io/projected/17f934b2-351e-4d2a-8485-6da000d7d41f-kube-api-access-pqsz8\") pod \"perf-node-gather-daemonset-blffd\" (UID: \"17f934b2-351e-4d2a-8485-6da000d7d41f\") " pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:55.058309 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:55.058227 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:55.075667 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:55.075636 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6677f5ccc8-jqsg4_be80a5d9-5649-4446-bf8c-c80a0a9d3e9a/console/0.log" Apr 19 13:06:55.181289 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:55.181262 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5849/perf-node-gather-daemonset-blffd"] Apr 19 13:06:55.183245 ip-10-0-140-194 kubenswrapper[2583]: W0419 13:06:55.183207 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod17f934b2_351e_4d2a_8485_6da000d7d41f.slice/crio-0381c2c2028dcbc3da2f43c79fd89a81cc551e69a8c3c0f59635644f714f7a54 WatchSource:0}: Error finding container 0381c2c2028dcbc3da2f43c79fd89a81cc551e69a8c3c0f59635644f714f7a54: Status 404 returned error can't find the container with id 0381c2c2028dcbc3da2f43c79fd89a81cc551e69a8c3c0f59635644f714f7a54 Apr 19 13:06:55.184913 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:55.184895 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 13:06:55.617161 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:55.617128 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-glrsn_a7a09300-5b01-42d1-9c9e-64749c7103a2/volume-data-source-validator/0.log" Apr 19 13:06:56.064421 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:56.064379 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" event={"ID":"17f934b2-351e-4d2a-8485-6da000d7d41f","Type":"ContainerStarted","Data":"024ced9c76d67b62a3ea31a5d1a83eec9e76741715d15ccd231d4d6ba73f4a80"} Apr 19 13:06:56.064421 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:56.064418 2583 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" event={"ID":"17f934b2-351e-4d2a-8485-6da000d7d41f","Type":"ContainerStarted","Data":"0381c2c2028dcbc3da2f43c79fd89a81cc551e69a8c3c0f59635644f714f7a54"} Apr 19 13:06:56.064901 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:56.064454 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:06:56.078122 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:56.078080 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" podStartSLOduration=2.078066471 podStartE2EDuration="2.078066471s" podCreationTimestamp="2026-04-19 13:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 13:06:56.077447853 +0000 UTC m=+2171.315098115" watchObservedRunningTime="2026-04-19 13:06:56.078066471 +0000 UTC m=+2171.315716732" Apr 19 13:06:56.520569 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:56.520538 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-z7glk_6d466489-3e13-462c-b90d-4b13b586caae/dns/0.log" Apr 19 13:06:56.540982 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:56.540958 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-z7glk_6d466489-3e13-462c-b90d-4b13b586caae/kube-rbac-proxy/0.log" Apr 19 13:06:56.585828 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:56.585805 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w56w7_5d31c3f1-1682-400f-9db4-ef1c50b1f94d/dns-node-resolver/0.log" Apr 19 13:06:57.095716 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:57.095687 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-wtzrr_6ad204b3-eb17-4f25-b1ef-6950791a05cd/node-ca/0.log" Apr 19 13:06:57.859020 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:57.858993 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cffmj22_6550e017-0db5-4e24-bb67-58828cfb90dd/istio-proxy/0.log" Apr 19 13:06:58.050104 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:58.050071 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-r522j_7e385182-92c1-4522-b9e6-1aba2dea9c27/istio-proxy/0.log" Apr 19 13:06:58.071248 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:58.071217 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-68bf8db9c6-c69fm_8f6bf2fb-d79a-44fa-bb49-34879196ab43/router/0.log" Apr 19 13:06:58.601423 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:58.601391 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vlsjm_e9a634b4-9352-4e62-926c-390e9b19d228/serve-healthcheck-canary/0.log" Apr 19 13:06:59.014550 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:59.014516 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-kxff4_a7f8c09c-afa9-4393-b47b-0e8efface148/insights-operator/0.log" Apr 19 13:06:59.014748 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:59.014656 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-kxff4_a7f8c09c-afa9-4393-b47b-0e8efface148/insights-operator/1.log" Apr 19 13:06:59.164150 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:59.164119 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wqsqg_b159597a-1990-49de-af45-edc96be184cd/kube-rbac-proxy/0.log" Apr 19 13:06:59.185482 ip-10-0-140-194 
kubenswrapper[2583]: I0419 13:06:59.185455 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wqsqg_b159597a-1990-49de-af45-edc96be184cd/exporter/0.log" Apr 19 13:06:59.208316 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:06:59.208291 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wqsqg_b159597a-1990-49de-af45-edc96be184cd/extractor/0.log" Apr 19 13:07:01.086336 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:01.086303 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7f7fd5f68b-pmkvb_cde66671-7a74-43c9-b2c3-a9c15c3f3af5/manager/0.log" Apr 19 13:07:01.165993 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:01.165964 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9ff869b6b-vkdlw_0c584614-1441-4ab1-a7c3-1df91d5bd84e/manager/0.log" Apr 19 13:07:02.080299 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:02.080274 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f5849/perf-node-gather-daemonset-blffd" Apr 19 13:07:02.325824 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:02.325794 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-844f57dbd6-9q4qq_d39c5544-0e4d-468b-b2d9-635b760bb77b/manager/0.log" Apr 19 13:07:02.373741 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:02.373676 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-f7skn_0607ed45-9366-4f10-8352-5a266131218d/openshift-lws-operator/0.log" Apr 19 13:07:06.751802 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:06.751775 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wgbsg_917f4be8-d4ee-4f39-b77b-59c9da8491e2/migrator/0.log" 
Apr 19 13:07:06.772068 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:06.772028 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wgbsg_917f4be8-d4ee-4f39-b77b-59c9da8491e2/graceful-termination/0.log" Apr 19 13:07:08.526473 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:08.526445 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgwkw_8c38afde-25f7-4408-bdb5-22a5ea2b4c03/kube-multus-additional-cni-plugins/0.log" Apr 19 13:07:08.568298 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:08.568270 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgwkw_8c38afde-25f7-4408-bdb5-22a5ea2b4c03/egress-router-binary-copy/0.log" Apr 19 13:07:08.610386 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:08.610354 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgwkw_8c38afde-25f7-4408-bdb5-22a5ea2b4c03/cni-plugins/0.log" Apr 19 13:07:08.654220 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:08.654187 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgwkw_8c38afde-25f7-4408-bdb5-22a5ea2b4c03/bond-cni-plugin/0.log" Apr 19 13:07:08.694276 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:08.694252 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgwkw_8c38afde-25f7-4408-bdb5-22a5ea2b4c03/routeoverride-cni/0.log" Apr 19 13:07:08.735324 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:08.735302 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgwkw_8c38afde-25f7-4408-bdb5-22a5ea2b4c03/whereabouts-cni-bincopy/0.log" Apr 19 13:07:08.776552 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:08.776467 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgwkw_8c38afde-25f7-4408-bdb5-22a5ea2b4c03/whereabouts-cni/0.log" Apr 19 13:07:08.825653 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:08.825624 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bh5jh_ddcdec9a-0940-4c7c-8298-ee39ccec754e/kube-multus/0.log" Apr 19 13:07:09.059455 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:09.059366 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lmdfj_fb73e6b2-9f0a-4bcf-9371-0d399622fe97/network-metrics-daemon/0.log" Apr 19 13:07:09.100532 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:09.100483 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lmdfj_fb73e6b2-9f0a-4bcf-9371-0d399622fe97/kube-rbac-proxy/0.log" Apr 19 13:07:10.149316 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:10.149288 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-controller/0.log" Apr 19 13:07:10.167169 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:10.167144 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/0.log" Apr 19 13:07:10.179735 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:10.179709 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovn-acl-logging/1.log" Apr 19 13:07:10.200325 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:10.200304 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/kube-rbac-proxy-node/0.log" Apr 19 13:07:10.221383 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:10.221359 2583 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/kube-rbac-proxy-ovn-metrics/0.log" Apr 19 13:07:10.239503 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:10.239483 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/northd/0.log" Apr 19 13:07:10.258374 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:10.258347 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/nbdb/0.log" Apr 19 13:07:10.278891 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:10.278864 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/sbdb/0.log" Apr 19 13:07:10.377867 ip-10-0-140-194 kubenswrapper[2583]: I0419 13:07:10.377827 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pvnl8_59537546-a323-4987-9ad2-4ce6e8f679c8/ovnkube-controller/0.log"