Apr 22 14:13:13.704518 ip-10-0-130-255 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 14:13:13.704529 ip-10-0-130-255 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 14:13:13.704537 ip-10-0-130-255 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 14:13:13.704771 ip-10-0-130-255 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 14:13:25.211853 ip-10-0-130-255 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 14:13:25.211868 ip-10-0-130-255 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b75303af55dd40428f867045963d7f5a --
Apr 22 14:15:52.924761 ip-10-0-130-255 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:15:53.277730 ip-10-0-130-255 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:53.277730 ip-10-0-130-255 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:15:53.277730 ip-10-0-130-255 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:53.277730 ip-10-0-130-255 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:15:53.277730 ip-10-0-130-255 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:53.279822 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.279729 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:15:53.284348 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284324 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:53.284348 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284345 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:53.284348 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284349 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:53.284348 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284352 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:53.284348 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284355 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284358 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284361 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284364 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284366 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284369 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284372 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284375 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284377 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284380 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284385 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284390 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284393 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284396 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284399 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284402 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284405 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284407 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284410 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284413 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:53.284557 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284415 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284425 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284428 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284431 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284433 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284436 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284438 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284441 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284443 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284446 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284462 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284465 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284468 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284470 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284474 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284478 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284482 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284486 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284490 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284494 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:53.285046 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284497 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284499 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284503 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284506 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284509 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284511 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284514 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284517 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284520 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284522 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284525 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284527 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284530 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284533 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284537 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284541 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284544 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284547 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284549 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284552 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:53.285586 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284555 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284558 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284560 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284563 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284565 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284568 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284572 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284576 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284579 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284581 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284584 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284587 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284589 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284592 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284601 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284605 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284607 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284610 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284613 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:53.286058 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284615 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284618 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.284620 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285014 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285020 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285022 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285025 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285028 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285030 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285033 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285036 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285038 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285041 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285043 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285046 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285049 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285052 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285055 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285058 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285061 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:53.286542 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285064 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285066 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285069 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285071 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285074 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285076 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285079 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285085 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285088 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285091 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285093 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285096 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285099 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285101 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285103 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285106 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285108 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285111 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285113 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285116 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:53.287027 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285118 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285123 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285126 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285130 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285133 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285136 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285139 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285142 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285146 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285150 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285153 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285156 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285159 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285161 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285164 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285166 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285169 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285171 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:53.287632 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285174 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285176 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285179 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285182 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285185 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285187 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285190 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285192 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285199 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285202 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285204 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285207 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285209 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285212 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285214 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285216 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285219 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285222 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285225 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285227 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:53.288081 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285230 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285232 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285235 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285238 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285241 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285244 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285246 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285249 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285251 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285254 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.285256 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285893 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285902 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285909 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285914 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285919 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285923 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285927 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285931 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285934 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285938 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285941 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:15:53.288577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285945 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285948 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285951 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285954 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285957 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285959 2579 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285962 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285965 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285970 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285973 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285976 2579 flags.go:64] FLAG: --config-dir=""
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285980 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285983 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285991 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285994 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.285997 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286001 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286003 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286006 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286009 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286013 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286016 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286020 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286023 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286026 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:15:53.289112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286029 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286032 2579 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286035 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286040 2579 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286043 2579 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286046 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286050 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286052 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286056 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286059 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286062 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286066 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286069 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286072 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286075 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286079 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286082 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286085 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 14:15:53.289723 ip-10-0-130-255
kubenswrapper[2579]: I0422 14:15:53.286088 2579 flags.go:64] FLAG: --feature-gates="" Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286091 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286096 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286099 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286102 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286105 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286108 2579 flags.go:64] FLAG: --help="false" Apr 22 14:15:53.289723 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286111 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-130-255.ec2.internal" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286114 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286117 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286120 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286124 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286127 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286130 2579 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286133 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286136 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286139 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286142 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286145 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286149 2579 flags.go:64] FLAG: --kube-reserved="" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286152 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286155 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286158 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286160 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286163 2579 flags.go:64] FLAG: --lock-file="" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286166 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286169 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286173 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: 
I0422 14:15:53.286178 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286181 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286184 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 14:15:53.290310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286187 2579 flags.go:64] FLAG: --logging-format="text" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286190 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286193 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286197 2579 flags.go:64] FLAG: --manifest-url="" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286200 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286204 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286208 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286212 2579 flags.go:64] FLAG: --max-pods="110" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286215 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286218 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286221 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286224 2579 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286227 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286230 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286233 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286240 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286243 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286246 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286250 2579 flags.go:64] FLAG: --pod-cidr="" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286253 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286258 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286261 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286264 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286267 2579 flags.go:64] FLAG: --port="10250" Apr 22 14:15:53.290948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286270 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286273 
2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e2198d5d6726e8da" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286276 2579 flags.go:64] FLAG: --qos-reserved="" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286279 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286283 2579 flags.go:64] FLAG: --register-node="true" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286285 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286289 2579 flags.go:64] FLAG: --register-with-taints="" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286292 2579 flags.go:64] FLAG: --registry-burst="10" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286295 2579 flags.go:64] FLAG: --registry-qps="5" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286298 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286301 2579 flags.go:64] FLAG: --reserved-memory="" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286304 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286308 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286312 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286314 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286317 2579 flags.go:64] FLAG: --runonce="false" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286320 2579 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286323 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286326 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286329 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286332 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286335 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286339 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286342 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286344 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286347 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 14:15:53.291540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286351 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286354 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286357 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286360 2579 flags.go:64] FLAG: --system-cgroups="" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286363 2579 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286368 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286371 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286374 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286379 2579 flags.go:64] FLAG: --tls-min-version="" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286382 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286385 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286388 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286391 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286394 2579 flags.go:64] FLAG: --v="2" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286399 2579 flags.go:64] FLAG: --version="false" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286403 2579 flags.go:64] FLAG: --vmodule="" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286407 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.286411 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286519 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:53.292166 
ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286523 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286526 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286529 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286532 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:53.292166 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286535 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286537 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286540 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286543 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286545 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286548 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286551 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286554 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286557 2579 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286560 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286563 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286567 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286570 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286573 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286576 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286578 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286581 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286584 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286586 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286589 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:53.292744 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286592 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform 
Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286594 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286597 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286599 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286602 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286605 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286607 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286611 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286613 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286616 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286619 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286622 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286625 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286629 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286632 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286635 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286638 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286640 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286643 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:53.293252 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286646 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286648 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286652 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286655 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286657 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286660 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 
14:15:53.286663 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286665 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286668 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286670 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286673 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286676 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286678 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286681 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286683 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286686 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286688 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286691 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286693 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:53.293740 
ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286696 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:53.293740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286699 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286702 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286704 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286707 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286710 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286712 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286715 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286717 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286720 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286722 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286725 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286727 2579 feature_gate.go:328] unrecognized feature 
gate: InsightsOnDemandDataGather
Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286730 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286733 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286736 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286739 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286741 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286744 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286747 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286750 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:53.294267 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286752 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.286755 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.287343 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.293668 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.293784 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293834 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293839 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293843 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293846 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293849 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293853 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293855 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293858 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293860 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293863 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293866 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:53.294785 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293869 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293871 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293874 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293876 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293879 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293882 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293885 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293887 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293890 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293892 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293895 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293899 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293903 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293906 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293916 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293919 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293921 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293925 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293928 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:53.295185 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293932 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293936 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293939 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293942 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293945 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293947 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293951 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293954 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293957 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293959 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293962 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293965 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293968 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293971 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293974 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293976 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293979 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293982 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293984 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:53.295740 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293987 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293990 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293992 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293995 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.293998 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294000 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294003 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294005 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294008 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294012 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294015 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294017 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294020 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294023 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294026 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294028 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294031 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294034 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294036 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294039 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:53.296191 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294041 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294044 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294046 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294049 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294052 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294054 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294057 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294059 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294062 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294065 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294067 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294070 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294072 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294075 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294077 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294080 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:53.296689 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294082 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.294087 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294198 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294202 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294205 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294208 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294211 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294214 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294217 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294219 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294222 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294225 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294228 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294230 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294233 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:53.297083 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294236 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294239 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294241 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294244 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294246 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294249 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294251 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294254 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294256 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294259 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294261 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294264 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294266 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294269 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294272 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294275 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294277 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294280 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294283 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:53.297471 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294285 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294288 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294290 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294293 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294295 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294298 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294301 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294303 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294306 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294309 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294312 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294315 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294318 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294320 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294322 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294326 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294330 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294332 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294335 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:53.297938 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294338 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294342 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294345 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294348 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294351 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294354 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294357 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294359 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294362 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294364 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294367 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294370 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294373 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294375 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294378 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294381 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294384 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294386 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294389 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294395 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:53.298397 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294398 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294400 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294403 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294406 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294408 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294411 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294413 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294416 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294418 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294421 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294423 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294426 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294428 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294431 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:53.294433 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:53.298895 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.294438 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:53.299263 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.295092 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 14:15:53.299423 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.299408 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 14:15:53.300228 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.300217 2579 server.go:1019] "Starting client certificate rotation"
Apr 22 14:15:53.300333 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.300316 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:15:53.300368 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.300358 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 14:15:53.320554 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.320532 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:15:53.322251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.322216 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 14:15:53.339328 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.339301 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 22 14:15:53.344163 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.344146 2579 log.go:25] "Validated CRI v1 image API"
Apr 22 14:15:53.345409 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.345390 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 14:15:53.349435 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.349411 2579 fs.go:135] Filesystem UUIDs: map[1716e8f2-e0f4-4337-8446-0dc7edeee8d5:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8cdcf93c-2c9d-4d7c-b1e9-0a0f377ab82b:/dev/nvme0n1p4]
Apr 22 14:15:53.349511 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.349433 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 14:15:53.352751 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.352731 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:53.355265 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.355155 2579 manager.go:217] Machine: {Timestamp:2026-04-22 14:15:53.353583353 +0000 UTC m=+0.329548784 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100250 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d5c5f9edabbe9679a55cd9624385c SystemUUID:ec2d5c5f-9eda-bbe9-679a-55cd9624385c BootID:b75303af-55dd-4042-8f86-7045963d7f5a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:50:5e:4e:c3:49 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:50:5e:4e:c3:49 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:07:29:e8:5a:d3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 14:15:53.355640 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.355630 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 14:15:53.355768 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.355756 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 14:15:53.356070 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.356048 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 14:15:53.356223 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.356072 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-255.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinRec
laim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 14:15:53.356269 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.356233 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 14:15:53.356269 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.356241 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 14:15:53.356269 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.356254 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:53.357443 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.357432 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:53.358712 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.358702 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:53.358820 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.358811 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 14:15:53.360736 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.360727 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 22 14:15:53.360779 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.360746 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 14:15:53.360779 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.360757 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 
14:15:53.360779 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.360767 2579 kubelet.go:397] "Adding apiserver pod source" Apr 22 14:15:53.360892 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.360781 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 14:15:53.361654 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.361642 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:53.361727 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.361659 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:53.365156 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.365136 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 14:15:53.366826 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.366804 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 14:15:53.368338 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368325 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 14:15:53.368389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368342 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 14:15:53.368389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368348 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 14:15:53.368389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368354 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 14:15:53.368389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368360 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 14:15:53.368389 ip-10-0-130-255 kubenswrapper[2579]: 
I0422 14:15:53.368366 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 14:15:53.368389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368374 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 14:15:53.368389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368382 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 14:15:53.368389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368389 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 14:15:53.368625 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368395 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 14:15:53.368625 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368404 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 14:15:53.368625 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.368412 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 14:15:53.369119 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.369108 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 14:15:53.369152 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.369120 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 14:15:53.371243 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.371215 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 14:15:53.371243 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.371216 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-255.ec2.internal\" is forbidden: User \"system:anonymous\" 
cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 14:15:53.372840 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.372827 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 14:15:53.372878 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.372860 2579 server.go:1295] "Started kubelet" Apr 22 14:15:53.372949 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.372931 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 14:15:53.373048 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.373011 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 14:15:53.373122 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.373075 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 14:15:53.374027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.373988 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 14:15:53.374359 ip-10-0-130-255 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 14:15:53.375738 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.375727    2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 14:15:53.379920 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.379897    2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-255.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 14:15:53.380118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.380099    2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:53.380574 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.380559    2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 14:15:53.380637 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.379833    2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-255.ec2.internal.18a8b37192c6ca71 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-255.ec2.internal,UID:ip-10-0-130-255.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-255.ec2.internal,},FirstTimestamp:2026-04-22 14:15:53.372838513 +0000 UTC m=+0.348803944,LastTimestamp:2026-04-22 14:15:53.372838513 +0000 UTC m=+0.348803944,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-255.ec2.internal,}"
Apr 22 14:15:53.381580 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.381560    2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 14:15:53.381826 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.381590    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:53.381826 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.381646    2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 14:15:53.381961 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.381889    2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 14:15:53.382020 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.381979    2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 14:15:53.382020 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.381989    2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 14:15:53.383051 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.382763    2579 factory.go:55] Registering systemd factory
Apr 22 14:15:53.383051 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.382780    2579 factory.go:223] Registration of the systemd container factory successfully
Apr 22 14:15:53.383051 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.382803    2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mt79x"
Apr 22 14:15:53.383051 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.383011    2579 factory.go:153] Registering CRI-O factory
Apr 22 14:15:53.383051 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.383040    2579 factory.go:223] Registration of the crio container factory successfully
Apr 22 14:15:53.383324 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.383011    2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 14:15:53.383324 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.383144    2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 14:15:53.383324 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.383149    2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 14:15:53.383324 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.383163    2579 factory.go:103] Registering Raw factory
Apr 22 14:15:53.383324 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.383176    2579 manager.go:1196] Started watching for new ooms in manager
Apr 22 14:15:53.383594 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.383406    2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-255.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 14:15:53.384250 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.384236    2579 manager.go:319] Starting recovery of all containers
Apr 22 14:15:53.390897 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.390875    2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mt79x"
Apr 22 14:15:53.394256 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.394239    2579 manager.go:324] Recovery completed
Apr 22 14:15:53.398571 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.398559    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:53.401108 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.401091    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:53.401173 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.401120    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:53.401173 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.401131    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:53.401696 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.401681    2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 14:15:53.401696 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.401695    2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 14:15:53.401828 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.401713    2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 14:15:53.405148 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.405134    2579 policy_none.go:49] "None policy: Start"
Apr 22 14:15:53.405215 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.405154    2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 14:15:53.405215 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.405167    2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 14:15:53.437048 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.437029    2579 manager.go:341] "Starting Device Plugin manager"
Apr 22 14:15:53.437157 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.437072    2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 14:15:53.437157 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.437083    2579 server.go:85] "Starting device plugin registration server"
Apr 22 14:15:53.437356 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.437340    2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 14:15:53.437486 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.437358    2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 14:15:53.437571 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.437554    2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 14:15:53.438006 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.437644    2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 14:15:53.438006 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.437656    2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 14:15:53.438129 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.438091    2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 14:15:53.438187 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.438136    2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:53.511164 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.511125    2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 14:15:53.513499 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.512348    2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 14:15:53.513499 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.512368    2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 14:15:53.513499 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.512386    2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 14:15:53.513499 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.512393    2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 14:15:53.513499 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.512481    2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 14:15:53.515381 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.515357    2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:53.538951 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.538884    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:53.539859 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.539842    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:53.539908 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.539875    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:53.539908 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.539888    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:53.539971 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.539914    2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.548559 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.548541    2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.548639 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.548566    2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-255.ec2.internal\": node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:53.578116 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.578091    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:53.613404 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.613360    2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"]
Apr 22 14:15:53.613549 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.613468    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:53.614931 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.614914    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:53.614998 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.614947    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:53.614998 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.614956    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:53.617236 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.617223    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:53.617366 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.617352    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.617404 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.617386    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:53.618005 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.617984    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:53.618123 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.618015    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:53.618123 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.618029    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:53.618123 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.617989    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:53.618123 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.618084    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:53.618123 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.618094    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:53.620271 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.620256    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.620339 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.620280    2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:53.620929 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.620915    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:53.621005 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.620945    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:53.621005 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.620959    2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:53.637055 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.637033    2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-255.ec2.internal\" not found" node="ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.641565 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.641548    2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-255.ec2.internal\" not found" node="ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.678556 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.678526    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:53.683905 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.683885    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/16c3d2024ae6a72619e3784aacdb1436-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"16c3d2024ae6a72619e3784aacdb1436\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.683983 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.683913    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16c3d2024ae6a72619e3784aacdb1436-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"16c3d2024ae6a72619e3784aacdb1436\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.683983 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.683931    2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a7f3a71818e5c0ac42d79312a43ec6c3-config\") pod \"kube-apiserver-proxy-ip-10-0-130-255.ec2.internal\" (UID: \"a7f3a71818e5c0ac42d79312a43ec6c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.778822 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.778798    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:53.784078 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.784062    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/16c3d2024ae6a72619e3784aacdb1436-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"16c3d2024ae6a72619e3784aacdb1436\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.784145 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.784087    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16c3d2024ae6a72619e3784aacdb1436-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"16c3d2024ae6a72619e3784aacdb1436\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.784145 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.784115    2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a7f3a71818e5c0ac42d79312a43ec6c3-config\") pod \"kube-apiserver-proxy-ip-10-0-130-255.ec2.internal\" (UID: \"a7f3a71818e5c0ac42d79312a43ec6c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.784237 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.784169    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/16c3d2024ae6a72619e3784aacdb1436-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"16c3d2024ae6a72619e3784aacdb1436\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.784237 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.784184    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a7f3a71818e5c0ac42d79312a43ec6c3-config\") pod \"kube-apiserver-proxy-ip-10-0-130-255.ec2.internal\" (UID: \"a7f3a71818e5c0ac42d79312a43ec6c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.784237 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.784193    2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/16c3d2024ae6a72619e3784aacdb1436-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"16c3d2024ae6a72619e3784aacdb1436\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.879291 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.879199    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:53.938719 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.938684    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.944737 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:53.944719    2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Apr 22 14:15:53.979344 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:53.979306    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:54.079849 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:54.079813    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:54.180367 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:54.180290    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:54.280833 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:54.280778    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:54.301204 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.301167    2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 14:15:54.301350 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.301319    2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:54.380816 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.380755    2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:54.380962 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:54.380911    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:54.395015 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.394982    2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:10:53 +0000 UTC" deadline="2027-12-22 18:32:43.160758026 +0000 UTC"
Apr 22 14:15:54.395015 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.395010    2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14620h16m48.765751261s"
Apr 22 14:15:54.398732 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.398711    2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:54.424375 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.424214    2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-snw7p"
Apr 22 14:15:54.435006 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.434947    2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-snw7p"
Apr 22 14:15:54.481721 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:54.481676    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:54.582291 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:54.582263    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:54.588491 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:54.588437    2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c3d2024ae6a72619e3784aacdb1436.slice/crio-2e0ea42b37c3222b55bac45b9e06dd42111b9bb5561589a83f8eb81bc6840058 WatchSource:0}: Error finding container 2e0ea42b37c3222b55bac45b9e06dd42111b9bb5561589a83f8eb81bc6840058: Status 404 returned error can't find the container with id 2e0ea42b37c3222b55bac45b9e06dd42111b9bb5561589a83f8eb81bc6840058
Apr 22 14:15:54.589160 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:54.589139    2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f3a71818e5c0ac42d79312a43ec6c3.slice/crio-3d00d1f1933005e12119ecdee9b9283f5b00ecd5a677fac193f199b066020010 WatchSource:0}: Error finding container 3d00d1f1933005e12119ecdee9b9283f5b00ecd5a677fac193f199b066020010: Status 404 returned error can't find the container with id 3d00d1f1933005e12119ecdee9b9283f5b00ecd5a677fac193f199b066020010
Apr 22 14:15:54.593396 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.593381    2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:15:54.657119 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.657088    2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:54.682605 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:54.682561    2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Apr 22 14:15:54.783047
ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:54.783017 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found" Apr 22 14:15:54.792886 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.792866 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:54.881365 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.881336 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal" Apr 22 14:15:54.899488 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.899445 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:54.900885 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.900868 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal" Apr 22 14:15:54.906899 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.906875 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:54.911055 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:54.911032 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:55.329960 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.329929 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:55.362050 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.361947 2579 apiserver.go:52] "Watching apiserver" Apr 22 14:15:55.369063 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.369036 2579 reflector.go:430] "Caches populated" 
type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 14:15:55.370813 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.370787 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-l652r","openshift-cluster-node-tuning-operator/tuned-hh4lt","openshift-dns/node-resolver-9gzdh","openshift-image-registry/node-ca-2wcll","openshift-multus/multus-886zv","openshift-multus/multus-additional-cni-plugins-5v8nc","openshift-multus/network-metrics-daemon-9f46g","openshift-network-operator/iptables-alerter-2jw2j","kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal","openshift-network-diagnostics/network-check-target-2wdfd","openshift-ovn-kubernetes/ovnkube-node-ttkqb"] Apr 22 14:15:55.373142 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.373123 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:15:55.373228 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:55.373191 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d" Apr 22 14:15:55.375309 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.375287 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.377382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.377343 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9gzdh" Apr 22 14:15:55.377808 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.377783 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-szcqw\"" Apr 22 14:15:55.377897 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.377792 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:55.377897 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.377862 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:55.379542 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.379523 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2wcll" Apr 22 14:15:55.379998 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.379981 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 14:15:55.380098 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.380079 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 14:15:55.380156 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.380081 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sxftm\"" Apr 22 14:15:55.381681 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.381663 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-886zv" Apr 22 14:15:55.381775 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.381743 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 14:15:55.382250 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.382031 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 14:15:55.382250 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.382137 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 14:15:55.382250 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.382143 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2dvwk\"" Apr 22 14:15:55.383905 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.383879 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.384033 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.383938 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 14:15:55.384396 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.384235 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-b4c54\"" Apr 22 14:15:55.384396 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.384240 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 14:15:55.384396 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.384347 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 14:15:55.384396 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.384348 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 14:15:55.386063 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.386013 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-l652r" Apr 22 14:15:55.386201 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.386161 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5sv9s\"" Apr 22 14:15:55.386361 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.386339 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 14:15:55.386559 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.386538 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 14:15:55.388298 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.388279 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2jw2j" Apr 22 14:15:55.388506 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.388487 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 14:15:55.388731 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.388610 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 14:15:55.389159 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.389135 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8vqk8\"" Apr 22 14:15:55.391058 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391038 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:55.391164 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391148 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 
22 14:15:55.391224 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-sysconfig\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.391285 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391241 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-kubernetes\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.391285 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391270 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-lib-modules\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.391382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391299 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-host\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.391382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f7e86ec-be29-4af2-a455-86acd6f13c8f-tmp\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.391382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391365 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-var-lib-cni-bin\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.391539 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391396 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-var-lib-kubelet\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.391539 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.391539 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391477 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bfa3b731-8569-4e90-9bb7-4b1898b25baa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.391539 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391509 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:15:55.391715 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-system-cni-dir\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.391715 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391552 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7qfm6\"" Apr 22 14:15:55.391715 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547-host\") pod \"node-ca-2wcll\" (UID: \"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547\") " pod="openshift-image-registry/node-ca-2wcll" Apr 22 14:15:55.391715 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391596 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f76jg\" (UniqueName: \"kubernetes.io/projected/ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547-kube-api-access-f76jg\") pod \"node-ca-2wcll\" (UID: \"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547\") " pod="openshift-image-registry/node-ca-2wcll" Apr 22 14:15:55.391715 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391622 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-cni-dir\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.391715 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391655 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-var-lib-cni-multus\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.391715 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391685 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-etc-kubernetes\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.392016 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391716 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-var-lib-kubelet\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.392016 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391728 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 14:15:55.392016 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-hostroot\") pod \"multus-886zv\" (UID: 
\"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.392016 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391854 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-os-release\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.392016 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-modprobe-d\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.392016 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.391925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6ddc308-4c95-4675-8dae-ecfb2c08991e-hosts-file\") pod \"node-resolver-9gzdh\" (UID: \"b6ddc308-4c95-4675-8dae-ecfb2c08991e\") " pod="openshift-dns/node-resolver-9gzdh" Apr 22 14:15:55.392272 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-cni-binary-copy\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.392272 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392194 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-run-k8s-cni-cncf-io\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.392365 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392301 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-conf-dir\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.392412 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392364 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bfa3b731-8569-4e90-9bb7-4b1898b25baa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.392473 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392416 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-sysctl-d\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.392523 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392497 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-systemd\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.392573 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392548 
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b6ddc308-4c95-4675-8dae-ecfb2c08991e-tmp-dir\") pod \"node-resolver-9gzdh\" (UID: \"b6ddc308-4c95-4675-8dae-ecfb2c08991e\") " pod="openshift-dns/node-resolver-9gzdh" Apr 22 14:15:55.392654 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392631 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-socket-dir-parent\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.392749 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392691 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-run-netns\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.392749 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392720 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bhtx\" (UniqueName: \"kubernetes.io/projected/b6ddc308-4c95-4675-8dae-ecfb2c08991e-kube-api-access-5bhtx\") pod \"node-resolver-9gzdh\" (UID: \"b6ddc308-4c95-4675-8dae-ecfb2c08991e\") " pod="openshift-dns/node-resolver-9gzdh" Apr 22 14:15:55.392868 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392764 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-os-release\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.392868 
ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392805 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-daemon-config\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.392868 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392853 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-run-multus-certs\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.393010 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392887 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc88m\" (UniqueName: \"kubernetes.io/projected/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-kube-api-access-qc88m\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:15:55.393010 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392944 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-sysctl-conf\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.393111 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.392975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-run\") pod \"tuned-hh4lt\" (UID: 
\"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.393228 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393207 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-tuned\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.393298 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjcxh\" (UniqueName: \"kubernetes.io/projected/4f7e86ec-be29-4af2-a455-86acd6f13c8f-kube-api-access-jjcxh\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.393408 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393389 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547-serviceca\") pod \"node-ca-2wcll\" (UID: \"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547\") " pod="openshift-image-registry/node-ca-2wcll" Apr 22 14:15:55.393526 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393470 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-system-cni-dir\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.393601 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-cnibin\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.393601 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-cnibin\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.393826 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393656 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bfa3b731-8569-4e90-9bb7-4b1898b25baa-cni-binary-copy\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.393826 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-sys\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.393925 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393846 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:15:55.393925 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2sk6\" (UniqueName: \"kubernetes.io/projected/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-kube-api-access-f2sk6\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.393925 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393893 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.394115 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:55.393931 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a" Apr 22 14:15:55.394115 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.393896 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75sdm\" (UniqueName: \"kubernetes.io/projected/bfa3b731-8569-4e90-9bb7-4b1898b25baa-kube-api-access-75sdm\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.396697 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.396682 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 14:15:55.396778 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.396684 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9qgh5\"" Apr 22 14:15:55.396937 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.396921 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 14:15:55.397005 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.396991 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.397241 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.397125 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 14:15:55.399339 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.399321 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-llt9k\"" Apr 22 14:15:55.399498 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.399477 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 14:15:55.400494 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.400477 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 14:15:55.400494 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.400491 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 14:15:55.400672 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.400496 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 14:15:55.400672 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.400505 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 14:15:55.400672 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.400517 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 14:15:55.435686 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.435638 2579 certificate_manager.go:715] "Certificate rotation deadline determined" 
logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:54 +0000 UTC" deadline="2027-11-06 22:40:39.686920215 +0000 UTC" Apr 22 14:15:55.435769 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.435693 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13520h24m44.251233787s" Apr 22 14:15:55.483302 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.483271 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 14:15:55.494899 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.494863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-daemon-config\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.494899 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.494899 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-run-multus-certs\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.494916 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-run\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.494941 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjcxh\" (UniqueName: 
\"kubernetes.io/projected/4f7e86ec-be29-4af2-a455-86acd6f13c8f-kube-api-access-jjcxh\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.494992 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-run-multus-certs\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495026 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-run\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495042 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sg2j\" (UniqueName: \"kubernetes.io/projected/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-kube-api-access-6sg2j\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.495118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547-serviceca\") pod \"node-ca-2wcll\" (UID: \"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547\") " pod="openshift-image-registry/node-ca-2wcll" Apr 22 14:15:55.495118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-system-cni-dir\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495123 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-cnibin\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495168 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-cnibin\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.495428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495228 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bfa3b731-8569-4e90-9bb7-4b1898b25baa-cni-binary-copy\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.495428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-system-cni-dir\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-lib-modules\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495279 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-cnibin\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-host\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495293 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-cnibin\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.495428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-host\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495411 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-lib-modules\") pod 
\"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495462 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f7e86ec-be29-4af2-a455-86acd6f13c8f-tmp\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495491 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2sk6\" (UniqueName: \"kubernetes.io/projected/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-kube-api-access-f2sk6\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495517 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75sdm\" (UniqueName: \"kubernetes.io/projected/bfa3b731-8569-4e90-9bb7-4b1898b25baa-kube-api-access-75sdm\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495542 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-sysconfig\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547-serviceca\") pod \"node-ca-2wcll\" (UID: \"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547\") " pod="openshift-image-registry/node-ca-2wcll" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-daemon-config\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-cni-netd\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-sys-fs\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495682 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbw8j\" (UniqueName: \"kubernetes.io/projected/746fce17-4f0b-447d-b5e3-8a8e878c5498-kube-api-access-sbw8j\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495687 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-sysconfig\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-var-lib-kubelet\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495740 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495764 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-var-lib-openvswitch\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495788 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-kubelet-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 
14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-var-lib-kubelet\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.495838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-kubelet\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495847 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bfa3b731-8569-4e90-9bb7-4b1898b25baa-cni-binary-copy\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:55.495877 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f76jg\" (UniqueName: \"kubernetes.io/projected/ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547-kube-api-access-f76jg\") pod \"node-ca-2wcll\" (UID: \"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547\") " pod="openshift-image-registry/node-ca-2wcll" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495940 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-cni-dir\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495928 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.495969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-etc-kubernetes\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:55.496028 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs podName:11aa4ec9-8bc6-4d4d-b02e-6441451ad30d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:55.995983012 +0000 UTC m=+2.971948438 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs") pod "network-metrics-daemon-9f46g" (UID: "11aa4ec9-8bc6-4d4d-b02e-6441451ad30d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496030 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-etc-kubernetes\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-cni-bin\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496172 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f9ffd782-8eeb-457e-afbf-6985dab9b00d-agent-certs\") pod \"konnectivity-agent-l652r\" (UID: \"f9ffd782-8eeb-457e-afbf-6985dab9b00d\") " pod="kube-system/konnectivity-agent-l652r" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-cni-dir\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496248 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-registration-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.496521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496397 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-hostroot\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-hostroot\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496764 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-os-release\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496792 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-modprobe-d\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496839 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a3b551c-cc42-4cf1-9c6a-1f8e29af3205-host-slash\") pod \"iptables-alerter-2jw2j\" (UID: \"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205\") " pod="openshift-network-operator/iptables-alerter-2jw2j" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496861 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-run-systemd\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-node-log\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-run-k8s-cni-cncf-io\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496909 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-os-release\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 
14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-conf-dir\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496942 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-modprobe-d\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496955 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-sysctl-d\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.497026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.496980 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-systemd\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-conf-dir\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv" Apr 22 14:15:55.497593 ip-10-0-130-255 
kubenswrapper[2579]: I0422 14:15:55.497049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-systemd\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497084 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-run-openvswitch\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497090 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-run-k8s-cni-cncf-io\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497111 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-sysctl-d\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497122 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497233 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-ovn-node-metrics-cert\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-run-netns\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497295 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bhtx\" (UniqueName: \"kubernetes.io/projected/b6ddc308-4c95-4675-8dae-ecfb2c08991e-kube-api-access-5bhtx\") pod \"node-resolver-9gzdh\" (UID: \"b6ddc308-4c95-4675-8dae-ecfb2c08991e\") " pod="openshift-dns/node-resolver-9gzdh"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497302 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-run-netns\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-systemd-units\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497384 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-os-release\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497414 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc88m\" (UniqueName: \"kubernetes.io/projected/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-kube-api-access-qc88m\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-sysctl-conf\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-tuned\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-os-release\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.497593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn4wx\" (UniqueName: \"kubernetes.io/projected/7a3b551c-cc42-4cf1-9c6a-1f8e29af3205-kube-api-access-fn4wx\") pod \"iptables-alerter-2jw2j\" (UID: \"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205\") " pod="openshift-network-operator/iptables-alerter-2jw2j"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497575 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-sys\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497637 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-sys\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497663 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-log-socket\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497700 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-socket-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497726 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-device-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497671 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-sysctl-conf\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-kubernetes\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497809 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-slash\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-kubernetes\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497835 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-ovnkube-config\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497869 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffd782-8eeb-457e-afbf-6985dab9b00d-konnectivity-ca\") pod \"konnectivity-agent-l652r\" (UID: \"f9ffd782-8eeb-457e-afbf-6985dab9b00d\") " pod="kube-system/konnectivity-agent-l652r"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497897 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-var-lib-cni-bin\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497923 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bfa3b731-8569-4e90-9bb7-4b1898b25baa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497979 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-run-ovn-kubernetes\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498018 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-env-overrides\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.498355 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498044 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-system-cni-dir\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.497968 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-var-lib-cni-bin\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498071 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547-host\") pod \"node-ca-2wcll\" (UID: \"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547\") " pod="openshift-image-registry/node-ca-2wcll"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498116 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547-host\") pod \"node-ca-2wcll\" (UID: \"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547\") " pod="openshift-image-registry/node-ca-2wcll"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-var-lib-cni-multus\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498153 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-var-lib-kubelet\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bfa3b731-8569-4e90-9bb7-4b1898b25baa-system-cni-dir\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-host-var-lib-cni-multus\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498188 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7a3b551c-cc42-4cf1-9c6a-1f8e29af3205-iptables-alerter-script\") pod \"iptables-alerter-2jw2j\" (UID: \"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205\") " pod="openshift-network-operator/iptables-alerter-2jw2j"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498224 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f7e86ec-be29-4af2-a455-86acd6f13c8f-var-lib-kubelet\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498227 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-etc-openvswitch\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498264 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-run-ovn\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6ddc308-4c95-4675-8dae-ecfb2c08991e-hosts-file\") pod \"node-resolver-9gzdh\" (UID: \"b6ddc308-4c95-4675-8dae-ecfb2c08991e\") " pod="openshift-dns/node-resolver-9gzdh"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498347 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-run-netns\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498371 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-etc-selinux\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498397 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-cni-binary-copy\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.499132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498419 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6ddc308-4c95-4675-8dae-ecfb2c08991e-hosts-file\") pod \"node-resolver-9gzdh\" (UID: \"b6ddc308-4c95-4675-8dae-ecfb2c08991e\") " pod="openshift-dns/node-resolver-9gzdh"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bfa3b731-8569-4e90-9bb7-4b1898b25baa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498493 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b6ddc308-4c95-4675-8dae-ecfb2c08991e-tmp-dir\") pod \"node-resolver-9gzdh\" (UID: \"b6ddc308-4c95-4675-8dae-ecfb2c08991e\") " pod="openshift-dns/node-resolver-9gzdh"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498522 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtrc\" (UniqueName: \"kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc\") pod \"network-check-target-2wdfd\" (UID: \"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a\") " pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498556 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-ovnkube-script-lib\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-socket-dir-parent\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498625 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bfa3b731-8569-4e90-9bb7-4b1898b25baa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-multus-socket-dir-parent\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498957 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-cni-binary-copy\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.498992 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b6ddc308-4c95-4675-8dae-ecfb2c08991e-tmp-dir\") pod \"node-resolver-9gzdh\" (UID: \"b6ddc308-4c95-4675-8dae-ecfb2c08991e\") " pod="openshift-dns/node-resolver-9gzdh"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.499052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bfa3b731-8569-4e90-9bb7-4b1898b25baa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc"
Apr 22 14:15:55.499803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.499687 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f7e86ec-be29-4af2-a455-86acd6f13c8f-tmp\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.500500 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.500482 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4f7e86ec-be29-4af2-a455-86acd6f13c8f-etc-tuned\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.504606 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.504495 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjcxh\" (UniqueName: \"kubernetes.io/projected/4f7e86ec-be29-4af2-a455-86acd6f13c8f-kube-api-access-jjcxh\") pod \"tuned-hh4lt\" (UID: \"4f7e86ec-be29-4af2-a455-86acd6f13c8f\") " pod="openshift-cluster-node-tuning-operator/tuned-hh4lt"
Apr 22 14:15:55.506708 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.506686 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2sk6\" (UniqueName: \"kubernetes.io/projected/b22e95d1-d9a7-4282-b3bd-1e77a23c5db0-kube-api-access-f2sk6\") pod \"multus-886zv\" (UID: \"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0\") " pod="openshift-multus/multus-886zv"
Apr 22 14:15:55.506708 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.506690 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75sdm\" (UniqueName: \"kubernetes.io/projected/bfa3b731-8569-4e90-9bb7-4b1898b25baa-kube-api-access-75sdm\") pod \"multus-additional-cni-plugins-5v8nc\" (UID: \"bfa3b731-8569-4e90-9bb7-4b1898b25baa\") " pod="openshift-multus/multus-additional-cni-plugins-5v8nc"
Apr 22 14:15:55.507993 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.507971 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f76jg\" (UniqueName: \"kubernetes.io/projected/ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547-kube-api-access-f76jg\") pod \"node-ca-2wcll\" (UID: \"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547\") " pod="openshift-image-registry/node-ca-2wcll"
Apr 22 14:15:55.508375 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.508348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bhtx\" (UniqueName: \"kubernetes.io/projected/b6ddc308-4c95-4675-8dae-ecfb2c08991e-kube-api-access-5bhtx\") pod \"node-resolver-9gzdh\" (UID: \"b6ddc308-4c95-4675-8dae-ecfb2c08991e\") " pod="openshift-dns/node-resolver-9gzdh"
Apr 22 14:15:55.508442 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.508401 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc88m\" (UniqueName: \"kubernetes.io/projected/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-kube-api-access-qc88m\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:15:55.517763 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.517715 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal" event={"ID":"16c3d2024ae6a72619e3784aacdb1436","Type":"ContainerStarted","Data":"2e0ea42b37c3222b55bac45b9e06dd42111b9bb5561589a83f8eb81bc6840058"}
Apr 22 14:15:55.518816 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.518795 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal" event={"ID":"a7f3a71818e5c0ac42d79312a43ec6c3","Type":"ContainerStarted","Data":"3d00d1f1933005e12119ecdee9b9283f5b00ecd5a677fac193f199b066020010"}
Apr 22 14:15:55.599508 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sg2j\" (UniqueName: \"kubernetes.io/projected/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-kube-api-access-6sg2j\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.599508 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599476 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-cni-netd\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.599508 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-sys-fs\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599522 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbw8j\" (UniqueName: \"kubernetes.io/projected/746fce17-4f0b-447d-b5e3-8a8e878c5498-kube-api-access-sbw8j\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-var-lib-openvswitch\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-cni-netd\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599627 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-sys-fs\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-kubelet-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599666 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-kubelet-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599638 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-var-lib-openvswitch\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-kubelet\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-cni-bin\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599738 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f9ffd782-8eeb-457e-afbf-6985dab9b00d-agent-certs\") pod \"konnectivity-agent-l652r\" (UID: \"f9ffd782-8eeb-457e-afbf-6985dab9b00d\") " pod="kube-system/konnectivity-agent-l652r"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599740 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-kubelet\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.599758 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-registration-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-cni-bin\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599790 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a3b551c-cc42-4cf1-9c6a-1f8e29af3205-host-slash\") pod \"iptables-alerter-2jw2j\" (UID: \"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205\") " pod="openshift-network-operator/iptables-alerter-2jw2j"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599824 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a3b551c-cc42-4cf1-9c6a-1f8e29af3205-host-slash\") pod \"iptables-alerter-2jw2j\" (UID: \"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205\") " pod="openshift-network-operator/iptables-alerter-2jw2j"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599830 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-run-systemd\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599865 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-run-systemd\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599866 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-node-log\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599893 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-node-log\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599889 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-registration-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599908 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-run-openvswitch\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599939 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-run-openvswitch\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599961 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-ovn-node-metrics-cert\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599980 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.599990 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-systemd-units\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600019 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fn4wx\" (UniqueName: \"kubernetes.io/projected/7a3b551c-cc42-4cf1-9c6a-1f8e29af3205-kube-api-access-fn4wx\") pod \"iptables-alerter-2jw2j\" (UID: \"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205\") " pod="openshift-network-operator/iptables-alerter-2jw2j"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600087 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-systemd-units\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.600251 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-log-socket\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:15:55.601027
ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-socket-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-device-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-slash\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600189 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-log-socket\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600259 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-ovnkube-config\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-device-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffd782-8eeb-457e-afbf-6985dab9b00d-konnectivity-ca\") pod \"konnectivity-agent-l652r\" (UID: \"f9ffd782-8eeb-457e-afbf-6985dab9b00d\") " pod="kube-system/konnectivity-agent-l652r" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-slash\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-socket-dir\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-run-ovn-kubernetes\") pod \"ovnkube-node-ttkqb\" (UID: 
\"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600366 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-run-ovn-kubernetes\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600381 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-env-overrides\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600426 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7a3b551c-cc42-4cf1-9c6a-1f8e29af3205-iptables-alerter-script\") pod \"iptables-alerter-2jw2j\" (UID: \"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205\") " pod="openshift-network-operator/iptables-alerter-2jw2j" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-etc-openvswitch\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-run-ovn\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-run-netns\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-etc-selinux\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtrc\" (UniqueName: \"kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc\") pod \"network-check-target-2wdfd\" (UID: \"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a\") " pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-ovnkube-script-lib\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600889 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-ovnkube-config\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600913 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffd782-8eeb-457e-afbf-6985dab9b00d-konnectivity-ca\") pod \"konnectivity-agent-l652r\" (UID: \"f9ffd782-8eeb-457e-afbf-6985dab9b00d\") " pod="kube-system/konnectivity-agent-l652r" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600954 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/746fce17-4f0b-447d-b5e3-8a8e878c5498-etc-selinux\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.600997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-host-run-netns\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.601183 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-etc-openvswitch\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.601235 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-run-ovn\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.601276 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-env-overrides\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.601356 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-ovnkube-script-lib\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.601722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.601487 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7a3b551c-cc42-4cf1-9c6a-1f8e29af3205-iptables-alerter-script\") pod \"iptables-alerter-2jw2j\" (UID: \"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205\") " pod="openshift-network-operator/iptables-alerter-2jw2j" Apr 22 14:15:55.602564 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.602543 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-ovn-node-metrics-cert\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.602922 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.602903 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f9ffd782-8eeb-457e-afbf-6985dab9b00d-agent-certs\") pod \"konnectivity-agent-l652r\" (UID: \"f9ffd782-8eeb-457e-afbf-6985dab9b00d\") " pod="kube-system/konnectivity-agent-l652r" Apr 22 14:15:55.606201 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:55.606180 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:55.606295 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:55.606206 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:55.606295 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:55.606220 2579 projected.go:194] Error preparing data for projected volume kube-api-access-zrtrc for pod openshift-network-diagnostics/network-check-target-2wdfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:55.606399 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:55.606312 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc podName:ace85c9c-a18e-408d-b8d6-d5d2d99bc11a nodeName:}" failed. No retries permitted until 2026-04-22 14:15:56.106293432 +0000 UTC m=+3.082258850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zrtrc" (UniqueName: "kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc") pod "network-check-target-2wdfd" (UID: "ace85c9c-a18e-408d-b8d6-d5d2d99bc11a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:55.608744 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.608717 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn4wx\" (UniqueName: \"kubernetes.io/projected/7a3b551c-cc42-4cf1-9c6a-1f8e29af3205-kube-api-access-fn4wx\") pod \"iptables-alerter-2jw2j\" (UID: \"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205\") " pod="openshift-network-operator/iptables-alerter-2jw2j" Apr 22 14:15:55.609027 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.609007 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbw8j\" (UniqueName: \"kubernetes.io/projected/746fce17-4f0b-447d-b5e3-8a8e878c5498-kube-api-access-sbw8j\") pod \"aws-ebs-csi-driver-node-97qzw\" (UID: \"746fce17-4f0b-447d-b5e3-8a8e878c5498\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.609122 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.609012 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sg2j\" (UniqueName: \"kubernetes.io/projected/6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f-kube-api-access-6sg2j\") pod \"ovnkube-node-ttkqb\" (UID: \"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:55.687073 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.687036 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" Apr 22 14:15:55.693851 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.693828 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9gzdh" Apr 22 14:15:55.703854 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.703826 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2wcll" Apr 22 14:15:55.708601 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.708579 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-886zv" Apr 22 14:15:55.714222 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.714201 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" Apr 22 14:15:55.720902 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.720880 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l652r" Apr 22 14:15:55.727398 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.727368 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2jw2j" Apr 22 14:15:55.739076 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.739052 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" Apr 22 14:15:55.743825 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:55.743804 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:15:56.004040 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.004006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:15:56.004214 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:56.004136 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:56.004278 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:56.004210 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs podName:11aa4ec9-8bc6-4d4d-b02e-6441451ad30d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:57.004187682 +0000 UTC m=+3.980153105 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs") pod "network-metrics-daemon-9f46g" (UID: "11aa4ec9-8bc6-4d4d-b02e-6441451ad30d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:56.204738 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.204709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtrc\" (UniqueName: \"kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc\") pod \"network-check-target-2wdfd\" (UID: \"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a\") " pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:15:56.204898 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:56.204872 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:56.204898 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:56.204888 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:56.204898 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:56.204897 2579 projected.go:194] Error preparing data for projected volume kube-api-access-zrtrc for pod openshift-network-diagnostics/network-check-target-2wdfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:56.205015 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:56.204946 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc podName:ace85c9c-a18e-408d-b8d6-d5d2d99bc11a nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:57.204932895 +0000 UTC m=+4.180898315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zrtrc" (UniqueName: "kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc") pod "network-check-target-2wdfd" (UID: "ace85c9c-a18e-408d-b8d6-d5d2d99bc11a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:56.221815 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:56.221794 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb22e95d1_d9a7_4282_b3bd_1e77a23c5db0.slice/crio-3431d281c339adefa84b57d09aac49bda2b2c94e59c73c6c09d4a9df602a48aa WatchSource:0}: Error finding container 3431d281c339adefa84b57d09aac49bda2b2c94e59c73c6c09d4a9df602a48aa: Status 404 returned error can't find the container with id 3431d281c339adefa84b57d09aac49bda2b2c94e59c73c6c09d4a9df602a48aa Apr 22 14:15:56.222750 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:56.222729 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ffd782_8eeb_457e_afbf_6985dab9b00d.slice/crio-04ba4ac801aad5e0d78ee80638bb4545180c2fca6ee473ff6a0464129a6307e4 WatchSource:0}: Error finding container 04ba4ac801aad5e0d78ee80638bb4545180c2fca6ee473ff6a0464129a6307e4: Status 404 returned error can't find the container with id 04ba4ac801aad5e0d78ee80638bb4545180c2fca6ee473ff6a0464129a6307e4 Apr 22 14:15:56.224172 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:56.224157 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod746fce17_4f0b_447d_b5e3_8a8e878c5498.slice/crio-295226b4300bf072bbb72fbe6339b437b92d237b6e7267c8fe4390f1425ea3c4 WatchSource:0}: Error finding container 
295226b4300bf072bbb72fbe6339b437b92d237b6e7267c8fe4390f1425ea3c4: Status 404 returned error can't find the container with id 295226b4300bf072bbb72fbe6339b437b92d237b6e7267c8fe4390f1425ea3c4 Apr 22 14:15:56.225110 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:56.224983 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f7e86ec_be29_4af2_a455_86acd6f13c8f.slice/crio-5769af81ce52c46087e0f442f8686e10deab506e86e65097aa873b18319c584e WatchSource:0}: Error finding container 5769af81ce52c46087e0f442f8686e10deab506e86e65097aa873b18319c584e: Status 404 returned error can't find the container with id 5769af81ce52c46087e0f442f8686e10deab506e86e65097aa873b18319c584e Apr 22 14:15:56.227809 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:56.227782 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa3b731_8569_4e90_9bb7_4b1898b25baa.slice/crio-abedcafa7867c65f266ca12d8c1919cc1839a09d0b62337e0a73bb886b0627f7 WatchSource:0}: Error finding container abedcafa7867c65f266ca12d8c1919cc1839a09d0b62337e0a73bb886b0627f7: Status 404 returned error can't find the container with id abedcafa7867c65f266ca12d8c1919cc1839a09d0b62337e0a73bb886b0627f7 Apr 22 14:15:56.228903 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:56.228875 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ddc308_4c95_4675_8dae_ecfb2c08991e.slice/crio-80a86202582539cb700e224342a9647e1bb79f1e2282468e24c1479ea362d9cf WatchSource:0}: Error finding container 80a86202582539cb700e224342a9647e1bb79f1e2282468e24c1479ea362d9cf: Status 404 returned error can't find the container with id 80a86202582539cb700e224342a9647e1bb79f1e2282468e24c1479ea362d9cf Apr 22 14:15:56.229732 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:56.229709 2579 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac56b2b8_1a22_40f8_a9b6_5a85a4f1f547.slice/crio-e6945b592aea0e86f3526764f94e2373c5fb25994749ab95e3bfb2b82027c634 WatchSource:0}: Error finding container e6945b592aea0e86f3526764f94e2373c5fb25994749ab95e3bfb2b82027c634: Status 404 returned error can't find the container with id e6945b592aea0e86f3526764f94e2373c5fb25994749ab95e3bfb2b82027c634 Apr 22 14:15:56.230487 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:56.230147 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3b551c_cc42_4cf1_9c6a_1f8e29af3205.slice/crio-49e5cd4df5e35deb163e3316b903329813a3f46ee922e1488028a39fad138e27 WatchSource:0}: Error finding container 49e5cd4df5e35deb163e3316b903329813a3f46ee922e1488028a39fad138e27: Status 404 returned error can't find the container with id 49e5cd4df5e35deb163e3316b903329813a3f46ee922e1488028a39fad138e27 Apr 22 14:15:56.231935 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:15:56.231912 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a4ee0b1_bb53_4f98_96c3_633c3ae1dd9f.slice/crio-d8a24782137cebdbcd31463cc75e2bd722f4414db22a6ff84b110339ad475fdf WatchSource:0}: Error finding container d8a24782137cebdbcd31463cc75e2bd722f4414db22a6ff84b110339ad475fdf: Status 404 returned error can't find the container with id d8a24782137cebdbcd31463cc75e2bd722f4414db22a6ff84b110339ad475fdf Apr 22 14:15:56.436232 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.436189 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:54 +0000 UTC" deadline="2027-12-26 00:50:22.249185497 +0000 UTC" Apr 22 14:15:56.436232 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.436229 2579 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14698h34m25.812960034s" Apr 22 14:15:56.513190 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.513119 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:15:56.513319 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:56.513214 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a" Apr 22 14:15:56.522385 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.522348 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal" event={"ID":"a7f3a71818e5c0ac42d79312a43ec6c3","Type":"ContainerStarted","Data":"e418e2f295335717953edafc2e15f4fbb9452c5c639c2db6cc3e66838bdc8a69"} Apr 22 14:15:56.523959 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.523931 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerStarted","Data":"d8a24782137cebdbcd31463cc75e2bd722f4414db22a6ff84b110339ad475fdf"} Apr 22 14:15:56.524924 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.524894 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2wcll" event={"ID":"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547","Type":"ContainerStarted","Data":"e6945b592aea0e86f3526764f94e2373c5fb25994749ab95e3bfb2b82027c634"} Apr 22 14:15:56.526432 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.526397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2jw2j" 
event={"ID":"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205","Type":"ContainerStarted","Data":"49e5cd4df5e35deb163e3316b903329813a3f46ee922e1488028a39fad138e27"}
Apr 22 14:15:56.527423 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.527401 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9gzdh" event={"ID":"b6ddc308-4c95-4675-8dae-ecfb2c08991e","Type":"ContainerStarted","Data":"80a86202582539cb700e224342a9647e1bb79f1e2282468e24c1479ea362d9cf"}
Apr 22 14:15:56.528512 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.528488 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" event={"ID":"bfa3b731-8569-4e90-9bb7-4b1898b25baa","Type":"ContainerStarted","Data":"abedcafa7867c65f266ca12d8c1919cc1839a09d0b62337e0a73bb886b0627f7"}
Apr 22 14:15:56.529670 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.529646 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l652r" event={"ID":"f9ffd782-8eeb-457e-afbf-6985dab9b00d","Type":"ContainerStarted","Data":"04ba4ac801aad5e0d78ee80638bb4545180c2fca6ee473ff6a0464129a6307e4"}
Apr 22 14:15:56.532701 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.532673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" event={"ID":"4f7e86ec-be29-4af2-a455-86acd6f13c8f","Type":"ContainerStarted","Data":"5769af81ce52c46087e0f442f8686e10deab506e86e65097aa873b18319c584e"}
Apr 22 14:15:56.535091 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.535066 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" event={"ID":"746fce17-4f0b-447d-b5e3-8a8e878c5498","Type":"ContainerStarted","Data":"295226b4300bf072bbb72fbe6339b437b92d237b6e7267c8fe4390f1425ea3c4"}
Apr 22 14:15:56.535934 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.535874 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal" podStartSLOduration=2.535858526 podStartE2EDuration="2.535858526s" podCreationTimestamp="2026-04-22 14:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:56.535838716 +0000 UTC m=+3.511804157" watchObservedRunningTime="2026-04-22 14:15:56.535858526 +0000 UTC m=+3.511823969"
Apr 22 14:15:56.536300 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:56.536277 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-886zv" event={"ID":"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0","Type":"ContainerStarted","Data":"3431d281c339adefa84b57d09aac49bda2b2c94e59c73c6c09d4a9df602a48aa"}
Apr 22 14:15:57.011879 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:57.011837 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:15:57.012057 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:57.012001 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:57.012122 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:57.012064 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs podName:11aa4ec9-8bc6-4d4d-b02e-6441451ad30d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.012045905 +0000 UTC m=+5.988011331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs") pod "network-metrics-daemon-9f46g" (UID: "11aa4ec9-8bc6-4d4d-b02e-6441451ad30d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:57.213621 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:57.213590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtrc\" (UniqueName: \"kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc\") pod \"network-check-target-2wdfd\" (UID: \"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a\") " pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:15:57.213823 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:57.213803 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:57.213886 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:57.213830 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:57.213886 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:57.213843 2579 projected.go:194] Error preparing data for projected volume kube-api-access-zrtrc for pod openshift-network-diagnostics/network-check-target-2wdfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:57.213990 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:57.213900 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc podName:ace85c9c-a18e-408d-b8d6-d5d2d99bc11a nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.213881338 +0000 UTC m=+6.189846766 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zrtrc" (UniqueName: "kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc") pod "network-check-target-2wdfd" (UID: "ace85c9c-a18e-408d-b8d6-d5d2d99bc11a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:57.513299 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:57.513263 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:15:57.513769 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:57.513411 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d"
Apr 22 14:15:57.557335 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:57.557298 2579 generic.go:358] "Generic (PLEG): container finished" podID="16c3d2024ae6a72619e3784aacdb1436" containerID="b15da36cf7b0da9f357b8acd14b69772d840ba79f1e8f6336db83e8bdf0a9398" exitCode=0
Apr 22 14:15:57.558399 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:57.558369 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal" event={"ID":"16c3d2024ae6a72619e3784aacdb1436","Type":"ContainerDied","Data":"b15da36cf7b0da9f357b8acd14b69772d840ba79f1e8f6336db83e8bdf0a9398"}
Apr 22 14:15:58.512996 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:58.512959 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:15:58.513181 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:58.513090 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a"
Apr 22 14:15:58.585676 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:58.585632 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal" event={"ID":"16c3d2024ae6a72619e3784aacdb1436","Type":"ContainerStarted","Data":"da53d95fb789d9d42afcb8da6f6d6fe918bac6acbd683be6c6c5a96a8285fe2f"}
Apr 22 14:15:59.027095 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:59.027055 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:15:59.027262 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:59.027199 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:59.027340 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:59.027263 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs podName:11aa4ec9-8bc6-4d4d-b02e-6441451ad30d nodeName:}" failed. No retries permitted until 2026-04-22 14:16:03.027241457 +0000 UTC m=+10.003206888 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs") pod "network-metrics-daemon-9f46g" (UID: "11aa4ec9-8bc6-4d4d-b02e-6441451ad30d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:59.229526 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:59.229258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtrc\" (UniqueName: \"kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc\") pod \"network-check-target-2wdfd\" (UID: \"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a\") " pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:15:59.229526 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:59.229467 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:59.229526 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:59.229487 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:59.229526 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:59.229501 2579 projected.go:194] Error preparing data for projected volume kube-api-access-zrtrc for pod openshift-network-diagnostics/network-check-target-2wdfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:59.229855 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:59.229560 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc podName:ace85c9c-a18e-408d-b8d6-d5d2d99bc11a nodeName:}" failed. No retries permitted until 2026-04-22 14:16:03.229542615 +0000 UTC m=+10.205508035 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zrtrc" (UniqueName: "kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc") pod "network-check-target-2wdfd" (UID: "ace85c9c-a18e-408d-b8d6-d5d2d99bc11a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:59.513283 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:15:59.513248 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:15:59.513486 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:15:59.513399 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d"
Apr 22 14:16:00.513172 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:00.513087 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:00.513643 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:00.513215 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a"
Apr 22 14:16:01.512774 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:01.512734 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:16:01.512935 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:01.512884 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d"
Apr 22 14:16:02.513544 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:02.513513 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:02.514028 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:02.513631 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a"
Apr 22 14:16:03.064592 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:03.064544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:16:03.064795 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:03.064705 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:16:03.064795 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:03.064767 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs podName:11aa4ec9-8bc6-4d4d-b02e-6441451ad30d nodeName:}" failed. No retries permitted until 2026-04-22 14:16:11.064749603 +0000 UTC m=+18.040715029 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs") pod "network-metrics-daemon-9f46g" (UID: "11aa4ec9-8bc6-4d4d-b02e-6441451ad30d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:16:03.266009 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:03.265967 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtrc\" (UniqueName: \"kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc\") pod \"network-check-target-2wdfd\" (UID: \"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a\") " pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:03.266207 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:03.266163 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:16:03.266207 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:03.266194 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:16:03.266207 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:03.266207 2579 projected.go:194] Error preparing data for projected volume kube-api-access-zrtrc for pod openshift-network-diagnostics/network-check-target-2wdfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:16:03.266354 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:03.266274 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc podName:ace85c9c-a18e-408d-b8d6-d5d2d99bc11a nodeName:}" failed. No retries permitted until 2026-04-22 14:16:11.26625504 +0000 UTC m=+18.242220468 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zrtrc" (UniqueName: "kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc") pod "network-check-target-2wdfd" (UID: "ace85c9c-a18e-408d-b8d6-d5d2d99bc11a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:16:03.514986 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:03.514446 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:16:03.514986 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:03.514593 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d"
Apr 22 14:16:04.512830 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:04.512777 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:04.513017 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:04.512911 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a"
Apr 22 14:16:05.513429 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:05.513391 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:16:05.513957 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:05.513531 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d"
Apr 22 14:16:06.513380 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:06.513348 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:06.513551 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:06.513469 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a"
Apr 22 14:16:07.513030 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:07.512995 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:16:07.513240 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:07.513151 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d"
Apr 22 14:16:08.513120 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:08.512997 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:08.513120 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:08.513098 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a"
Apr 22 14:16:09.513425 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:09.513373 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:16:09.513987 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:09.513533 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d"
Apr 22 14:16:10.513174 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:10.513133 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:10.513368 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:10.513259 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a"
Apr 22 14:16:11.118126 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:11.118083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:16:11.118564 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:11.118216 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:16:11.118564 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:11.118277 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs podName:11aa4ec9-8bc6-4d4d-b02e-6441451ad30d nodeName:}" failed. No retries permitted until 2026-04-22 14:16:27.118262185 +0000 UTC m=+34.094227603 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs") pod "network-metrics-daemon-9f46g" (UID: "11aa4ec9-8bc6-4d4d-b02e-6441451ad30d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:16:11.319354 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:11.319319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtrc\" (UniqueName: \"kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc\") pod \"network-check-target-2wdfd\" (UID: \"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a\") " pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:11.319522 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:11.319495 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:16:11.319522 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:11.319520 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:16:11.319594 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:11.319530 2579 projected.go:194] Error preparing data for projected volume kube-api-access-zrtrc for pod openshift-network-diagnostics/network-check-target-2wdfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:16:11.319594 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:11.319589 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc podName:ace85c9c-a18e-408d-b8d6-d5d2d99bc11a nodeName:}" failed. No retries permitted until 2026-04-22 14:16:27.31957334 +0000 UTC m=+34.295538759 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zrtrc" (UniqueName: "kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc") pod "network-check-target-2wdfd" (UID: "ace85c9c-a18e-408d-b8d6-d5d2d99bc11a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:16:11.512725 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:11.512674 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:16:11.512897 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:11.512829 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d"
Apr 22 14:16:12.512875 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:12.512836 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:12.513300 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:12.512975 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a"
Apr 22 14:16:13.514540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.514286 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g"
Apr 22 14:16:13.515235 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:13.514654 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d"
Apr 22 14:16:13.611174 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.611141 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2wcll" event={"ID":"ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547","Type":"ContainerStarted","Data":"404fe1a23aeb03d39ea7a0eaeb3f09869344fd60b8bb80fcfcbb9f6521e8a760"}
Apr 22 14:16:13.615885 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.615851 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" event={"ID":"bfa3b731-8569-4e90-9bb7-4b1898b25baa","Type":"ContainerStarted","Data":"5bd957d050e8a4af6f931f42b45eb1dbdb5b73bfb5363e63f6ed9f378a5569f3"}
Apr 22 14:16:13.617372 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.617345 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l652r" event={"ID":"f9ffd782-8eeb-457e-afbf-6985dab9b00d","Type":"ContainerStarted","Data":"b39f2a5e94bf13e9afb73f2bededb44d2044a94651631dcd11fde49a237c0ddf"}
Apr 22 14:16:13.620248 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.620159 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" event={"ID":"4f7e86ec-be29-4af2-a455-86acd6f13c8f","Type":"ContainerStarted","Data":"26fac13b90850aa566f5dcb784d82bf2a8e434c4a2b245417166a536a95afb19"}
Apr 22 14:16:13.621904 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.621881 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" event={"ID":"746fce17-4f0b-447d-b5e3-8a8e878c5498","Type":"ContainerStarted","Data":"7742101f05eb8b3ceb7bfdf172e6a7f1dfced438e027a8afead52b21e5252605"}
Apr 22 14:16:13.623353 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.623329 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-886zv" event={"ID":"b22e95d1-d9a7-4282-b3bd-1e77a23c5db0","Type":"ContainerStarted","Data":"75e002875f0c8c2131f27efd10b6e6927fc3b0c036b4129e1136a623913fa780"}
Apr 22 14:16:13.625019 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.624995 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerStarted","Data":"96b8ef76138187ed2b6060822d2a08879045c1f96eec251f303afc34b3e28bac"}
Apr 22 14:16:13.628907 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.628876 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal" podStartSLOduration=19.628863818 podStartE2EDuration="19.628863818s" podCreationTimestamp="2026-04-22 14:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:58.603862296 +0000 UTC m=+5.579827737" watchObservedRunningTime="2026-04-22 14:16:13.628863818 +0000 UTC m=+20.604829261"
Apr 22 14:16:13.629282 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.629255 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2wcll" podStartSLOduration=3.766058194 podStartE2EDuration="20.62924598s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.231238933 +0000 UTC m=+3.207204351" lastFinishedPulling="2026-04-22 14:16:13.094426714 +0000 UTC m=+20.070392137" observedRunningTime="2026-04-22 14:16:13.628679703 +0000 UTC m=+20.604645146" watchObservedRunningTime="2026-04-22 14:16:13.62924598 +0000 UTC m=+20.605211422"
Apr 22 14:16:13.647487 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.647435 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hh4lt" podStartSLOduration=3.752712773 podStartE2EDuration="20.647422048s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.226864258 +0000 UTC m=+3.202829684" lastFinishedPulling="2026-04-22 14:16:13.12157354 +0000 UTC m=+20.097538959" observedRunningTime="2026-04-22 14:16:13.646983873 +0000 UTC m=+20.622949324" watchObservedRunningTime="2026-04-22 14:16:13.647422048 +0000 UTC m=+20.623387487"
Apr 22 14:16:13.688605 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.688389 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-886zv" podStartSLOduration=3.603074548 podStartE2EDuration="20.688370606s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.22353069 +0000 UTC m=+3.199496111" lastFinishedPulling="2026-04-22 14:16:13.30882674 +0000 UTC m=+20.284792169" observedRunningTime="2026-04-22 14:16:13.688172871 +0000 UTC m=+20.664138311" watchObservedRunningTime="2026-04-22 14:16:13.688370606 +0000 UTC m=+20.664336048"
Apr 22 14:16:13.706304 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:13.706262 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l652r" podStartSLOduration=3.809297671 podStartE2EDuration="20.706248959s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.224620949 +0000 UTC m=+3.200586369" lastFinishedPulling="2026-04-22 14:16:13.121572235 +0000 UTC m=+20.097537657" observedRunningTime="2026-04-22 14:16:13.706024966 +0000 UTC m=+20.681990406" watchObservedRunningTime="2026-04-22 14:16:13.706248959 +0000 UTC m=+20.682214398"
Apr 22 14:16:14.513295 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.513115 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:14.513487 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:14.513381 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a"
Apr 22 14:16:14.628490 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.628407 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 14:16:14.628888 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.628709 2579 generic.go:358] "Generic (PLEG): container finished" podID="6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f" containerID="5fb78404d7fc4437f063410b20df77f2792851c989c8ae1d46f5d1ddd4f8398a" exitCode=1
Apr 22 14:16:14.628888 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.628744 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerStarted","Data":"63785e958f4ff80e7f4732e8cdec141c34ef5598d20bd8d3899dbb2e7b8e74f6"}
Apr 22 14:16:14.628888 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.628780 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerStarted","Data":"791021f861ce170826d879d1aee1c8621546036ebc42a2265704272f9b31953d"}
Apr 22 14:16:14.628888 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.628794 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerStarted","Data":"83c64f4a85ea7df253becf3710f77edc7457977c5910fbe7d86563e9cd31bc26"}
Apr 22 14:16:14.628888 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.628803 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerStarted","Data":"623ab7212105db07f29aa1f737fb7fde79919497e605a6050b87eb4edd9825ff"}
Apr 22
14:16:14.628888 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.628811 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerDied","Data":"5fb78404d7fc4437f063410b20df77f2792851c989c8ae1d46f5d1ddd4f8398a"} Apr 22 14:16:14.629967 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.629943 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9gzdh" event={"ID":"b6ddc308-4c95-4675-8dae-ecfb2c08991e","Type":"ContainerStarted","Data":"ecb8ee30cf73edc776defc972fa464bec464dded2b02219e22bda0794760be29"} Apr 22 14:16:14.631176 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.631156 2579 generic.go:358] "Generic (PLEG): container finished" podID="bfa3b731-8569-4e90-9bb7-4b1898b25baa" containerID="5bd957d050e8a4af6f931f42b45eb1dbdb5b73bfb5363e63f6ed9f378a5569f3" exitCode=0 Apr 22 14:16:14.631264 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.631250 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" event={"ID":"bfa3b731-8569-4e90-9bb7-4b1898b25baa","Type":"ContainerDied","Data":"5bd957d050e8a4af6f931f42b45eb1dbdb5b73bfb5363e63f6ed9f378a5569f3"} Apr 22 14:16:14.646301 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.646265 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9gzdh" podStartSLOduration=4.566972452 podStartE2EDuration="21.646253468s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.23091028 +0000 UTC m=+3.206875700" lastFinishedPulling="2026-04-22 14:16:13.310191298 +0000 UTC m=+20.286156716" observedRunningTime="2026-04-22 14:16:14.645934403 +0000 UTC m=+21.621899845" watchObservedRunningTime="2026-04-22 14:16:14.646253468 +0000 UTC m=+21.622218908" Apr 22 14:16:14.971411 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:14.971388 2579 
plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 14:16:15.449913 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:15.449792 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:16:14.971402662Z","UUID":"f18cac87-01bc-44d3-ac6e-18ed6b197d12","Handler":null,"Name":"","Endpoint":""} Apr 22 14:16:15.453584 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:15.453560 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 14:16:15.453584 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:15.453592 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 14:16:15.513577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:15.513542 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:15.513751 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:15.513684 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d" Apr 22 14:16:15.634866 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:15.634831 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" event={"ID":"746fce17-4f0b-447d-b5e3-8a8e878c5498","Type":"ContainerStarted","Data":"db020646ac410aaf58fd62814c26277e07b268410c71ff187220c7d137a6e50b"} Apr 22 14:16:15.636488 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:15.636443 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2jw2j" event={"ID":"7a3b551c-cc42-4cf1-9c6a-1f8e29af3205","Type":"ContainerStarted","Data":"e1706db2be4efc563c487af195300f13d33a390afb793eb16324bfe848bf050a"} Apr 22 14:16:15.656214 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:15.656168 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2jw2j" podStartSLOduration=5.610155251 podStartE2EDuration="22.656153225s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.232924174 +0000 UTC m=+3.208889606" lastFinishedPulling="2026-04-22 14:16:13.278922162 +0000 UTC m=+20.254887580" observedRunningTime="2026-04-22 14:16:15.656039745 +0000 UTC m=+22.632005186" watchObservedRunningTime="2026-04-22 14:16:15.656153225 +0000 UTC m=+22.632118665" Apr 22 14:16:16.171546 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:16.171521 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l652r" Apr 22 14:16:16.172265 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:16.172235 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l652r" Apr 22 14:16:16.512812 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:16.512780 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:16.512983 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:16.512900 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a" Apr 22 14:16:16.642417 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:16.642334 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:16:16.642980 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:16.642745 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerStarted","Data":"db9e43c6b29603c41c6e58f1f843521043e8ed49f1ae01861472a2a797d14caf"} Apr 22 14:16:16.644817 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:16.644787 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" event={"ID":"746fce17-4f0b-447d-b5e3-8a8e878c5498","Type":"ContainerStarted","Data":"75dfd198518131a8108e8c3a318f5ea2ed119eec37843ab29486365ea4da8a25"} Apr 22 14:16:16.644982 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:16.644966 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l652r" Apr 22 14:16:16.645477 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:16.645429 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l652r" Apr 22 14:16:16.662127 ip-10-0-130-255 kubenswrapper[2579]: I0422 
14:16:16.662077 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-97qzw" podStartSLOduration=3.770192605 podStartE2EDuration="23.662065435s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.22626181 +0000 UTC m=+3.202227229" lastFinishedPulling="2026-04-22 14:16:16.118134638 +0000 UTC m=+23.094100059" observedRunningTime="2026-04-22 14:16:16.661545378 +0000 UTC m=+23.637510819" watchObservedRunningTime="2026-04-22 14:16:16.662065435 +0000 UTC m=+23.638030875" Apr 22 14:16:17.513497 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:17.513465 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:17.513688 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:17.513606 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d" Apr 22 14:16:18.512630 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:18.512597 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:18.513168 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:18.513106 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a" Apr 22 14:16:19.513052 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.513022 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:19.513408 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:19.513118 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d" Apr 22 14:16:19.652379 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.652345 2579 generic.go:358] "Generic (PLEG): container finished" podID="bfa3b731-8569-4e90-9bb7-4b1898b25baa" containerID="d68f709fa5f7342d69f7a0bbd56da7964409d5f56c8431612c6edf89707d1b45" exitCode=0 Apr 22 14:16:19.652541 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.652434 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" event={"ID":"bfa3b731-8569-4e90-9bb7-4b1898b25baa","Type":"ContainerDied","Data":"d68f709fa5f7342d69f7a0bbd56da7964409d5f56c8431612c6edf89707d1b45"} Apr 22 14:16:19.655599 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.655495 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:16:19.655792 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.655771 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" 
event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerStarted","Data":"50c360f3b2f96bbc9a2422a7e9df0b21f44ebf6ac2d1ffd9e0585c1646ab3fa9"} Apr 22 14:16:19.656082 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.656060 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:16:19.656192 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.656093 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:16:19.656192 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.656108 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:16:19.656295 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.656239 2579 scope.go:117] "RemoveContainer" containerID="5fb78404d7fc4437f063410b20df77f2792851c989c8ae1d46f5d1ddd4f8398a" Apr 22 14:16:19.670828 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.670806 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:16:19.671015 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:19.670983 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" Apr 22 14:16:20.512901 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.512724 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:20.513027 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:20.512966 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a" Apr 22 14:16:20.586488 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.586397 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2wdfd"] Apr 22 14:16:20.589424 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.589400 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9f46g"] Apr 22 14:16:20.589542 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.589529 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:20.589687 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:20.589655 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d" Apr 22 14:16:20.660827 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.660800 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:16:20.661166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.661139 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" event={"ID":"6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f","Type":"ContainerStarted","Data":"a0f0848bf353f1b98fe65b5ae173f54145234a85cdda6f349039157d967ea6d9"} Apr 22 14:16:20.665717 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.665690 2579 generic.go:358] "Generic (PLEG): container finished" podID="bfa3b731-8569-4e90-9bb7-4b1898b25baa" containerID="8e095d00d245123cc84c286e914822fa07a22241f6c86e72f321482845ef49a9" exitCode=0 Apr 22 14:16:20.665838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.665769 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:20.665838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.665766 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" event={"ID":"bfa3b731-8569-4e90-9bb7-4b1898b25baa","Type":"ContainerDied","Data":"8e095d00d245123cc84c286e914822fa07a22241f6c86e72f321482845ef49a9"} Apr 22 14:16:20.665951 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:20.665880 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a" Apr 22 14:16:20.692722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:20.692682 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb" podStartSLOduration=10.566810856 podStartE2EDuration="27.692668328s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.233509184 +0000 UTC m=+3.209474602" lastFinishedPulling="2026-04-22 14:16:13.359366654 +0000 UTC m=+20.335332074" observedRunningTime="2026-04-22 14:16:20.691249716 +0000 UTC m=+27.667215156" watchObservedRunningTime="2026-04-22 14:16:20.692668328 +0000 UTC m=+27.668633765" Apr 22 14:16:21.669780 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:21.669746 2579 generic.go:358] "Generic (PLEG): container finished" podID="bfa3b731-8569-4e90-9bb7-4b1898b25baa" containerID="d02f91705b282d7afcff36174f656aa4fbdd37fbf0601575ee701035ca9ec108" exitCode=0 Apr 22 14:16:21.670137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:21.669823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" event={"ID":"bfa3b731-8569-4e90-9bb7-4b1898b25baa","Type":"ContainerDied","Data":"d02f91705b282d7afcff36174f656aa4fbdd37fbf0601575ee701035ca9ec108"} Apr 22 14:16:22.513009 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:22.512979 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:22.513171 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:22.513013 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:22.513171 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:22.513104 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a" Apr 22 14:16:22.513281 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:22.513226 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d" Apr 22 14:16:24.513049 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:24.513013 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:24.513594 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:24.513013 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:24.513594 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:24.513135 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9f46g" podUID="11aa4ec9-8bc6-4d4d-b02e-6441451ad30d" Apr 22 14:16:24.513594 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:24.513212 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wdfd" podUID="ace85c9c-a18e-408d-b8d6-d5d2d99bc11a" Apr 22 14:16:26.391505 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.391235 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeReady" Apr 22 14:16:26.391929 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.391576 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 14:16:26.440685 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.440654 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bjv7t"] Apr 22 14:16:26.445782 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.445751 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8dbtw"] Apr 22 14:16:26.445944 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.445918 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.448546 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.448472 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 14:16:26.448546 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.448533 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 14:16:26.448780 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.448551 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7gvsw\"" Apr 22 14:16:26.448780 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.448541 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8dbtw" Apr 22 14:16:26.450968 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.450948 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:16:26.451098 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.450983 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:16:26.451098 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.450949 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:16:26.451098 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.451035 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-n9tgv\"" Apr 22 14:16:26.457816 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.456779 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bjv7t"] Apr 22 14:16:26.458470 ip-10-0-130-255 kubenswrapper[2579]: I0422 
14:16:26.457820 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8dbtw"] Apr 22 14:16:26.513546 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.513511 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:26.513712 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.513511 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:26.516382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.516350 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:16:26.516546 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.516394 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:16:26.516546 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.516415 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:16:26.516546 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.516487 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7xm6q\"" Apr 22 14:16:26.516546 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.516350 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4bnk6\"" Apr 22 14:16:26.527517 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.527437 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbvvf\" (UniqueName: \"kubernetes.io/projected/15b22104-ccf3-493a-b1a4-c4984e2ead3b-kube-api-access-qbvvf\") pod \"ingress-canary-8dbtw\" (UID: 
\"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw" Apr 22 14:16:26.527517 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.527514 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twqsl\" (UniqueName: \"kubernetes.io/projected/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-kube-api-access-twqsl\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.527692 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.527545 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw" Apr 22 14:16:26.527692 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.527603 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.527692 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.527630 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-config-volume\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.527692 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.527652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-tmp-dir\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.628150 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.628118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.628150 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.628153 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-config-volume\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.628383 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.628172 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-tmp-dir\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.628383 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.628195 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbvvf\" (UniqueName: \"kubernetes.io/projected/15b22104-ccf3-493a-b1a4-c4984e2ead3b-kube-api-access-qbvvf\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw" Apr 22 14:16:26.628383 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.628234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twqsl\" (UniqueName: 
\"kubernetes.io/projected/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-kube-api-access-twqsl\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.628383 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:26.628276 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:26.628383 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:26.628358 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls podName:7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:27.128337143 +0000 UTC m=+34.104302582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls") pod "dns-default-bjv7t" (UID: "7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69") : secret "dns-default-metrics-tls" not found Apr 22 14:16:26.628383 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:26.628357 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:26.628684 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:26.628397 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert podName:15b22104-ccf3-493a-b1a4-c4984e2ead3b nodeName:}" failed. No retries permitted until 2026-04-22 14:16:27.128387699 +0000 UTC m=+34.104353132 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert") pod "ingress-canary-8dbtw" (UID: "15b22104-ccf3-493a-b1a4-c4984e2ead3b") : secret "canary-serving-cert" not found Apr 22 14:16:26.628684 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.628280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw" Apr 22 14:16:26.628684 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.628663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-tmp-dir\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.629037 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.629021 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-config-volume\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.641808 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.641737 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twqsl\" (UniqueName: \"kubernetes.io/projected/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-kube-api-access-twqsl\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:26.641808 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:26.641793 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbvvf\" (UniqueName: 
\"kubernetes.io/projected/15b22104-ccf3-493a-b1a4-c4984e2ead3b-kube-api-access-qbvvf\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw" Apr 22 14:16:27.024468 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.024419 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ws9fs"] Apr 22 14:16:27.028944 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.028917 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.031743 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.031657 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 14:16:27.031743 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.031742 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 14:16:27.031928 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.031750 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 14:16:27.031928 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.031789 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4htz2\"" Apr 22 14:16:27.031928 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.031907 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 14:16:27.038619 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.038567 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ws9fs"] Apr 22 14:16:27.131072 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.131031 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:27.131072 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.131078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.131103 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-data-volume\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.131135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw" Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.131162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" 
Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.131188 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.131199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.131219 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.131239 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-crio-socket\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.131255 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.131257 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls podName:7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:28.131238934 +0000 UTC m=+35.107204368 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls") pod "dns-default-bjv7t" (UID: "7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69") : secret "dns-default-metrics-tls" not found Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.131305 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs podName:11aa4ec9-8bc6-4d4d-b02e-6441451ad30d nodeName:}" failed. No retries permitted until 2026-04-22 14:16:59.131286652 +0000 UTC m=+66.107252076 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs") pod "network-metrics-daemon-9f46g" (UID: "11aa4ec9-8bc6-4d4d-b02e-6441451ad30d") : secret "metrics-daemon-secret" not found Apr 22 14:16:27.131317 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.131324 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert podName:15b22104-ccf3-493a-b1a4-c4984e2ead3b nodeName:}" failed. No retries permitted until 2026-04-22 14:16:28.131314293 +0000 UTC m=+35.107279710 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert") pod "ingress-canary-8dbtw" (UID: "15b22104-ccf3-493a-b1a4-c4984e2ead3b") : secret "canary-serving-cert" not found Apr 22 14:16:27.131790 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.131346 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ql25\" (UniqueName: \"kubernetes.io/projected/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-kube-api-access-8ql25\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.231679 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.231644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.231679 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.231684 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.231908 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.231701 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-crio-socket\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " 
pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.231908 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.231720 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ql25\" (UniqueName: \"kubernetes.io/projected/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-kube-api-access-8ql25\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.231908 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.231769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-data-volume\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.231908 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.231810 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.231908 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.231889 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls podName:2a82bf3e-7d0c-4072-926d-b4ef0afaa57f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:27.731869729 +0000 UTC m=+34.707835173 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ws9fs" (UID: "2a82bf3e-7d0c-4072-926d-b4ef0afaa57f") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.231908 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.231888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-crio-socket\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.232294 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.232068 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-data-volume\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.232294 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.232178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.241705 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.241683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ql25\" (UniqueName: \"kubernetes.io/projected/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-kube-api-access-8ql25\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 
22 14:16:27.332472 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.332398 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtrc\" (UniqueName: \"kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc\") pod \"network-check-target-2wdfd\" (UID: \"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a\") " pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:27.335661 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.335636 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtrc\" (UniqueName: \"kubernetes.io/projected/ace85c9c-a18e-408d-b8d6-d5d2d99bc11a-kube-api-access-zrtrc\") pod \"network-check-target-2wdfd\" (UID: \"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a\") " pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:27.425923 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.425896 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:16:27.592243 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.592210 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2wdfd"] Apr 22 14:16:27.596745 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:16:27.596719 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podace85c9c_a18e_408d_b8d6_d5d2d99bc11a.slice/crio-d23b2e812f22ce1c378d37deceaa97a41dc728d2d24eee84a7820fb15aa3d3d4 WatchSource:0}: Error finding container d23b2e812f22ce1c378d37deceaa97a41dc728d2d24eee84a7820fb15aa3d3d4: Status 404 returned error can't find the container with id d23b2e812f22ce1c378d37deceaa97a41dc728d2d24eee84a7820fb15aa3d3d4 Apr 22 14:16:27.683776 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.683740 2579 generic.go:358] "Generic (PLEG): container finished" 
podID="bfa3b731-8569-4e90-9bb7-4b1898b25baa" containerID="c0b9a324fff02cc462d0a819dc01dc036a2b02d1816af2551eee0c9581f50a11" exitCode=0 Apr 22 14:16:27.683944 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.683816 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" event={"ID":"bfa3b731-8569-4e90-9bb7-4b1898b25baa","Type":"ContainerDied","Data":"c0b9a324fff02cc462d0a819dc01dc036a2b02d1816af2551eee0c9581f50a11"} Apr 22 14:16:27.684874 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.684850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2wdfd" event={"ID":"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a","Type":"ContainerStarted","Data":"d23b2e812f22ce1c378d37deceaa97a41dc728d2d24eee84a7820fb15aa3d3d4"} Apr 22 14:16:27.735137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:27.735096 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:27.735292 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.735259 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:27.735358 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:27.735352 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls podName:2a82bf3e-7d0c-4072-926d-b4ef0afaa57f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:28.735330674 +0000 UTC m=+35.711296136 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ws9fs" (UID: "2a82bf3e-7d0c-4072-926d-b4ef0afaa57f") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:28.137558 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:28.137474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:28.137558 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:28.137557 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw" Apr 22 14:16:28.137745 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:28.137632 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:28.137745 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:28.137682 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:28.137745 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:28.137697 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls podName:7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:30.137679968 +0000 UTC m=+37.113645389 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls") pod "dns-default-bjv7t" (UID: "7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69") : secret "dns-default-metrics-tls" not found Apr 22 14:16:28.137745 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:28.137723 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert podName:15b22104-ccf3-493a-b1a4-c4984e2ead3b nodeName:}" failed. No retries permitted until 2026-04-22 14:16:30.137712377 +0000 UTC m=+37.113677795 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert") pod "ingress-canary-8dbtw" (UID: "15b22104-ccf3-493a-b1a4-c4984e2ead3b") : secret "canary-serving-cert" not found Apr 22 14:16:28.690503 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:28.690443 2579 generic.go:358] "Generic (PLEG): container finished" podID="bfa3b731-8569-4e90-9bb7-4b1898b25baa" containerID="6daf15d33c5ab74bd8160e7174b899f1a92ed551f4dbd3e0ebd3dacef61e72a7" exitCode=0 Apr 22 14:16:28.691052 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:28.690520 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" event={"ID":"bfa3b731-8569-4e90-9bb7-4b1898b25baa","Type":"ContainerDied","Data":"6daf15d33c5ab74bd8160e7174b899f1a92ed551f4dbd3e0ebd3dacef61e72a7"} Apr 22 14:16:28.742803 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:28.742698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs" Apr 22 14:16:28.742933 ip-10-0-130-255 
kubenswrapper[2579]: E0422 14:16:28.742866 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 14:16:28.742997 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:28.742947 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls podName:2a82bf3e-7d0c-4072-926d-b4ef0afaa57f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:30.742926022 +0000 UTC m=+37.718891449 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ws9fs" (UID: "2a82bf3e-7d0c-4072-926d-b4ef0afaa57f") : secret "insights-runtime-extractor-tls" not found Apr 22 14:16:28.932321 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:28.932295 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9gzdh_b6ddc308-4c95-4675-8dae-ecfb2c08991e/dns-node-resolver/0.log" Apr 22 14:16:29.401163 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.401125 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xnbvm"] Apr 22 14:16:29.419996 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.419964 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xnbvm"] Apr 22 14:16:29.420135 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.420013 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xnbvm" Apr 22 14:16:29.422838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.422815 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 14:16:29.422970 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.422861 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 14:16:29.424015 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.423993 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 14:16:29.424015 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.424004 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-h844w\"" Apr 22 14:16:29.424155 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.423991 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 14:16:29.549003 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.548972 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/05691d28-fd74-49ec-8cee-4b35ceac661d-signing-key\") pod \"service-ca-865cb79987-xnbvm\" (UID: \"05691d28-fd74-49ec-8cee-4b35ceac661d\") " pod="openshift-service-ca/service-ca-865cb79987-xnbvm" Apr 22 14:16:29.549184 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.549014 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7kn8\" (UniqueName: \"kubernetes.io/projected/05691d28-fd74-49ec-8cee-4b35ceac661d-kube-api-access-d7kn8\") pod \"service-ca-865cb79987-xnbvm\" (UID: \"05691d28-fd74-49ec-8cee-4b35ceac661d\") " 
pod="openshift-service-ca/service-ca-865cb79987-xnbvm" Apr 22 14:16:29.549184 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.549156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/05691d28-fd74-49ec-8cee-4b35ceac661d-signing-cabundle\") pod \"service-ca-865cb79987-xnbvm\" (UID: \"05691d28-fd74-49ec-8cee-4b35ceac661d\") " pod="openshift-service-ca/service-ca-865cb79987-xnbvm" Apr 22 14:16:29.650258 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.650216 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/05691d28-fd74-49ec-8cee-4b35ceac661d-signing-cabundle\") pod \"service-ca-865cb79987-xnbvm\" (UID: \"05691d28-fd74-49ec-8cee-4b35ceac661d\") " pod="openshift-service-ca/service-ca-865cb79987-xnbvm" Apr 22 14:16:29.650417 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.650278 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/05691d28-fd74-49ec-8cee-4b35ceac661d-signing-key\") pod \"service-ca-865cb79987-xnbvm\" (UID: \"05691d28-fd74-49ec-8cee-4b35ceac661d\") " pod="openshift-service-ca/service-ca-865cb79987-xnbvm" Apr 22 14:16:29.650417 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.650306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7kn8\" (UniqueName: \"kubernetes.io/projected/05691d28-fd74-49ec-8cee-4b35ceac661d-kube-api-access-d7kn8\") pod \"service-ca-865cb79987-xnbvm\" (UID: \"05691d28-fd74-49ec-8cee-4b35ceac661d\") " pod="openshift-service-ca/service-ca-865cb79987-xnbvm" Apr 22 14:16:29.650971 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.650949 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/05691d28-fd74-49ec-8cee-4b35ceac661d-signing-cabundle\") pod \"service-ca-865cb79987-xnbvm\" (UID: \"05691d28-fd74-49ec-8cee-4b35ceac661d\") " pod="openshift-service-ca/service-ca-865cb79987-xnbvm"
Apr 22 14:16:29.654307 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.654240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/05691d28-fd74-49ec-8cee-4b35ceac661d-signing-key\") pod \"service-ca-865cb79987-xnbvm\" (UID: \"05691d28-fd74-49ec-8cee-4b35ceac661d\") " pod="openshift-service-ca/service-ca-865cb79987-xnbvm"
Apr 22 14:16:29.658987 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.658962 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7kn8\" (UniqueName: \"kubernetes.io/projected/05691d28-fd74-49ec-8cee-4b35ceac661d-kube-api-access-d7kn8\") pod \"service-ca-865cb79987-xnbvm\" (UID: \"05691d28-fd74-49ec-8cee-4b35ceac661d\") " pod="openshift-service-ca/service-ca-865cb79987-xnbvm"
Apr 22 14:16:29.696467 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.696421 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" event={"ID":"bfa3b731-8569-4e90-9bb7-4b1898b25baa","Type":"ContainerStarted","Data":"88d1e1996d067ea0417e5d2ceb936f0ee9c2db6c9188a4a3704b53efbb1fc515"}
Apr 22 14:16:29.720491 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.720261 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5v8nc" podStartSLOduration=5.788787603 podStartE2EDuration="36.720240076s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:15:56.229911288 +0000 UTC m=+3.205876712" lastFinishedPulling="2026-04-22 14:16:27.161363764 +0000 UTC m=+34.137329185" observedRunningTime="2026-04-22 14:16:29.718547175 +0000 UTC m=+36.694512618" watchObservedRunningTime="2026-04-22 14:16:29.720240076 +0000 UTC m=+36.696205516"
Apr 22 14:16:29.728884 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.728853 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xnbvm"
Apr 22 14:16:29.933215 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:29.933140 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2wcll_ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547/node-ca/0.log"
Apr 22 14:16:30.154757 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:30.154717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw"
Apr 22 14:16:30.154920 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:30.154802 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t"
Apr 22 14:16:30.154920 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:30.154823 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:30.154920 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:30.154892 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert podName:15b22104-ccf3-493a-b1a4-c4984e2ead3b nodeName:}" failed. No retries permitted until 2026-04-22 14:16:34.154872732 +0000 UTC m=+41.130838172 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert") pod "ingress-canary-8dbtw" (UID: "15b22104-ccf3-493a-b1a4-c4984e2ead3b") : secret "canary-serving-cert" not found
Apr 22 14:16:30.155065 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:30.154933 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:30.155065 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:30.154981 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls podName:7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:34.154969515 +0000 UTC m=+41.130934941 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls") pod "dns-default-bjv7t" (UID: "7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:30.562100 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:30.561921 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xnbvm"]
Apr 22 14:16:30.616595 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:16:30.616556 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05691d28_fd74_49ec_8cee_4b35ceac661d.slice/crio-17c8d31b4f71f05d7450e6538ec5b976de0805370228b2714c4aca3adbf0fc0c WatchSource:0}: Error finding container 17c8d31b4f71f05d7450e6538ec5b976de0805370228b2714c4aca3adbf0fc0c: Status 404 returned error can't find the container with id 17c8d31b4f71f05d7450e6538ec5b976de0805370228b2714c4aca3adbf0fc0c
Apr 22 14:16:30.699153 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:30.699088 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xnbvm" event={"ID":"05691d28-fd74-49ec-8cee-4b35ceac661d","Type":"ContainerStarted","Data":"17c8d31b4f71f05d7450e6538ec5b976de0805370228b2714c4aca3adbf0fc0c"}
Apr 22 14:16:30.759410 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:30.759385 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs"
Apr 22 14:16:30.759530 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:30.759517 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 14:16:30.759576 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:30.759567 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls podName:2a82bf3e-7d0c-4072-926d-b4ef0afaa57f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:34.759554383 +0000 UTC m=+41.735519801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ws9fs" (UID: "2a82bf3e-7d0c-4072-926d-b4ef0afaa57f") : secret "insights-runtime-extractor-tls" not found
Apr 22 14:16:31.702242 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:31.702206 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2wdfd" event={"ID":"ace85c9c-a18e-408d-b8d6-d5d2d99bc11a","Type":"ContainerStarted","Data":"f545870ba80ce6ea79a9b6e300e1c036de5cfa57e4c3e5b9ff915a7dd971176d"}
Apr 22 14:16:31.702684 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:31.702351 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2wdfd"
Apr 22 14:16:31.720491 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:31.720423 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2wdfd" podStartSLOduration=35.677879208 podStartE2EDuration="38.720403625s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:16:27.598606817 +0000 UTC m=+34.574572236" lastFinishedPulling="2026-04-22 14:16:30.641131231 +0000 UTC m=+37.617096653" observedRunningTime="2026-04-22 14:16:31.719248285 +0000 UTC m=+38.695213740" watchObservedRunningTime="2026-04-22 14:16:31.720403625 +0000 UTC m=+38.696369067"
Apr 22 14:16:33.707718 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:33.707679 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xnbvm" event={"ID":"05691d28-fd74-49ec-8cee-4b35ceac661d","Type":"ContainerStarted","Data":"40c1bfeb75cdea23ac4efc6cf56e7bc19d1b0ea4ccbf4bd832b1e819d46db2b3"}
Apr 22 14:16:33.730966 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:33.730917 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-xnbvm" podStartSLOduration=2.388091467 podStartE2EDuration="4.730902058s" podCreationTimestamp="2026-04-22 14:16:29 +0000 UTC" firstStartedPulling="2026-04-22 14:16:30.63115262 +0000 UTC m=+37.607118038" lastFinishedPulling="2026-04-22 14:16:32.973963196 +0000 UTC m=+39.949928629" observedRunningTime="2026-04-22 14:16:33.729613332 +0000 UTC m=+40.705578775" watchObservedRunningTime="2026-04-22 14:16:33.730902058 +0000 UTC m=+40.706867498"
Apr 22 14:16:34.184945 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:34.184847 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw"
Apr 22 14:16:34.185092 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:34.184942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t"
Apr 22 14:16:34.185092 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:34.185024 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:34.185092 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:34.185058 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:34.185250 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:34.185118 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert podName:15b22104-ccf3-493a-b1a4-c4984e2ead3b nodeName:}" failed. No retries permitted until 2026-04-22 14:16:42.185098114 +0000 UTC m=+49.161063539 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert") pod "ingress-canary-8dbtw" (UID: "15b22104-ccf3-493a-b1a4-c4984e2ead3b") : secret "canary-serving-cert" not found
Apr 22 14:16:34.185531 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:34.185515 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls podName:7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:42.185128903 +0000 UTC m=+49.161094326 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls") pod "dns-default-bjv7t" (UID: "7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:34.790262 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:34.790219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs"
Apr 22 14:16:34.790705 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:34.790397 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 14:16:34.790705 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:16:34.790478 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls podName:2a82bf3e-7d0c-4072-926d-b4ef0afaa57f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:42.790441886 +0000 UTC m=+49.766407324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ws9fs" (UID: "2a82bf3e-7d0c-4072-926d-b4ef0afaa57f") : secret "insights-runtime-extractor-tls" not found
Apr 22 14:16:42.245116 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.245075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw"
Apr 22 14:16:42.245745 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.245138 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t"
Apr 22 14:16:42.248081 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.248048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69-metrics-tls\") pod \"dns-default-bjv7t\" (UID: \"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69\") " pod="openshift-dns/dns-default-bjv7t"
Apr 22 14:16:42.248177 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.248156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15b22104-ccf3-493a-b1a4-c4984e2ead3b-cert\") pod \"ingress-canary-8dbtw\" (UID: \"15b22104-ccf3-493a-b1a4-c4984e2ead3b\") " pod="openshift-ingress-canary/ingress-canary-8dbtw"
Apr 22 14:16:42.359136 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.359096 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bjv7t"
Apr 22 14:16:42.366044 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.366021 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8dbtw"
Apr 22 14:16:42.498034 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.497961 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bjv7t"]
Apr 22 14:16:42.502601 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:16:42.502569 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ca0f0f5_3ecb_4a3e_ab6c_6cce9ed1ed69.slice/crio-167792fcdf946d2a98dc26ce8cd6b733e4131d2a293302f29639b43325f959ee WatchSource:0}: Error finding container 167792fcdf946d2a98dc26ce8cd6b733e4131d2a293302f29639b43325f959ee: Status 404 returned error can't find the container with id 167792fcdf946d2a98dc26ce8cd6b733e4131d2a293302f29639b43325f959ee
Apr 22 14:16:42.510845 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.510819 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8dbtw"]
Apr 22 14:16:42.514201 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:16:42.514177 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15b22104_ccf3_493a_b1a4_c4984e2ead3b.slice/crio-6814335ac895629a1625a10f8a03fa6a42786b3ab235866bd8e9377738ae54ad WatchSource:0}: Error finding container 6814335ac895629a1625a10f8a03fa6a42786b3ab235866bd8e9377738ae54ad: Status 404 returned error can't find the container with id 6814335ac895629a1625a10f8a03fa6a42786b3ab235866bd8e9377738ae54ad
Apr 22 14:16:42.724702 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.724666 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bjv7t" event={"ID":"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69","Type":"ContainerStarted","Data":"167792fcdf946d2a98dc26ce8cd6b733e4131d2a293302f29639b43325f959ee"}
Apr 22 14:16:42.725653 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.725628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8dbtw" event={"ID":"15b22104-ccf3-493a-b1a4-c4984e2ead3b","Type":"ContainerStarted","Data":"6814335ac895629a1625a10f8a03fa6a42786b3ab235866bd8e9377738ae54ad"}
Apr 22 14:16:42.848861 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.848769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs"
Apr 22 14:16:42.850965 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.850938 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a82bf3e-7d0c-4072-926d-b4ef0afaa57f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ws9fs\" (UID: \"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f\") " pod="openshift-insights/insights-runtime-extractor-ws9fs"
Apr 22 14:16:42.941246 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:42.941209 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ws9fs"
Apr 22 14:16:43.081730 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:43.081671 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ws9fs"]
Apr 22 14:16:43.086799 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:16:43.086765 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a82bf3e_7d0c_4072_926d_b4ef0afaa57f.slice/crio-fcbad26a392921691ff6a970da37f8e416938ebde19056bb538eac8f577dc367 WatchSource:0}: Error finding container fcbad26a392921691ff6a970da37f8e416938ebde19056bb538eac8f577dc367: Status 404 returned error can't find the container with id fcbad26a392921691ff6a970da37f8e416938ebde19056bb538eac8f577dc367
Apr 22 14:16:43.732573 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:43.732538 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ws9fs" event={"ID":"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f","Type":"ContainerStarted","Data":"6226a9d63a561bbd05d41db7470e4e79ec303cfbb7e69dcf3514ed1ce9109178"}
Apr 22 14:16:43.732981 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:43.732580 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ws9fs" event={"ID":"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f","Type":"ContainerStarted","Data":"fcbad26a392921691ff6a970da37f8e416938ebde19056bb538eac8f577dc367"}
Apr 22 14:16:45.738753 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:45.738716 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bjv7t" event={"ID":"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69","Type":"ContainerStarted","Data":"92d290b77f859983b8e775a31304548757caed9e0edcfe1006b5e7d80fc70be2"}
Apr 22 14:16:45.738753 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:45.738755 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bjv7t" event={"ID":"7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69","Type":"ContainerStarted","Data":"411c2e94bdcd2625482282a595620c45a0a20ccda77eeb6373441013718d1365"}
Apr 22 14:16:45.739208 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:45.738821 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bjv7t"
Apr 22 14:16:45.740415 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:45.740392 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ws9fs" event={"ID":"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f","Type":"ContainerStarted","Data":"5d8b6557254973431961b619a354051667b2d991ef07ea6b6186a4825c6b286d"}
Apr 22 14:16:45.741653 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:45.741630 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8dbtw" event={"ID":"15b22104-ccf3-493a-b1a4-c4984e2ead3b","Type":"ContainerStarted","Data":"26c6e13404801c2faa60da6b84f38be0c149f5e45fb97c1b598de380852f3dbf"}
Apr 22 14:16:45.761754 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:45.761703 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bjv7t" podStartSLOduration=16.924493073 podStartE2EDuration="19.761690238s" podCreationTimestamp="2026-04-22 14:16:26 +0000 UTC" firstStartedPulling="2026-04-22 14:16:42.504544242 +0000 UTC m=+49.480509659" lastFinishedPulling="2026-04-22 14:16:45.341741389 +0000 UTC m=+52.317706824" observedRunningTime="2026-04-22 14:16:45.761406362 +0000 UTC m=+52.737371802" watchObservedRunningTime="2026-04-22 14:16:45.761690238 +0000 UTC m=+52.737655678"
Apr 22 14:16:45.779784 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:45.779741 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8dbtw" podStartSLOduration=16.95369043 podStartE2EDuration="19.779727041s" podCreationTimestamp="2026-04-22 14:16:26 +0000 UTC" firstStartedPulling="2026-04-22 14:16:42.515871795 +0000 UTC m=+49.491837213" lastFinishedPulling="2026-04-22 14:16:45.341908401 +0000 UTC m=+52.317873824" observedRunningTime="2026-04-22 14:16:45.779092569 +0000 UTC m=+52.755058009" watchObservedRunningTime="2026-04-22 14:16:45.779727041 +0000 UTC m=+52.755692480"
Apr 22 14:16:47.748544 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:47.748507 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ws9fs" event={"ID":"2a82bf3e-7d0c-4072-926d-b4ef0afaa57f","Type":"ContainerStarted","Data":"a29a637945bbc60101c2916b44a3a6413ff8a8deef98a85a9e6b00d3bafed5d1"}
Apr 22 14:16:47.770708 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:47.770658 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ws9fs" podStartSLOduration=16.731142741 podStartE2EDuration="20.770644876s" podCreationTimestamp="2026-04-22 14:16:27 +0000 UTC" firstStartedPulling="2026-04-22 14:16:43.217668859 +0000 UTC m=+50.193634281" lastFinishedPulling="2026-04-22 14:16:47.257170997 +0000 UTC m=+54.233136416" observedRunningTime="2026-04-22 14:16:47.769802753 +0000 UTC m=+54.745768195" watchObservedRunningTime="2026-04-22 14:16:47.770644876 +0000 UTC m=+54.746610316"
Apr 22 14:16:51.680861 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.680832 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ttkqb"
Apr 22 14:16:51.709608 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.709578 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-4cwk9"]
Apr 22 14:16:51.712936 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.712919 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4cwk9"
Apr 22 14:16:51.715825 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.715789 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 14:16:51.715825 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.715817 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-x25ht\""
Apr 22 14:16:51.715986 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.715817 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 14:16:51.725983 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.725964 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4cwk9"]
Apr 22 14:16:51.809895 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.809857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxn9w\" (UniqueName: \"kubernetes.io/projected/17fffa76-898d-4138-9000-2a55d4ce4960-kube-api-access-hxn9w\") pod \"downloads-6bcc868b7-4cwk9\" (UID: \"17fffa76-898d-4138-9000-2a55d4ce4960\") " pod="openshift-console/downloads-6bcc868b7-4cwk9"
Apr 22 14:16:51.856389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.856355 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7ff5958d8f-z6c4r"]
Apr 22 14:16:51.859660 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.859643 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:51.862493 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.862467 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 14:16:51.862493 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.862484 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 14:16:51.862682 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.862488 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fd7vw\""
Apr 22 14:16:51.863235 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.863219 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 14:16:51.870994 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.870977 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 14:16:51.874042 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.874023 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7ff5958d8f-z6c4r"]
Apr 22 14:16:51.910628 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.910601 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxn9w\" (UniqueName: \"kubernetes.io/projected/17fffa76-898d-4138-9000-2a55d4ce4960-kube-api-access-hxn9w\") pod \"downloads-6bcc868b7-4cwk9\" (UID: \"17fffa76-898d-4138-9000-2a55d4ce4960\") " pod="openshift-console/downloads-6bcc868b7-4cwk9"
Apr 22 14:16:51.921737 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:51.921708 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxn9w\" (UniqueName: \"kubernetes.io/projected/17fffa76-898d-4138-9000-2a55d4ce4960-kube-api-access-hxn9w\") pod \"downloads-6bcc868b7-4cwk9\" (UID: \"17fffa76-898d-4138-9000-2a55d4ce4960\") " pod="openshift-console/downloads-6bcc868b7-4cwk9"
Apr 22 14:16:52.011307 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.011274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79d69b78-10e2-4e01-9575-73c2f9dcfc79-trusted-ca\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.011307 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.011312 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79d69b78-10e2-4e01-9575-73c2f9dcfc79-bound-sa-token\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.011535 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.011358 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79d69b78-10e2-4e01-9575-73c2f9dcfc79-registry-tls\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.011535 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.011376 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79d69b78-10e2-4e01-9575-73c2f9dcfc79-registry-certificates\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.011535 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.011397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79d69b78-10e2-4e01-9575-73c2f9dcfc79-ca-trust-extracted\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.011535 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.011422 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hvlg\" (UniqueName: \"kubernetes.io/projected/79d69b78-10e2-4e01-9575-73c2f9dcfc79-kube-api-access-5hvlg\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.011535 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.011491 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79d69b78-10e2-4e01-9575-73c2f9dcfc79-image-registry-private-configuration\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.011535 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.011515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79d69b78-10e2-4e01-9575-73c2f9dcfc79-installation-pull-secrets\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.021317 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.021292 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4cwk9"
Apr 22 14:16:52.111841 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.111809 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79d69b78-10e2-4e01-9575-73c2f9dcfc79-ca-trust-extracted\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.111841 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.111843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hvlg\" (UniqueName: \"kubernetes.io/projected/79d69b78-10e2-4e01-9575-73c2f9dcfc79-kube-api-access-5hvlg\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.112071 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.111868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79d69b78-10e2-4e01-9575-73c2f9dcfc79-image-registry-private-configuration\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.112071 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.111885 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79d69b78-10e2-4e01-9575-73c2f9dcfc79-installation-pull-secrets\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.112071 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.111911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79d69b78-10e2-4e01-9575-73c2f9dcfc79-trusted-ca\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.112071 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.111934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79d69b78-10e2-4e01-9575-73c2f9dcfc79-bound-sa-token\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.112071 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.111976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79d69b78-10e2-4e01-9575-73c2f9dcfc79-registry-tls\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.112071 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.111999 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79d69b78-10e2-4e01-9575-73c2f9dcfc79-registry-certificates\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.112368 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.112232 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79d69b78-10e2-4e01-9575-73c2f9dcfc79-ca-trust-extracted\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.113155 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.113129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79d69b78-10e2-4e01-9575-73c2f9dcfc79-registry-certificates\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.113390 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.113348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79d69b78-10e2-4e01-9575-73c2f9dcfc79-trusted-ca\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.114532 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.114444 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79d69b78-10e2-4e01-9575-73c2f9dcfc79-image-registry-private-configuration\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.114657 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.114591 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79d69b78-10e2-4e01-9575-73c2f9dcfc79-installation-pull-secrets\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.114838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.114814 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79d69b78-10e2-4e01-9575-73c2f9dcfc79-registry-tls\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.123058 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.123036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79d69b78-10e2-4e01-9575-73c2f9dcfc79-bound-sa-token\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.123193 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.123176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hvlg\" (UniqueName: \"kubernetes.io/projected/79d69b78-10e2-4e01-9575-73c2f9dcfc79-kube-api-access-5hvlg\") pod \"image-registry-7ff5958d8f-z6c4r\" (UID: \"79d69b78-10e2-4e01-9575-73c2f9dcfc79\") " pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r"
Apr 22 14:16:52.139413 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.139387 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4cwk9"]
Apr 22 14:16:52.143051 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:16:52.143028 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17fffa76_898d_4138_9000_2a55d4ce4960.slice/crio-1dab78d6a08e2db27a4d6035eac8d6841d1f63a444a717154a48253aa7650980 WatchSource:0}: Error finding container 1dab78d6a08e2db27a4d6035eac8d6841d1f63a444a717154a48253aa7650980: Status 404 returned error can't find the container with id 1dab78d6a08e2db27a4d6035eac8d6841d1f63a444a717154a48253aa7650980
Apr 22 14:16:52.168920 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.168895 2579 util.go:30] "No sandbox for
pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r" Apr 22 14:16:52.284311 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.284227 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7ff5958d8f-z6c4r"] Apr 22 14:16:52.287115 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:16:52.287085 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d69b78_10e2_4e01_9575_73c2f9dcfc79.slice/crio-184e0067fc89066038d8f5573f5ef805e1d317e932efa8c613dc92b8eef30b71 WatchSource:0}: Error finding container 184e0067fc89066038d8f5573f5ef805e1d317e932efa8c613dc92b8eef30b71: Status 404 returned error can't find the container with id 184e0067fc89066038d8f5573f5ef805e1d317e932efa8c613dc92b8eef30b71 Apr 22 14:16:52.765045 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.765005 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4cwk9" event={"ID":"17fffa76-898d-4138-9000-2a55d4ce4960","Type":"ContainerStarted","Data":"1dab78d6a08e2db27a4d6035eac8d6841d1f63a444a717154a48253aa7650980"} Apr 22 14:16:52.766611 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.766583 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r" event={"ID":"79d69b78-10e2-4e01-9575-73c2f9dcfc79","Type":"ContainerStarted","Data":"00647b1f366e6d53fe7a27a3b80e36fb2eb414a16bfd7a09236eb432f1915689"} Apr 22 14:16:52.766611 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.766611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r" event={"ID":"79d69b78-10e2-4e01-9575-73c2f9dcfc79","Type":"ContainerStarted","Data":"184e0067fc89066038d8f5573f5ef805e1d317e932efa8c613dc92b8eef30b71"} Apr 22 14:16:52.766791 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.766748 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r" Apr 22 14:16:52.797057 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:52.797008 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r" podStartSLOduration=1.796988098 podStartE2EDuration="1.796988098s" podCreationTimestamp="2026-04-22 14:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:16:52.795698871 +0000 UTC m=+59.771664312" watchObservedRunningTime="2026-04-22 14:16:52.796988098 +0000 UTC m=+59.772953540" Apr 22 14:16:53.181066 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.180977 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56d4fb7f84-k98cp"] Apr 22 14:16:53.185932 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.185906 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.188844 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.188823 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 14:16:53.188844 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.188832 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 14:16:53.189054 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.188875 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 14:16:53.189054 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.188888 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 14:16:53.189170 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.189146 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 14:16:53.189937 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.189916 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bhrwc\"" Apr 22 14:16:53.193555 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.193532 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d4fb7f84-k98cp"] Apr 22 14:16:53.321664 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.321624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-oauth-serving-cert\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.321664 ip-10-0-130-255 kubenswrapper[2579]: 
I0422 14:16:53.321672 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-serving-cert\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.321870 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.321692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-console-config\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.321870 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.321731 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-service-ca\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.321870 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.321777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdv7b\" (UniqueName: \"kubernetes.io/projected/0252c970-3240-4c6a-ba9f-f19024134322-kube-api-access-vdv7b\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.321870 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.321804 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-oauth-config\") pod \"console-56d4fb7f84-k98cp\" 
(UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.423132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.423104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-oauth-serving-cert\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.423275 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.423144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-serving-cert\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.423275 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.423161 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-console-config\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.423275 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.423204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-service-ca\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.423275 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.423251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdv7b\" (UniqueName: 
\"kubernetes.io/projected/0252c970-3240-4c6a-ba9f-f19024134322-kube-api-access-vdv7b\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.423480 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.423291 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-oauth-config\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.426797 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.426778 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 14:16:53.426927 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.426818 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 14:16:53.426927 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.426876 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 14:16:53.426927 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.426892 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 14:16:53.426927 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.426897 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 14:16:53.432669 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.432620 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdv7b\" (UniqueName: \"kubernetes.io/projected/0252c970-3240-4c6a-ba9f-f19024134322-kube-api-access-vdv7b\") pod \"console-56d4fb7f84-k98cp\" (UID: 
\"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.434761 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.434738 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-oauth-serving-cert\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.434849 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.434808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-service-ca\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.434905 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.434858 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-console-config\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.436331 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.436308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-oauth-config\") pod \"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.436444 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.436427 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-serving-cert\") pod 
\"console-56d4fb7f84-k98cp\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") " pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.500866 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.500833 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bhrwc\"" Apr 22 14:16:53.509065 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.509034 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:16:53.644794 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.644755 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d4fb7f84-k98cp"] Apr 22 14:16:53.648025 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:16:53.647993 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0252c970_3240_4c6a_ba9f_f19024134322.slice/crio-6b43475cd7ea1d041a2c2707d6a8fa5d5e6b38f9e1fc29d769205914d6392217 WatchSource:0}: Error finding container 6b43475cd7ea1d041a2c2707d6a8fa5d5e6b38f9e1fc29d769205914d6392217: Status 404 returned error can't find the container with id 6b43475cd7ea1d041a2c2707d6a8fa5d5e6b38f9e1fc29d769205914d6392217 Apr 22 14:16:53.771471 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:53.771333 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d4fb7f84-k98cp" event={"ID":"0252c970-3240-4c6a-ba9f-f19024134322","Type":"ContainerStarted","Data":"6b43475cd7ea1d041a2c2707d6a8fa5d5e6b38f9e1fc29d769205914d6392217"} Apr 22 14:16:55.746903 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:55.746873 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bjv7t" Apr 22 14:16:57.786283 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:57.786232 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-56d4fb7f84-k98cp" event={"ID":"0252c970-3240-4c6a-ba9f-f19024134322","Type":"ContainerStarted","Data":"5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896"} Apr 22 14:16:57.804151 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:57.804091 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56d4fb7f84-k98cp" podStartSLOduration=1.572221307 podStartE2EDuration="4.804072138s" podCreationTimestamp="2026-04-22 14:16:53 +0000 UTC" firstStartedPulling="2026-04-22 14:16:53.650146863 +0000 UTC m=+60.626112289" lastFinishedPulling="2026-04-22 14:16:56.881997688 +0000 UTC m=+63.857963120" observedRunningTime="2026-04-22 14:16:57.804050737 +0000 UTC m=+64.780016177" watchObservedRunningTime="2026-04-22 14:16:57.804072138 +0000 UTC m=+64.780037579" Apr 22 14:16:59.169791 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:59.169749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:59.172255 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:59.172226 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11aa4ec9-8bc6-4d4d-b02e-6441451ad30d-metrics-certs\") pod \"network-metrics-daemon-9f46g\" (UID: \"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d\") " pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:59.233560 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:59.233526 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4bnk6\"" Apr 22 14:16:59.241805 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:59.241769 2579 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9f46g" Apr 22 14:16:59.380798 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:59.380756 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9f46g"] Apr 22 14:16:59.384555 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:16:59.384523 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11aa4ec9_8bc6_4d4d_b02e_6441451ad30d.slice/crio-38b532ee9adc374f8c2d06a3af9a47317a24c7135ba8f9a7669f9222884c7f92 WatchSource:0}: Error finding container 38b532ee9adc374f8c2d06a3af9a47317a24c7135ba8f9a7669f9222884c7f92: Status 404 returned error can't find the container with id 38b532ee9adc374f8c2d06a3af9a47317a24c7135ba8f9a7669f9222884c7f92 Apr 22 14:16:59.792699 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:16:59.792647 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9f46g" event={"ID":"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d","Type":"ContainerStarted","Data":"38b532ee9adc374f8c2d06a3af9a47317a24c7135ba8f9a7669f9222884c7f92"} Apr 22 14:17:01.801269 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:01.801231 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9f46g" event={"ID":"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d","Type":"ContainerStarted","Data":"2d64bca4fb6782e0656150461f8ea539a63a4a8ab6c6089276a34e6b7f104113"} Apr 22 14:17:01.801750 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:01.801277 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9f46g" event={"ID":"11aa4ec9-8bc6-4d4d-b02e-6441451ad30d","Type":"ContainerStarted","Data":"0f71dd9d4a2f57a433bc9755f6a6cceb152a979128f698ddf388cfaf97f62bc5"} Apr 22 14:17:01.821990 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:01.821928 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-9f46g" podStartSLOduration=67.561514598 podStartE2EDuration="1m8.82190848s" podCreationTimestamp="2026-04-22 14:15:53 +0000 UTC" firstStartedPulling="2026-04-22 14:16:59.386989085 +0000 UTC m=+66.362954507" lastFinishedPulling="2026-04-22 14:17:00.647382956 +0000 UTC m=+67.623348389" observedRunningTime="2026-04-22 14:17:01.820597454 +0000 UTC m=+68.796562894" watchObservedRunningTime="2026-04-22 14:17:01.82190848 +0000 UTC m=+68.797873921" Apr 22 14:17:02.707235 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:02.707204 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2wdfd" Apr 22 14:17:03.509656 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:03.509622 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:17:03.510091 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:03.509671 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:17:03.516883 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:03.516859 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:17:03.813344 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:03.813266 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56d4fb7f84-k98cp" Apr 22 14:17:09.827504 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:09.827443 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4cwk9" event={"ID":"17fffa76-898d-4138-9000-2a55d4ce4960","Type":"ContainerStarted","Data":"71af6b1b5f8b2bde5c35ebd960e2d9ec08ea4860dc59778ac5beb3f93857cce9"} Apr 22 14:17:09.827955 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:09.827683 2579 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-4cwk9" Apr 22 14:17:09.838708 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:09.838674 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-4cwk9" Apr 22 14:17:09.848432 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:09.848373 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-4cwk9" podStartSLOduration=2.100379936 podStartE2EDuration="18.848355757s" podCreationTimestamp="2026-04-22 14:16:51 +0000 UTC" firstStartedPulling="2026-04-22 14:16:52.144762413 +0000 UTC m=+59.120727830" lastFinishedPulling="2026-04-22 14:17:08.892738232 +0000 UTC m=+75.868703651" observedRunningTime="2026-04-22 14:17:09.846690504 +0000 UTC m=+76.822655945" watchObservedRunningTime="2026-04-22 14:17:09.848355757 +0000 UTC m=+76.824321198" Apr 22 14:17:13.776590 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:13.776556 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7ff5958d8f-z6c4r" Apr 22 14:17:14.139072 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:14.138988 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d4fb7f84-k98cp"] Apr 22 14:17:19.445283 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.445247 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-46kdf"] Apr 22 14:17:19.508762 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.508730 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-46kdf"] Apr 22 14:17:19.508927 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.508860 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.511792 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.511761 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 14:17:19.511983 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.511950 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 14:17:19.512960 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.512934 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 14:17:19.512960 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.512953 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 14:17:19.513113 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.513095 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 14:17:19.513219 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.513205 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-jnmdj\"" Apr 22 14:17:19.599531 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.599494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07ac2127-6b42-4a9f-88f5-cc04565a32dd-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.599710 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.599560 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07ac2127-6b42-4a9f-88f5-cc04565a32dd-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.599710 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.599654 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/07ac2127-6b42-4a9f-88f5-cc04565a32dd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.599710 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.599683 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mcns\" (UniqueName: \"kubernetes.io/projected/07ac2127-6b42-4a9f-88f5-cc04565a32dd-kube-api-access-4mcns\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.700414 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.700326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07ac2127-6b42-4a9f-88f5-cc04565a32dd-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.700414 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.700403 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/07ac2127-6b42-4a9f-88f5-cc04565a32dd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.700659 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.700503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mcns\" (UniqueName: \"kubernetes.io/projected/07ac2127-6b42-4a9f-88f5-cc04565a32dd-kube-api-access-4mcns\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.700659 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.700552 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07ac2127-6b42-4a9f-88f5-cc04565a32dd-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.701154 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.701128 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07ac2127-6b42-4a9f-88f5-cc04565a32dd-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.702731 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.702713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/07ac2127-6b42-4a9f-88f5-cc04565a32dd-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.702805 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.702729 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07ac2127-6b42-4a9f-88f5-cc04565a32dd-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.709263 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.709241 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mcns\" (UniqueName: \"kubernetes.io/projected/07ac2127-6b42-4a9f-88f5-cc04565a32dd-kube-api-access-4mcns\") pod \"prometheus-operator-5676c8c784-46kdf\" (UID: \"07ac2127-6b42-4a9f-88f5-cc04565a32dd\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.822512 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.822478 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" Apr 22 14:17:19.956437 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:19.956346 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-46kdf"] Apr 22 14:17:19.959066 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:17:19.959043 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07ac2127_6b42_4a9f_88f5_cc04565a32dd.slice/crio-9cb0540ed8cf350d2ceb8d65455657b36bf595dc66cb47f81f185ce2958babec WatchSource:0}: Error finding container 9cb0540ed8cf350d2ceb8d65455657b36bf595dc66cb47f81f185ce2958babec: Status 404 returned error can't find the container with id 9cb0540ed8cf350d2ceb8d65455657b36bf595dc66cb47f81f185ce2958babec Apr 22 14:17:20.860326 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:20.860284 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" event={"ID":"07ac2127-6b42-4a9f-88f5-cc04565a32dd","Type":"ContainerStarted","Data":"9cb0540ed8cf350d2ceb8d65455657b36bf595dc66cb47f81f185ce2958babec"} Apr 22 14:17:22.867528 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:22.867494 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" event={"ID":"07ac2127-6b42-4a9f-88f5-cc04565a32dd","Type":"ContainerStarted","Data":"a3a4a203417c0abe8fb48aa0902351de08849b85c0cfebcde981ec63c5f67ec6"} Apr 22 14:17:22.867528 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:22.867531 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" event={"ID":"07ac2127-6b42-4a9f-88f5-cc04565a32dd","Type":"ContainerStarted","Data":"53b6183d75e58fe79c927e4c7788f71e0624124ce51b0b7899982dcf8a54d723"} Apr 22 14:17:22.885392 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:22.885328 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-46kdf" podStartSLOduration=1.9747046190000002 podStartE2EDuration="3.885308187s" podCreationTimestamp="2026-04-22 14:17:19 +0000 UTC" firstStartedPulling="2026-04-22 14:17:19.960931271 +0000 UTC m=+86.936896690" lastFinishedPulling="2026-04-22 14:17:21.871534836 +0000 UTC m=+88.847500258" observedRunningTime="2026-04-22 14:17:22.885063319 +0000 UTC m=+89.861028752" watchObservedRunningTime="2026-04-22 14:17:22.885308187 +0000 UTC m=+89.861273627" Apr 22 14:17:25.045394 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.045359 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9zhfn"] Apr 22 14:17:25.067388 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.067354 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nxqcw"] Apr 22 14:17:25.067618 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.067556 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.070287 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.070260 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 14:17:25.070564 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.070541 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 14:17:25.070810 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.070725 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 14:17:25.071951 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.071685 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-8mdqp\"" Apr 22 14:17:25.082843 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.082817 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9zhfn"] Apr 22 14:17:25.082965 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.082940 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.085640 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.085618 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 14:17:25.086005 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.085978 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s69rl\"" Apr 22 14:17:25.086102 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.086027 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 14:17:25.087104 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.087085 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 14:17:25.144318 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144282 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-sys\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.144318 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.144580 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144356 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-m2js7\" (UniqueName: \"kubernetes.io/projected/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-api-access-m2js7\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.144580 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.144580 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144506 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-tls\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.144580 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144532 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l5sj\" (UniqueName: \"kubernetes.io/projected/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-kube-api-access-9l5sj\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.144784 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-textfile\") pod \"node-exporter-nxqcw\" (UID: 
\"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.144784 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144633 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-accelerators-collector-config\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.144784 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144671 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.144784 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144741 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.144784 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.145038 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144842 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.145038 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144871 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-root\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.145038 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144923 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-wtmp\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.145038 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.144948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-metrics-client-ca\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.245790 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.245751 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.245790 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.245796 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-tls\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246087 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.245820 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l5sj\" (UniqueName: \"kubernetes.io/projected/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-kube-api-access-9l5sj\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246087 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.245918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-textfile\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246087 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.245979 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-accelerators-collector-config\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246087 ip-10-0-130-255 
kubenswrapper[2579]: I0422 14:17:25.246024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.246087 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246351 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246088 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.246351 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.246351 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"root\" (UniqueName: \"kubernetes.io/host-path/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-root\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246351 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246175 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-wtmp\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246351 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-metrics-client-ca\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246351 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-sys\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246351 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-textfile\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246687 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246312 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.246687 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246396 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2js7\" (UniqueName: \"kubernetes.io/projected/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-api-access-m2js7\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.246687 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.246687 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:17:25.246551 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 14:17:25.246687 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-wtmp\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.246687 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:17:25.246624 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-tls podName:36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f nodeName:}" failed. No retries permitted until 2026-04-22 14:17:25.746597068 +0000 UTC m=+92.722562490 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-9zhfn" (UID: "36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f") : secret "kube-state-metrics-tls" not found Apr 22 14:17:25.246687 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246673 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-root\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.247028 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.246881 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-sys\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.247485 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.247442 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-metrics-client-ca\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.247577 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.247511 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-accelerators-collector-config\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.247730 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.247709 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.248720 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.248693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-tls\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.249112 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.249089 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.249611 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.249594 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.254743 
ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.254717 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.261401 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.261373 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l5sj\" (UniqueName: \"kubernetes.io/projected/c4ab8bf9-0890-4bcc-af6a-32e0e67eed93-kube-api-access-9l5sj\") pod \"node-exporter-nxqcw\" (UID: \"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93\") " pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.263658 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.263631 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2js7\" (UniqueName: \"kubernetes.io/projected/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-api-access-m2js7\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.392298 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.392211 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nxqcw" Apr 22 14:17:25.402425 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:17:25.402399 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4ab8bf9_0890_4bcc_af6a_32e0e67eed93.slice/crio-b07fd773f4c484f36ea6d40d01d4824172a3bc271b1448493628d196c4cc7adb WatchSource:0}: Error finding container b07fd773f4c484f36ea6d40d01d4824172a3bc271b1448493628d196c4cc7adb: Status 404 returned error can't find the container with id b07fd773f4c484f36ea6d40d01d4824172a3bc271b1448493628d196c4cc7adb Apr 22 14:17:25.750055 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.750024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.752301 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.752274 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9zhfn\" (UID: \"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" Apr 22 14:17:25.879220 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.879186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nxqcw" event={"ID":"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93","Type":"ContainerStarted","Data":"b07fd773f4c484f36ea6d40d01d4824172a3bc271b1448493628d196c4cc7adb"} Apr 22 14:17:25.979058 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:25.978610 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn"
Apr 22 14:17:26.011501 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.010961 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:17:26.040002 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.039963 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:17:26.040179 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.040124 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.046557 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.046132 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 14:17:26.046557 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.046204 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 14:17:26.046557 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.046132 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 14:17:26.047636 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.047377 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 14:17:26.047636 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.047381 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 14:17:26.047636 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.047538 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 14:17:26.049477 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.048702 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-68fqf\""
Apr 22 14:17:26.050720 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.049927 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 14:17:26.050720 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.050434 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 14:17:26.064865 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.064604 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 14:17:26.141964 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.141936 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9zhfn"]
Apr 22 14:17:26.154114 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154085 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-web-config\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154274 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154126 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31dc4779-192f-40e7-acf5-dcf542039bd6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154274 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154152 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31dc4779-192f-40e7-acf5-dcf542039bd6-config-out\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154274 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154221 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31dc4779-192f-40e7-acf5-dcf542039bd6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154274 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k86m\" (UniqueName: \"kubernetes.io/projected/31dc4779-192f-40e7-acf5-dcf542039bd6-kube-api-access-5k86m\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154472 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154284 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31dc4779-192f-40e7-acf5-dcf542039bd6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154472 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154321 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154472 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31dc4779-192f-40e7-acf5-dcf542039bd6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154472 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154630 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154489 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154630 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154527 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154630 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154559 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-config-volume\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.154630 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.154605 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.255475 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.255475 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.255475 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-config-volume\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.255475 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255329 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.255475 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255358 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-web-config\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.255475 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255384 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31dc4779-192f-40e7-acf5-dcf542039bd6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.255475 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31dc4779-192f-40e7-acf5-dcf542039bd6-config-out\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.256175 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31dc4779-192f-40e7-acf5-dcf542039bd6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.256175 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k86m\" (UniqueName: \"kubernetes.io/projected/31dc4779-192f-40e7-acf5-dcf542039bd6-kube-api-access-5k86m\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.256175 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.256025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31dc4779-192f-40e7-acf5-dcf542039bd6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.256175 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.255942 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31dc4779-192f-40e7-acf5-dcf542039bd6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.256175 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.256056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.256175 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.256111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31dc4779-192f-40e7-acf5-dcf542039bd6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.256550 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.256187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.258419 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.258386 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31dc4779-192f-40e7-acf5-dcf542039bd6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.258601 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.258578 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31dc4779-192f-40e7-acf5-dcf542039bd6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.258829 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.258804 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-config-volume\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.259081 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.259060 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31dc4779-192f-40e7-acf5-dcf542039bd6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.259710 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.259684 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.259852 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.259826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.259980 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.259932 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31dc4779-192f-40e7-acf5-dcf542039bd6-config-out\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.260582 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.260558 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.260681 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.260616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.260985 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.260964 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-web-config\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.262194 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.262136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31dc4779-192f-40e7-acf5-dcf542039bd6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.268070 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.268048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k86m\" (UniqueName: \"kubernetes.io/projected/31dc4779-192f-40e7-acf5-dcf542039bd6-kube-api-access-5k86m\") pod \"alertmanager-main-0\" (UID: \"31dc4779-192f-40e7-acf5-dcf542039bd6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.322610 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:17:26.322573 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e0bfc7_28c8_4bec_bcaa_5e1e03b1566f.slice/crio-cd1ca81142366a0a5cde6c6b8e68aa82ca1c475544e69aa5a78973cdb2fae287 WatchSource:0}: Error finding container cd1ca81142366a0a5cde6c6b8e68aa82ca1c475544e69aa5a78973cdb2fae287: Status 404 returned error can't find the container with id cd1ca81142366a0a5cde6c6b8e68aa82ca1c475544e69aa5a78973cdb2fae287
Apr 22 14:17:26.355718 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.355680 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 14:17:26.494515 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.494387 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 14:17:26.637381 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:17:26.637307 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31dc4779_192f_40e7_acf5_dcf542039bd6.slice/crio-2b66fc13f9ac0961924b75d873d062d3bbdd821f379bf0a0e78807511cf517ae WatchSource:0}: Error finding container 2b66fc13f9ac0961924b75d873d062d3bbdd821f379bf0a0e78807511cf517ae: Status 404 returned error can't find the container with id 2b66fc13f9ac0961924b75d873d062d3bbdd821f379bf0a0e78807511cf517ae
Apr 22 14:17:26.883010 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.882981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" event={"ID":"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f","Type":"ContainerStarted","Data":"cd1ca81142366a0a5cde6c6b8e68aa82ca1c475544e69aa5a78973cdb2fae287"}
Apr 22 14:17:26.884368 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.884337 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nxqcw" event={"ID":"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93","Type":"ContainerStarted","Data":"2af5127975d64178de14be2d17ec2b4951fc208f6bca36fae6ac3c5340c3c421"}
Apr 22 14:17:26.885439 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:26.885416 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31dc4779-192f-40e7-acf5-dcf542039bd6","Type":"ContainerStarted","Data":"2b66fc13f9ac0961924b75d873d062d3bbdd821f379bf0a0e78807511cf517ae"}
Apr 22 14:17:27.891307 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:27.891267 2579 generic.go:358] "Generic (PLEG): container finished" podID="c4ab8bf9-0890-4bcc-af6a-32e0e67eed93" containerID="2af5127975d64178de14be2d17ec2b4951fc208f6bca36fae6ac3c5340c3c421" exitCode=0
Apr 22 14:17:27.891823 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:27.891358 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nxqcw" event={"ID":"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93","Type":"ContainerDied","Data":"2af5127975d64178de14be2d17ec2b4951fc208f6bca36fae6ac3c5340c3c421"}
Apr 22 14:17:28.900268 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:28.900233 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" event={"ID":"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f","Type":"ContainerStarted","Data":"28b67a09c3ed8f2f9cec521bea32a42ba5fce7ad92e1d0036b7e5d0daee644c6"}
Apr 22 14:17:28.900675 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:28.900279 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" event={"ID":"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f","Type":"ContainerStarted","Data":"0a156562f7123f41116e548670178c75fa502324df257ce939c65adccef08129"}
Apr 22 14:17:28.902273 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:28.902243 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nxqcw" event={"ID":"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93","Type":"ContainerStarted","Data":"aa9bddbc628e99baec55900f4e05056650e2b26908f60ad443458e79f1425ef9"}
Apr 22 14:17:28.902397 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:28.902278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nxqcw" event={"ID":"c4ab8bf9-0890-4bcc-af6a-32e0e67eed93","Type":"ContainerStarted","Data":"08d15cbcc53a0a33946d1cf6fba603d01664d12fcf015efe7f40a16d403bca56"}
Apr 22 14:17:28.903652 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:28.903627 2579 generic.go:358] "Generic (PLEG): container finished" podID="31dc4779-192f-40e7-acf5-dcf542039bd6" containerID="f663725e4ce9151114dbd9e38ab8b5beb0ebddd36753d2a8d15ead49f6682f0d" exitCode=0
Apr 22 14:17:28.903750 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:28.903667 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31dc4779-192f-40e7-acf5-dcf542039bd6","Type":"ContainerDied","Data":"f663725e4ce9151114dbd9e38ab8b5beb0ebddd36753d2a8d15ead49f6682f0d"}
Apr 22 14:17:28.972911 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:28.972855 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nxqcw" podStartSLOduration=2.718874639 podStartE2EDuration="3.972834968s" podCreationTimestamp="2026-04-22 14:17:25 +0000 UTC" firstStartedPulling="2026-04-22 14:17:25.404691281 +0000 UTC m=+92.380656698" lastFinishedPulling="2026-04-22 14:17:26.658651606 +0000 UTC m=+93.634617027" observedRunningTime="2026-04-22 14:17:28.935998597 +0000 UTC m=+95.911964039" watchObservedRunningTime="2026-04-22 14:17:28.972834968 +0000 UTC m=+95.948800410"
Apr 22 14:17:29.541683 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.541642 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-67cfb96f85-ctd8m"]
Apr 22 14:17:29.558893 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.558865 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67cfb96f85-ctd8m"]
Apr 22 14:17:29.559051 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.558997 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.562735 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.562709 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 22 14:17:29.562880 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.562821 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-q6rh9\""
Apr 22 14:17:29.563807 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.563783 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 22 14:17:29.563942 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.563892 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 22 14:17:29.564201 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.564179 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4d19uovi4hjoc\""
Apr 22 14:17:29.564336 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.564318 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 14:17:29.586343 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.586313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2aaec84b-b5bb-45a2-9797-35532daa757c-secret-metrics-server-tls\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.586534 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.586367 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aaec84b-b5bb-45a2-9797-35532daa757c-client-ca-bundle\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.586534 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.586397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aaec84b-b5bb-45a2-9797-35532daa757c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.586534 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.586518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2flf6\" (UniqueName: \"kubernetes.io/projected/2aaec84b-b5bb-45a2-9797-35532daa757c-kube-api-access-2flf6\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.586700 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.586567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2aaec84b-b5bb-45a2-9797-35532daa757c-secret-metrics-server-client-certs\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.586700 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.586612 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2aaec84b-b5bb-45a2-9797-35532daa757c-metrics-server-audit-profiles\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.586700 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.586652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2aaec84b-b5bb-45a2-9797-35532daa757c-audit-log\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.689117 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.688092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2flf6\" (UniqueName: \"kubernetes.io/projected/2aaec84b-b5bb-45a2-9797-35532daa757c-kube-api-access-2flf6\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.689117 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.688146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2aaec84b-b5bb-45a2-9797-35532daa757c-secret-metrics-server-client-certs\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.689117 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.688185 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2aaec84b-b5bb-45a2-9797-35532daa757c-metrics-server-audit-profiles\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.689117 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.688213 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2aaec84b-b5bb-45a2-9797-35532daa757c-audit-log\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.689117 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.688270 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2aaec84b-b5bb-45a2-9797-35532daa757c-secret-metrics-server-tls\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.689117 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.688303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aaec84b-b5bb-45a2-9797-35532daa757c-client-ca-bundle\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.689117 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.688336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aaec84b-b5bb-45a2-9797-35532daa757c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.689645 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.689192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aaec84b-b5bb-45a2-9797-35532daa757c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.689702 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.689675 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2aaec84b-b5bb-45a2-9797-35532daa757c-metrics-server-audit-profiles\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.690152 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.690101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2aaec84b-b5bb-45a2-9797-35532daa757c-audit-log\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.692500 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.692413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2aaec84b-b5bb-45a2-9797-35532daa757c-secret-metrics-server-tls\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.692612 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.692513 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aaec84b-b5bb-45a2-9797-35532daa757c-client-ca-bundle\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.692868 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.692840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2aaec84b-b5bb-45a2-9797-35532daa757c-secret-metrics-server-client-certs\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.702913 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.701605 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-699d9cf6b6-q4cvw"]
Apr 22 14:17:29.710133 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.710107 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2flf6\" (UniqueName: \"kubernetes.io/projected/2aaec84b-b5bb-45a2-9797-35532daa757c-kube-api-access-2flf6\") pod \"metrics-server-67cfb96f85-ctd8m\" (UID: \"2aaec84b-b5bb-45a2-9797-35532daa757c\") " pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:29.731059 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.731031 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699d9cf6b6-q4cvw"]
Apr 22 14:17:29.731205 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.731173 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:29.742085 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.742056 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 14:17:29.789089 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.789056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-service-ca\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:29.789284 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.789102 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-www2f\" (UniqueName: \"kubernetes.io/projected/55543e02-fdb1-4183-b740-a59c692d6454-kube-api-access-www2f\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:29.789284 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.789153 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-trusted-ca-bundle\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:29.789284 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.789178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-console-config\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:29.789284 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.789202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-oauth-serving-cert\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:29.789500 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.789281 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-serving-cert\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:29.789500 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.789362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-oauth-config\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:29.871561 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.871466 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m" Apr 22 14:17:29.890550 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.890518 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-oauth-config\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.890700 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.890586 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-service-ca\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.890700 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.890622 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-www2f\" (UniqueName: \"kubernetes.io/projected/55543e02-fdb1-4183-b740-a59c692d6454-kube-api-access-www2f\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.890700 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.890668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-trusted-ca-bundle\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.890700 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.890692 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-console-config\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.890907 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.890716 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-oauth-serving-cert\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.890907 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.890755 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-serving-cert\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.891956 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.891906 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-oauth-serving-cert\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.892105 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.892083 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-console-config\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.892171 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.892137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-trusted-ca-bundle\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.892699 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.892672 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-service-ca\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.893866 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.893846 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-oauth-config\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.894928 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.894900 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-serving-cert\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.900735 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.900710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-www2f\" (UniqueName: \"kubernetes.io/projected/55543e02-fdb1-4183-b740-a59c692d6454-kube-api-access-www2f\") pod \"console-699d9cf6b6-q4cvw\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") " pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:29.910051 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.910016 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" event={"ID":"36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f","Type":"ContainerStarted","Data":"34329fd8e32bf7b9bea4c8818a21403792773c1fd1f2cc49f28e82b601224029"} Apr 22 14:17:29.942044 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:29.940097 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-9zhfn" podStartSLOduration=2.729645729 podStartE2EDuration="4.94007822s" podCreationTimestamp="2026-04-22 14:17:25 +0000 UTC" firstStartedPulling="2026-04-22 14:17:26.324988313 +0000 UTC m=+93.300953731" lastFinishedPulling="2026-04-22 14:17:28.535420789 +0000 UTC m=+95.511386222" observedRunningTime="2026-04-22 14:17:29.933686513 +0000 UTC m=+96.909651953" watchObservedRunningTime="2026-04-22 14:17:29.94007822 +0000 UTC m=+96.916043662" Apr 22 14:17:30.026131 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:30.025805 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67cfb96f85-ctd8m"] Apr 22 14:17:30.028932 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:17:30.028901 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aaec84b_b5bb_45a2_9797_35532daa757c.slice/crio-a02ce4d512e820f0ea3181ca5b3913e211b14470ee2a71814c4047a852da087a WatchSource:0}: Error finding container a02ce4d512e820f0ea3181ca5b3913e211b14470ee2a71814c4047a852da087a: Status 404 returned error can't find the container with id a02ce4d512e820f0ea3181ca5b3913e211b14470ee2a71814c4047a852da087a Apr 22 14:17:30.043264 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:30.043242 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699d9cf6b6-q4cvw" Apr 22 14:17:30.180928 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:30.180891 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699d9cf6b6-q4cvw"] Apr 22 14:17:30.547842 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:17:30.547804 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55543e02_fdb1_4183_b740_a59c692d6454.slice/crio-556a4ae011b5cbe0ff1da613c797113635dda053cddf6d293e6e1c83035b38a2 WatchSource:0}: Error finding container 556a4ae011b5cbe0ff1da613c797113635dda053cddf6d293e6e1c83035b38a2: Status 404 returned error can't find the container with id 556a4ae011b5cbe0ff1da613c797113635dda053cddf6d293e6e1c83035b38a2 Apr 22 14:17:30.914395 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:30.914303 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699d9cf6b6-q4cvw" event={"ID":"55543e02-fdb1-4183-b740-a59c692d6454","Type":"ContainerStarted","Data":"072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42"} Apr 22 14:17:30.914395 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:30.914347 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699d9cf6b6-q4cvw" event={"ID":"55543e02-fdb1-4183-b740-a59c692d6454","Type":"ContainerStarted","Data":"556a4ae011b5cbe0ff1da613c797113635dda053cddf6d293e6e1c83035b38a2"} Apr 22 14:17:30.915716 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:30.915686 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m" event={"ID":"2aaec84b-b5bb-45a2-9797-35532daa757c","Type":"ContainerStarted","Data":"a02ce4d512e820f0ea3181ca5b3913e211b14470ee2a71814c4047a852da087a"} Apr 22 14:17:30.935701 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:30.935612 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-699d9cf6b6-q4cvw" podStartSLOduration=1.935592223 podStartE2EDuration="1.935592223s" podCreationTimestamp="2026-04-22 14:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:30.933950252 +0000 UTC m=+97.909915717" watchObservedRunningTime="2026-04-22 14:17:30.935592223 +0000 UTC m=+97.911557664" Apr 22 14:17:31.922593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:31.922460 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31dc4779-192f-40e7-acf5-dcf542039bd6","Type":"ContainerStarted","Data":"0799c56a8c68bd62cb6836dfe5fe91a5eb2b42c4f35f3b934f511ac2ed2bfee4"} Apr 22 14:17:31.922593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:31.922504 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31dc4779-192f-40e7-acf5-dcf542039bd6","Type":"ContainerStarted","Data":"608d02164caf735c418a45768fab0a0f5c87fa620fc22c680bbd58a3b17e1bef"} Apr 22 14:17:31.922593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:31.922519 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31dc4779-192f-40e7-acf5-dcf542039bd6","Type":"ContainerStarted","Data":"22447770d7ae33de1f47d2b3aa8699e9daf3104863e8df95136c6ecb2f03ff12"} Apr 22 14:17:31.922593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:31.922532 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31dc4779-192f-40e7-acf5-dcf542039bd6","Type":"ContainerStarted","Data":"5d67b96a6a3c621450c91957d3a9b7b648cf145d6901afb3e9ba818c27189d0a"} Apr 22 14:17:31.922593 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:31.922549 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"31dc4779-192f-40e7-acf5-dcf542039bd6","Type":"ContainerStarted","Data":"4dc0159174ca30f729bf647b0ba13a9d3843f9a77e848ec89a814a14eda1f73b"} Apr 22 14:17:32.927722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:32.927632 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31dc4779-192f-40e7-acf5-dcf542039bd6","Type":"ContainerStarted","Data":"a82185fdf32a98c470b6324238e86621cae1a432b6e847ac161a65b880d79746"} Apr 22 14:17:32.928995 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:32.928967 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m" event={"ID":"2aaec84b-b5bb-45a2-9797-35532daa757c","Type":"ContainerStarted","Data":"5bb4b6b2c171d6ef347f9f3ed0149113e032077df8e81d51921dc65737de8a99"} Apr 22 14:17:32.962942 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:32.962882 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.121657326 podStartE2EDuration="7.962863005s" podCreationTimestamp="2026-04-22 14:17:25 +0000 UTC" firstStartedPulling="2026-04-22 14:17:26.656002058 +0000 UTC m=+93.631967476" lastFinishedPulling="2026-04-22 14:17:32.497207721 +0000 UTC m=+99.473173155" observedRunningTime="2026-04-22 14:17:32.95995511 +0000 UTC m=+99.935920551" watchObservedRunningTime="2026-04-22 14:17:32.962863005 +0000 UTC m=+99.938828447" Apr 22 14:17:32.981241 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:32.981189 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m" podStartSLOduration=1.8399920870000002 podStartE2EDuration="3.981171678s" podCreationTimestamp="2026-04-22 14:17:29 +0000 UTC" firstStartedPulling="2026-04-22 14:17:30.031474109 +0000 UTC m=+97.007439531" lastFinishedPulling="2026-04-22 14:17:32.172653689 +0000 UTC m=+99.148619122" 
observedRunningTime="2026-04-22 14:17:32.979701276 +0000 UTC m=+99.955666717" watchObservedRunningTime="2026-04-22 14:17:32.981171678 +0000 UTC m=+99.957137117" Apr 22 14:17:33.948307 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:33.948265 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-699d9cf6b6-q4cvw"] Apr 22 14:17:33.981694 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:33.981660 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-97d65848d-5xg44"] Apr 22 14:17:33.985075 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:33.985056 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:33.997676 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:33.997651 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97d65848d-5xg44"] Apr 22 14:17:34.034882 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.034849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-trusted-ca-bundle\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.034882 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.034881 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-oauth-config\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.035121 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.034916 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25t46\" 
(UniqueName: \"kubernetes.io/projected/0818bdbc-9aec-4634-8ee1-72abdded7e9b-kube-api-access-25t46\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.035294 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.035270 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-oauth-serving-cert\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.035376 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.035359 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-config\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.035445 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.035403 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-service-ca\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.035445 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.035432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-serving-cert\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.136159 ip-10-0-130-255 kubenswrapper[2579]: 
I0422 14:17:34.136118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-trusted-ca-bundle\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.136159 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.136154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-oauth-config\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.136401 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.136179 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25t46\" (UniqueName: \"kubernetes.io/projected/0818bdbc-9aec-4634-8ee1-72abdded7e9b-kube-api-access-25t46\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.136401 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.136245 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-oauth-serving-cert\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.136401 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.136277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-config\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " 
pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.136401 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.136307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-service-ca\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.136401 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.136333 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-serving-cert\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.136997 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.136971 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-oauth-serving-cert\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.137104 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.137085 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-service-ca\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.137166 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.137143 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-config\") pod \"console-97d65848d-5xg44\" (UID: 
\"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.137204 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.137150 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-trusted-ca-bundle\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.138853 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.138834 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-oauth-config\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.138908 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.138891 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-serving-cert\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.144961 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.144937 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25t46\" (UniqueName: \"kubernetes.io/projected/0818bdbc-9aec-4634-8ee1-72abdded7e9b-kube-api-access-25t46\") pod \"console-97d65848d-5xg44\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.294392 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.294295 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:17:34.434606 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.434574 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97d65848d-5xg44"] Apr 22 14:17:34.438894 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:17:34.438849 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0818bdbc_9aec_4634_8ee1_72abdded7e9b.slice/crio-e6ccd5f9ef700c58d03baae5feb769734cd5da00085a0ceba888cc5b353e437e WatchSource:0}: Error finding container e6ccd5f9ef700c58d03baae5feb769734cd5da00085a0ceba888cc5b353e437e: Status 404 returned error can't find the container with id e6ccd5f9ef700c58d03baae5feb769734cd5da00085a0ceba888cc5b353e437e Apr 22 14:17:34.936073 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.936036 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97d65848d-5xg44" event={"ID":"0818bdbc-9aec-4634-8ee1-72abdded7e9b","Type":"ContainerStarted","Data":"6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2"} Apr 22 14:17:34.936073 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.936075 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97d65848d-5xg44" event={"ID":"0818bdbc-9aec-4634-8ee1-72abdded7e9b","Type":"ContainerStarted","Data":"e6ccd5f9ef700c58d03baae5feb769734cd5da00085a0ceba888cc5b353e437e"} Apr 22 14:17:34.955910 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:34.955856 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-97d65848d-5xg44" podStartSLOduration=1.955839112 podStartE2EDuration="1.955839112s" podCreationTimestamp="2026-04-22 14:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:34.954247871 +0000 UTC m=+101.930213312" 
watchObservedRunningTime="2026-04-22 14:17:34.955839112 +0000 UTC m=+101.931804553"
Apr 22 14:17:39.160655 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.160594 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56d4fb7f84-k98cp" podUID="0252c970-3240-4c6a-ba9f-f19024134322" containerName="console" containerID="cri-o://5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896" gracePeriod=15
Apr 22 14:17:39.414497 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.414423 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56d4fb7f84-k98cp_0252c970-3240-4c6a-ba9f-f19024134322/console/0.log"
Apr 22 14:17:39.414611 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.414517 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d4fb7f84-k98cp"
Apr 22 14:17:39.486407 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.486367 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdv7b\" (UniqueName: \"kubernetes.io/projected/0252c970-3240-4c6a-ba9f-f19024134322-kube-api-access-vdv7b\") pod \"0252c970-3240-4c6a-ba9f-f19024134322\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") "
Apr 22 14:17:39.486592 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.486428 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-serving-cert\") pod \"0252c970-3240-4c6a-ba9f-f19024134322\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") "
Apr 22 14:17:39.486592 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.486487 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-console-config\") pod \"0252c970-3240-4c6a-ba9f-f19024134322\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") "
Apr 22 14:17:39.486592 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.486525 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-oauth-config\") pod \"0252c970-3240-4c6a-ba9f-f19024134322\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") "
Apr 22 14:17:39.486592 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.486542 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-service-ca\") pod \"0252c970-3240-4c6a-ba9f-f19024134322\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") "
Apr 22 14:17:39.486592 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.486560 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-oauth-serving-cert\") pod \"0252c970-3240-4c6a-ba9f-f19024134322\" (UID: \"0252c970-3240-4c6a-ba9f-f19024134322\") "
Apr 22 14:17:39.486965 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.486933 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-service-ca" (OuterVolumeSpecName: "service-ca") pod "0252c970-3240-4c6a-ba9f-f19024134322" (UID: "0252c970-3240-4c6a-ba9f-f19024134322"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:17:39.486965 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.486942 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-console-config" (OuterVolumeSpecName: "console-config") pod "0252c970-3240-4c6a-ba9f-f19024134322" (UID: "0252c970-3240-4c6a-ba9f-f19024134322"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:17:39.487145 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.487007 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0252c970-3240-4c6a-ba9f-f19024134322" (UID: "0252c970-3240-4c6a-ba9f-f19024134322"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:17:39.488843 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.488813 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0252c970-3240-4c6a-ba9f-f19024134322" (UID: "0252c970-3240-4c6a-ba9f-f19024134322"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:17:39.488949 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.488817 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0252c970-3240-4c6a-ba9f-f19024134322" (UID: "0252c970-3240-4c6a-ba9f-f19024134322"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:17:39.488949 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.488887 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0252c970-3240-4c6a-ba9f-f19024134322-kube-api-access-vdv7b" (OuterVolumeSpecName: "kube-api-access-vdv7b") pod "0252c970-3240-4c6a-ba9f-f19024134322" (UID: "0252c970-3240-4c6a-ba9f-f19024134322"). InnerVolumeSpecName "kube-api-access-vdv7b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:17:39.587878 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.587837 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:39.587878 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.587874 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-console-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:39.587878 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.587885 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0252c970-3240-4c6a-ba9f-f19024134322-console-oauth-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:39.588099 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.587894 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-service-ca\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:39.588099 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.587903 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0252c970-3240-4c6a-ba9f-f19024134322-oauth-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:39.588099 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.587912 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdv7b\" (UniqueName: \"kubernetes.io/projected/0252c970-3240-4c6a-ba9f-f19024134322-kube-api-access-vdv7b\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:39.951963 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.951936 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56d4fb7f84-k98cp_0252c970-3240-4c6a-ba9f-f19024134322/console/0.log"
Apr 22 14:17:39.952137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.951975 2579 generic.go:358] "Generic (PLEG): container finished" podID="0252c970-3240-4c6a-ba9f-f19024134322" containerID="5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896" exitCode=2
Apr 22 14:17:39.952137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.952042 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d4fb7f84-k98cp"
Apr 22 14:17:39.952137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.952064 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d4fb7f84-k98cp" event={"ID":"0252c970-3240-4c6a-ba9f-f19024134322","Type":"ContainerDied","Data":"5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896"}
Apr 22 14:17:39.952137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.952096 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d4fb7f84-k98cp" event={"ID":"0252c970-3240-4c6a-ba9f-f19024134322","Type":"ContainerDied","Data":"6b43475cd7ea1d041a2c2707d6a8fa5d5e6b38f9e1fc29d769205914d6392217"}
Apr 22 14:17:39.952137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.952111 2579 scope.go:117] "RemoveContainer" containerID="5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896"
Apr 22 14:17:39.959878 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.959861 2579 scope.go:117] "RemoveContainer" containerID="5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896"
Apr 22 14:17:39.960133 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:17:39.960114 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896\": container with ID starting with 5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896 not found: ID does not exist" containerID="5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896"
Apr 22 14:17:39.960202 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.960140 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896"} err="failed to get container status \"5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896\": rpc error: code = NotFound desc = could not find container \"5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896\": container with ID starting with 5bcdf75acb8ff8228e4064b658f857f63e0cf1f04eb40993da0fa7c2f25e9896 not found: ID does not exist"
Apr 22 14:17:39.969741 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.969711 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d4fb7f84-k98cp"]
Apr 22 14:17:39.975522 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:39.975500 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56d4fb7f84-k98cp"]
Apr 22 14:17:40.043558 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:40.043522 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:41.517177 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:41.517145 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0252c970-3240-4c6a-ba9f-f19024134322" path="/var/lib/kubelet/pods/0252c970-3240-4c6a-ba9f-f19024134322/volumes"
Apr 22 14:17:44.295680 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:44.295641 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-97d65848d-5xg44"
Apr 22 14:17:44.295680 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:44.295687 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-97d65848d-5xg44"
Apr 22 14:17:44.300214 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:44.300184 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-97d65848d-5xg44"
Apr 22 14:17:44.972316 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:44.972285 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-97d65848d-5xg44"
Apr 22 14:17:49.872137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:49.872097 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:49.872137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:49.872142 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:17:58.966767 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:58.966700 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-699d9cf6b6-q4cvw" podUID="55543e02-fdb1-4183-b740-a59c692d6454" containerName="console" containerID="cri-o://072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42" gracePeriod=15
Apr 22 14:17:59.204094 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.204065 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699d9cf6b6-q4cvw_55543e02-fdb1-4183-b740-a59c692d6454/console/0.log"
Apr 22 14:17:59.204222 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.204130 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:17:59.359039 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.358942 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-www2f\" (UniqueName: \"kubernetes.io/projected/55543e02-fdb1-4183-b740-a59c692d6454-kube-api-access-www2f\") pod \"55543e02-fdb1-4183-b740-a59c692d6454\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") "
Apr 22 14:17:59.359039 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.358996 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-oauth-serving-cert\") pod \"55543e02-fdb1-4183-b740-a59c692d6454\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") "
Apr 22 14:17:59.359039 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.359023 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-console-config\") pod \"55543e02-fdb1-4183-b740-a59c692d6454\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") "
Apr 22 14:17:59.359313 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.359062 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-oauth-config\") pod \"55543e02-fdb1-4183-b740-a59c692d6454\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") "
Apr 22 14:17:59.359313 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.359123 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-service-ca\") pod \"55543e02-fdb1-4183-b740-a59c692d6454\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") "
Apr 22 14:17:59.359313 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.359147 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-trusted-ca-bundle\") pod \"55543e02-fdb1-4183-b740-a59c692d6454\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") "
Apr 22 14:17:59.359313 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.359207 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-serving-cert\") pod \"55543e02-fdb1-4183-b740-a59c692d6454\" (UID: \"55543e02-fdb1-4183-b740-a59c692d6454\") "
Apr 22 14:17:59.359560 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.359485 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-console-config" (OuterVolumeSpecName: "console-config") pod "55543e02-fdb1-4183-b740-a59c692d6454" (UID: "55543e02-fdb1-4183-b740-a59c692d6454"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:17:59.359602 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.359554 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-service-ca" (OuterVolumeSpecName: "service-ca") pod "55543e02-fdb1-4183-b740-a59c692d6454" (UID: "55543e02-fdb1-4183-b740-a59c692d6454"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:17:59.359661 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.359585 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "55543e02-fdb1-4183-b740-a59c692d6454" (UID: "55543e02-fdb1-4183-b740-a59c692d6454"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:17:59.359854 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.359798 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "55543e02-fdb1-4183-b740-a59c692d6454" (UID: "55543e02-fdb1-4183-b740-a59c692d6454"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:17:59.361282 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.361256 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55543e02-fdb1-4183-b740-a59c692d6454-kube-api-access-www2f" (OuterVolumeSpecName: "kube-api-access-www2f") pod "55543e02-fdb1-4183-b740-a59c692d6454" (UID: "55543e02-fdb1-4183-b740-a59c692d6454"). InnerVolumeSpecName "kube-api-access-www2f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:17:59.361371 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.361255 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "55543e02-fdb1-4183-b740-a59c692d6454" (UID: "55543e02-fdb1-4183-b740-a59c692d6454"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:17:59.361371 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.361265 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "55543e02-fdb1-4183-b740-a59c692d6454" (UID: "55543e02-fdb1-4183-b740-a59c692d6454"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:17:59.460653 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.460615 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-oauth-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:59.460653 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.460649 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-service-ca\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:59.460653 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.460662 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-trusted-ca-bundle\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:59.460903 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.460675 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55543e02-fdb1-4183-b740-a59c692d6454-console-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:59.460903 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.460689 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-www2f\" (UniqueName: \"kubernetes.io/projected/55543e02-fdb1-4183-b740-a59c692d6454-kube-api-access-www2f\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:59.460903 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.460702 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-oauth-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:17:59.460903 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:17:59.460713 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55543e02-fdb1-4183-b740-a59c692d6454-console-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:18:00.012960 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.012934 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699d9cf6b6-q4cvw_55543e02-fdb1-4183-b740-a59c692d6454/console/0.log"
Apr 22 14:18:00.013387 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.012976 2579 generic.go:358] "Generic (PLEG): container finished" podID="55543e02-fdb1-4183-b740-a59c692d6454" containerID="072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42" exitCode=2
Apr 22 14:18:00.013387 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.013031 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699d9cf6b6-q4cvw" event={"ID":"55543e02-fdb1-4183-b740-a59c692d6454","Type":"ContainerDied","Data":"072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42"}
Apr 22 14:18:00.013387 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.013055 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699d9cf6b6-q4cvw"
Apr 22 14:18:00.013387 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.013066 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699d9cf6b6-q4cvw" event={"ID":"55543e02-fdb1-4183-b740-a59c692d6454","Type":"ContainerDied","Data":"556a4ae011b5cbe0ff1da613c797113635dda053cddf6d293e6e1c83035b38a2"}
Apr 22 14:18:00.013387 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.013085 2579 scope.go:117] "RemoveContainer" containerID="072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42"
Apr 22 14:18:00.020820 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.020802 2579 scope.go:117] "RemoveContainer" containerID="072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42"
Apr 22 14:18:00.021085 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:18:00.021062 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42\": container with ID starting with 072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42 not found: ID does not exist" containerID="072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42"
Apr 22 14:18:00.021181 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.021093 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42"} err="failed to get container status \"072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42\": rpc error: code = NotFound desc = could not find container \"072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42\": container with ID starting with 072c9f3333b0163a96c53d5aa53f5bd4185a11fc5c1a5dcfbab1d71aa475bf42 not found: ID does not exist"
Apr 22 14:18:00.031025 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.030973 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-699d9cf6b6-q4cvw"]
Apr 22 14:18:00.034711 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:00.034686 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-699d9cf6b6-q4cvw"]
Apr 22 14:18:01.516858 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:01.516825 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55543e02-fdb1-4183-b740-a59c692d6454" path="/var/lib/kubelet/pods/55543e02-fdb1-4183-b740-a59c692d6454/volumes"
Apr 22 14:18:09.878240 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:09.878208 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:18:09.882426 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:09.882393 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-67cfb96f85-ctd8m"
Apr 22 14:18:47.652692 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.652605 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55468d84bd-l8v6j"]
Apr 22 14:18:47.653239 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.653004 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0252c970-3240-4c6a-ba9f-f19024134322" containerName="console"
Apr 22 14:18:47.653239 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.653022 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0252c970-3240-4c6a-ba9f-f19024134322" containerName="console"
Apr 22 14:18:47.653239 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.653050 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55543e02-fdb1-4183-b740-a59c692d6454" containerName="console"
Apr 22 14:18:47.653239 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.653058 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="55543e02-fdb1-4183-b740-a59c692d6454" containerName="console"
Apr 22 14:18:47.653239 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.653141 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="55543e02-fdb1-4183-b740-a59c692d6454" containerName="console"
Apr 22 14:18:47.653239 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.653153 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0252c970-3240-4c6a-ba9f-f19024134322" containerName="console"
Apr 22 14:18:47.656218 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.656193 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.667801 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.667765 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55468d84bd-l8v6j"]
Apr 22 14:18:47.730877 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.730838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-serving-cert\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.731042 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.730895 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-trusted-ca-bundle\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.731042 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.730980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-oauth-serving-cert\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.731042 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.731029 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfgqc\" (UniqueName: \"kubernetes.io/projected/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-kube-api-access-nfgqc\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.731154 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.731103 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-oauth-config\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.731154 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.731130 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-config\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.731227 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.731156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-service-ca\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.832348 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.832309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-trusted-ca-bundle\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.832566 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.832367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-oauth-serving-cert\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.832566 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.832406 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfgqc\" (UniqueName: \"kubernetes.io/projected/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-kube-api-access-nfgqc\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.832566 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.832443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-oauth-config\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.832566 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.832527 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-config\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.832566 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.832554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-service-ca\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.832848 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.832589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-serving-cert\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.833228 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.833184 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-config\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.833336 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.833234 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-oauth-serving-cert\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.833336 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.833295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-service-ca\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.833427 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.833410 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-trusted-ca-bundle\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.834960 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.834942 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-serving-cert\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.835053 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.834994 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-oauth-config\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.842245 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.842218 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfgqc\" (UniqueName: \"kubernetes.io/projected/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-kube-api-access-nfgqc\") pod \"console-55468d84bd-l8v6j\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:47.965238 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:47.965190 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:48.085308 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:48.085275 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55468d84bd-l8v6j"]
Apr 22 14:18:48.088138 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:18:48.088103 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4647b06a_ef4b_4a2d_8fd5_33a36ca1c7c1.slice/crio-80d2cd78ce97d3b3bd5965669db8fd1115ad3c7baedfdaa02581ce8e0acbd9cf WatchSource:0}: Error finding container 80d2cd78ce97d3b3bd5965669db8fd1115ad3c7baedfdaa02581ce8e0acbd9cf: Status 404 returned error can't find the container with id 80d2cd78ce97d3b3bd5965669db8fd1115ad3c7baedfdaa02581ce8e0acbd9cf
Apr 22 14:18:48.149520 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:48.149487 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55468d84bd-l8v6j" event={"ID":"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1","Type":"ContainerStarted","Data":"5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe"}
Apr 22 14:18:48.149638 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:48.149527 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55468d84bd-l8v6j" event={"ID":"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1","Type":"ContainerStarted","Data":"80d2cd78ce97d3b3bd5965669db8fd1115ad3c7baedfdaa02581ce8e0acbd9cf"}
Apr 22 14:18:48.170002 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:48.169940 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55468d84bd-l8v6j" podStartSLOduration=1.169918332 podStartE2EDuration="1.169918332s" podCreationTimestamp="2026-04-22 14:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:18:48.168348187 +0000 UTC m=+175.144313627" watchObservedRunningTime="2026-04-22 14:18:48.169918332 +0000 UTC m=+175.145883772"
Apr 22 14:18:57.965965 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:57.965912 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:57.965965 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:57.965966 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:57.970661 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:57.970634 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:58.185993 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:58.185965 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55468d84bd-l8v6j"
Apr 22 14:18:58.232868 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:18:58.232780 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-97d65848d-5xg44"]
Apr 22 14:19:11.830571 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:11.830535 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hfm6z"]
Apr 22 14:19:11.833786 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:11.833763 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:11.836387 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:11.836364 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:19:11.846561 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:11.846536 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hfm6z"] Apr 22 14:19:11.943293 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:11.943253 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a00d0fb2-d626-4537-b0c0-59d6dc7c1b06-dbus\") pod \"global-pull-secret-syncer-hfm6z\" (UID: \"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06\") " pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:11.943495 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:11.943325 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a00d0fb2-d626-4537-b0c0-59d6dc7c1b06-original-pull-secret\") pod \"global-pull-secret-syncer-hfm6z\" (UID: \"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06\") " pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:11.943495 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:11.943399 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a00d0fb2-d626-4537-b0c0-59d6dc7c1b06-kubelet-config\") pod \"global-pull-secret-syncer-hfm6z\" (UID: \"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06\") " pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:12.044834 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:12.044788 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/a00d0fb2-d626-4537-b0c0-59d6dc7c1b06-kubelet-config\") pod \"global-pull-secret-syncer-hfm6z\" (UID: \"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06\") " pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:12.045022 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:12.044852 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a00d0fb2-d626-4537-b0c0-59d6dc7c1b06-dbus\") pod \"global-pull-secret-syncer-hfm6z\" (UID: \"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06\") " pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:12.045022 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:12.044903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a00d0fb2-d626-4537-b0c0-59d6dc7c1b06-original-pull-secret\") pod \"global-pull-secret-syncer-hfm6z\" (UID: \"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06\") " pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:12.045022 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:12.044930 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a00d0fb2-d626-4537-b0c0-59d6dc7c1b06-kubelet-config\") pod \"global-pull-secret-syncer-hfm6z\" (UID: \"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06\") " pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:12.045022 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:12.044979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a00d0fb2-d626-4537-b0c0-59d6dc7c1b06-dbus\") pod \"global-pull-secret-syncer-hfm6z\" (UID: \"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06\") " pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:12.047134 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:12.047117 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a00d0fb2-d626-4537-b0c0-59d6dc7c1b06-original-pull-secret\") pod \"global-pull-secret-syncer-hfm6z\" (UID: \"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06\") " pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:12.143517 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:12.143411 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hfm6z" Apr 22 14:19:12.267214 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:12.267181 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hfm6z"] Apr 22 14:19:12.270279 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:19:12.270250 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda00d0fb2_d626_4537_b0c0_59d6dc7c1b06.slice/crio-d9ab8baf1fe7ec57a662e089e906af4e737a36e697c9b7a7ce403c254831e4b1 WatchSource:0}: Error finding container d9ab8baf1fe7ec57a662e089e906af4e737a36e697c9b7a7ce403c254831e4b1: Status 404 returned error can't find the container with id d9ab8baf1fe7ec57a662e089e906af4e737a36e697c9b7a7ce403c254831e4b1 Apr 22 14:19:13.223624 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:13.223587 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hfm6z" event={"ID":"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06","Type":"ContainerStarted","Data":"d9ab8baf1fe7ec57a662e089e906af4e737a36e697c9b7a7ce403c254831e4b1"} Apr 22 14:19:17.236683 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:17.236643 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hfm6z" event={"ID":"a00d0fb2-d626-4537-b0c0-59d6dc7c1b06","Type":"ContainerStarted","Data":"9d2235e600784d1f62e1b00dd4a73483c193435f32dc35c52341cc7b85b4878b"} Apr 22 14:19:17.256341 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:17.256266 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hfm6z" podStartSLOduration=1.880610772 podStartE2EDuration="6.256249706s" podCreationTimestamp="2026-04-22 14:19:11 +0000 UTC" firstStartedPulling="2026-04-22 14:19:12.271968813 +0000 UTC m=+199.247934232" lastFinishedPulling="2026-04-22 14:19:16.647607749 +0000 UTC m=+203.623573166" observedRunningTime="2026-04-22 14:19:17.255661898 +0000 UTC m=+204.231627340" watchObservedRunningTime="2026-04-22 14:19:17.256249706 +0000 UTC m=+204.232215136" Apr 22 14:19:23.251796 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.251734 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-97d65848d-5xg44" podUID="0818bdbc-9aec-4634-8ee1-72abdded7e9b" containerName="console" containerID="cri-o://6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2" gracePeriod=15 Apr 22 14:19:23.491620 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.491597 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-97d65848d-5xg44_0818bdbc-9aec-4634-8ee1-72abdded7e9b/console/0.log" Apr 22 14:19:23.491756 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.491657 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:19:23.541756 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.541670 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-oauth-serving-cert\") pod \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " Apr 22 14:19:23.541756 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.541702 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-service-ca\") pod \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " Apr 22 14:19:23.541756 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.541734 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-trusted-ca-bundle\") pod \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " Apr 22 14:19:23.541980 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.541785 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-oauth-config\") pod \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " Apr 22 14:19:23.541980 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.541802 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25t46\" (UniqueName: \"kubernetes.io/projected/0818bdbc-9aec-4634-8ee1-72abdded7e9b-kube-api-access-25t46\") pod \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " Apr 22 14:19:23.542081 
ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542035 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-config\") pod \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " Apr 22 14:19:23.542139 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542104 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-serving-cert\") pod \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\" (UID: \"0818bdbc-9aec-4634-8ee1-72abdded7e9b\") " Apr 22 14:19:23.542139 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542102 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0818bdbc-9aec-4634-8ee1-72abdded7e9b" (UID: "0818bdbc-9aec-4634-8ee1-72abdded7e9b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:23.542139 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542130 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0818bdbc-9aec-4634-8ee1-72abdded7e9b" (UID: "0818bdbc-9aec-4634-8ee1-72abdded7e9b"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:23.542364 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542341 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-service-ca" (OuterVolumeSpecName: "service-ca") pod "0818bdbc-9aec-4634-8ee1-72abdded7e9b" (UID: "0818bdbc-9aec-4634-8ee1-72abdded7e9b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:23.542445 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542423 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-config" (OuterVolumeSpecName: "console-config") pod "0818bdbc-9aec-4634-8ee1-72abdded7e9b" (UID: "0818bdbc-9aec-4634-8ee1-72abdded7e9b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:23.542563 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542548 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:19:23.542618 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542570 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-oauth-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:19:23.542618 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542586 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-service-ca\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:19:23.542618 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.542601 2579 
reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0818bdbc-9aec-4634-8ee1-72abdded7e9b-trusted-ca-bundle\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:19:23.544216 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.544200 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0818bdbc-9aec-4634-8ee1-72abdded7e9b" (UID: "0818bdbc-9aec-4634-8ee1-72abdded7e9b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:23.544497 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.544468 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0818bdbc-9aec-4634-8ee1-72abdded7e9b" (UID: "0818bdbc-9aec-4634-8ee1-72abdded7e9b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:23.544599 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.544513 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0818bdbc-9aec-4634-8ee1-72abdded7e9b-kube-api-access-25t46" (OuterVolumeSpecName: "kube-api-access-25t46") pod "0818bdbc-9aec-4634-8ee1-72abdded7e9b" (UID: "0818bdbc-9aec-4634-8ee1-72abdded7e9b"). InnerVolumeSpecName "kube-api-access-25t46". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:19:23.643257 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.643203 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-oauth-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:19:23.643257 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.643248 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-25t46\" (UniqueName: \"kubernetes.io/projected/0818bdbc-9aec-4634-8ee1-72abdded7e9b-kube-api-access-25t46\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:19:23.643257 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:23.643259 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0818bdbc-9aec-4634-8ee1-72abdded7e9b-console-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:19:24.257013 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.256985 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-97d65848d-5xg44_0818bdbc-9aec-4634-8ee1-72abdded7e9b/console/0.log" Apr 22 14:19:24.257403 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.257025 2579 generic.go:358] "Generic (PLEG): container finished" podID="0818bdbc-9aec-4634-8ee1-72abdded7e9b" containerID="6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2" exitCode=2 Apr 22 14:19:24.257403 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.257119 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-97d65848d-5xg44" Apr 22 14:19:24.257403 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.257118 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97d65848d-5xg44" event={"ID":"0818bdbc-9aec-4634-8ee1-72abdded7e9b","Type":"ContainerDied","Data":"6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2"} Apr 22 14:19:24.257403 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.257159 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97d65848d-5xg44" event={"ID":"0818bdbc-9aec-4634-8ee1-72abdded7e9b","Type":"ContainerDied","Data":"e6ccd5f9ef700c58d03baae5feb769734cd5da00085a0ceba888cc5b353e437e"} Apr 22 14:19:24.257403 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.257175 2579 scope.go:117] "RemoveContainer" containerID="6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2" Apr 22 14:19:24.265252 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.265236 2579 scope.go:117] "RemoveContainer" containerID="6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2" Apr 22 14:19:24.265512 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:19:24.265486 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2\": container with ID starting with 6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2 not found: ID does not exist" containerID="6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2" Apr 22 14:19:24.265583 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.265524 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2"} err="failed to get container status \"6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2\": rpc error: code = 
NotFound desc = could not find container \"6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2\": container with ID starting with 6d058b8000be5a8aa092d94662e4aee165e527fe48d8f6f79165c85f525087b2 not found: ID does not exist" Apr 22 14:19:24.278445 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.278421 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-97d65848d-5xg44"] Apr 22 14:19:24.282618 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:24.282597 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-97d65848d-5xg44"] Apr 22 14:19:25.516606 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:19:25.516573 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0818bdbc-9aec-4634-8ee1-72abdded7e9b" path="/var/lib/kubelet/pods/0818bdbc-9aec-4634-8ee1-72abdded7e9b/volumes" Apr 22 14:20:53.396787 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:20:53.396753 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:20:53.398376 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:20:53.398348 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:20:53.403999 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:20:53.403976 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 14:21:17.045534 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.045498 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv"] Apr 22 14:21:17.045957 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.045781 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0818bdbc-9aec-4634-8ee1-72abdded7e9b" containerName="console" Apr 22 14:21:17.045957 
ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.045792 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0818bdbc-9aec-4634-8ee1-72abdded7e9b" containerName="console" Apr 22 14:21:17.045957 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.045840 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0818bdbc-9aec-4634-8ee1-72abdded7e9b" containerName="console" Apr 22 14:21:17.048623 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.048607 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 14:21:17.053178 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.053152 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 14:21:17.053178 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.053173 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-pwqj6\"" Apr 22 14:21:17.054334 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.054316 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 14:21:17.071144 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.071110 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv"] Apr 22 14:21:17.151745 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.151703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 14:21:17.151935 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.151767 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 14:21:17.151935 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.151834 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq2df\" (UniqueName: \"kubernetes.io/projected/193b6b49-d2d7-4dfc-baaf-611e2d00886a-kube-api-access-cq2df\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 14:21:17.252707 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.252663 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 14:21:17.252832 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.252715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq2df\" (UniqueName: \"kubernetes.io/projected/193b6b49-d2d7-4dfc-baaf-611e2d00886a-kube-api-access-cq2df\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 14:21:17.252832 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.252779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 14:21:17.253069 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.253046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 14:21:17.253143 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.253069 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 14:21:17.261741 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.261712 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq2df\" (UniqueName: \"kubernetes.io/projected/193b6b49-d2d7-4dfc-baaf-611e2d00886a-kube-api-access-cq2df\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" Apr 22 
14:21:17.358084 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.357999 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv"
Apr 22 14:21:17.476259 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.476232 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv"]
Apr 22 14:21:17.478813 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:21:17.478785 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod193b6b49_d2d7_4dfc_baaf_611e2d00886a.slice/crio-b6af502c7cd2d7c309f2d5398f8bf4345b3fb51ae35d763a76a9bd9a6f32241e WatchSource:0}: Error finding container b6af502c7cd2d7c309f2d5398f8bf4345b3fb51ae35d763a76a9bd9a6f32241e: Status 404 returned error can't find the container with id b6af502c7cd2d7c309f2d5398f8bf4345b3fb51ae35d763a76a9bd9a6f32241e
Apr 22 14:21:17.480559 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.480543 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:21:17.567038 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:17.566998 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" event={"ID":"193b6b49-d2d7-4dfc-baaf-611e2d00886a","Type":"ContainerStarted","Data":"b6af502c7cd2d7c309f2d5398f8bf4345b3fb51ae35d763a76a9bd9a6f32241e"}
Apr 22 14:21:23.587246 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:23.587210 2579 generic.go:358] "Generic (PLEG): container finished" podID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerID="5197dc31efb3ebdc107fe431161e20e114eafd2c86c4029adf11980254154ccf" exitCode=0
Apr 22 14:21:23.587640 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:23.587293 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" event={"ID":"193b6b49-d2d7-4dfc-baaf-611e2d00886a","Type":"ContainerDied","Data":"5197dc31efb3ebdc107fe431161e20e114eafd2c86c4029adf11980254154ccf"}
Apr 22 14:21:26.596920 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:26.596884 2579 generic.go:358] "Generic (PLEG): container finished" podID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerID="b26fb5e82151989558500f6e97a4f8431a14f40ec65760239fe13e8368bcbe9c" exitCode=0
Apr 22 14:21:26.597350 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:26.596949 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" event={"ID":"193b6b49-d2d7-4dfc-baaf-611e2d00886a","Type":"ContainerDied","Data":"b26fb5e82151989558500f6e97a4f8431a14f40ec65760239fe13e8368bcbe9c"}
Apr 22 14:21:33.618067 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:33.618022 2579 generic.go:358] "Generic (PLEG): container finished" podID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerID="72b9e4a9c99ce6138f96ef3e227d32b2f06d72ab34fbd93c529fc0df4d4310f2" exitCode=0
Apr 22 14:21:33.618468 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:33.618109 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" event={"ID":"193b6b49-d2d7-4dfc-baaf-611e2d00886a","Type":"ContainerDied","Data":"72b9e4a9c99ce6138f96ef3e227d32b2f06d72ab34fbd93c529fc0df4d4310f2"}
Apr 22 14:21:34.740089 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.740063 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv"
Apr 22 14:21:34.797584 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.797544 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq2df\" (UniqueName: \"kubernetes.io/projected/193b6b49-d2d7-4dfc-baaf-611e2d00886a-kube-api-access-cq2df\") pod \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") "
Apr 22 14:21:34.797753 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.797611 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-bundle\") pod \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") "
Apr 22 14:21:34.797753 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.797651 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-util\") pod \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\" (UID: \"193b6b49-d2d7-4dfc-baaf-611e2d00886a\") "
Apr 22 14:21:34.798241 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.798218 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-bundle" (OuterVolumeSpecName: "bundle") pod "193b6b49-d2d7-4dfc-baaf-611e2d00886a" (UID: "193b6b49-d2d7-4dfc-baaf-611e2d00886a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:21:34.799785 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.799763 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193b6b49-d2d7-4dfc-baaf-611e2d00886a-kube-api-access-cq2df" (OuterVolumeSpecName: "kube-api-access-cq2df") pod "193b6b49-d2d7-4dfc-baaf-611e2d00886a" (UID: "193b6b49-d2d7-4dfc-baaf-611e2d00886a"). InnerVolumeSpecName "kube-api-access-cq2df". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:21:34.802414 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.802384 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-util" (OuterVolumeSpecName: "util") pod "193b6b49-d2d7-4dfc-baaf-611e2d00886a" (UID: "193b6b49-d2d7-4dfc-baaf-611e2d00886a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:21:34.899307 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.899201 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cq2df\" (UniqueName: \"kubernetes.io/projected/193b6b49-d2d7-4dfc-baaf-611e2d00886a-kube-api-access-cq2df\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:21:34.899307 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.899244 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-bundle\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:21:34.899307 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:34.899254 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/193b6b49-d2d7-4dfc-baaf-611e2d00886a-util\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:21:35.625280 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:35.625235 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv" event={"ID":"193b6b49-d2d7-4dfc-baaf-611e2d00886a","Type":"ContainerDied","Data":"b6af502c7cd2d7c309f2d5398f8bf4345b3fb51ae35d763a76a9bd9a6f32241e"}
Apr 22 14:21:35.625280 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:35.625266 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxf5tv"
Apr 22 14:21:35.625280 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:35.625277 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6af502c7cd2d7c309f2d5398f8bf4345b3fb51ae35d763a76a9bd9a6f32241e"
Apr 22 14:21:39.286143 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.286059 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"]
Apr 22 14:21:39.286527 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.286350 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerName="util"
Apr 22 14:21:39.286527 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.286363 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerName="util"
Apr 22 14:21:39.286527 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.286377 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerName="pull"
Apr 22 14:21:39.286527 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.286383 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerName="pull"
Apr 22 14:21:39.286527 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.286395 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerName="extract"
Apr 22 14:21:39.286527 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.286402 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerName="extract"
Apr 22 14:21:39.286527 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.286468 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="193b6b49-d2d7-4dfc-baaf-611e2d00886a" containerName="extract"
Apr 22 14:21:39.288203 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.288183 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"
Apr 22 14:21:39.291110 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.291084 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 14:21:39.291219 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.291137 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 14:21:39.291219 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.291157 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-pktqp\""
Apr 22 14:21:39.291348 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.291331 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 14:21:39.303312 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.303290 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"]
Apr 22 14:21:39.332561 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.332531 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrr5\" (UniqueName: \"kubernetes.io/projected/0dd05019-3d28-4dbb-acd7-30e2d734b4ce-kube-api-access-8wrr5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96z9l\" (UID: \"0dd05019-3d28-4dbb-acd7-30e2d734b4ce\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"
Apr 22 14:21:39.332561 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.332566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0dd05019-3d28-4dbb-acd7-30e2d734b4ce-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96z9l\" (UID: \"0dd05019-3d28-4dbb-acd7-30e2d734b4ce\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"
Apr 22 14:21:39.433912 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.433876 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wrr5\" (UniqueName: \"kubernetes.io/projected/0dd05019-3d28-4dbb-acd7-30e2d734b4ce-kube-api-access-8wrr5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96z9l\" (UID: \"0dd05019-3d28-4dbb-acd7-30e2d734b4ce\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"
Apr 22 14:21:39.433912 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.433914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0dd05019-3d28-4dbb-acd7-30e2d734b4ce-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96z9l\" (UID: \"0dd05019-3d28-4dbb-acd7-30e2d734b4ce\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"
Apr 22 14:21:39.436140 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.436118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0dd05019-3d28-4dbb-acd7-30e2d734b4ce-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96z9l\" (UID: \"0dd05019-3d28-4dbb-acd7-30e2d734b4ce\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"
Apr 22 14:21:39.446118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.446093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wrr5\" (UniqueName: \"kubernetes.io/projected/0dd05019-3d28-4dbb-acd7-30e2d734b4ce-kube-api-access-8wrr5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96z9l\" (UID: \"0dd05019-3d28-4dbb-acd7-30e2d734b4ce\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"
Apr 22 14:21:39.598303 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.598214 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"
Apr 22 14:21:39.727207 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:39.727150 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"]
Apr 22 14:21:39.730208 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:21:39.730181 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dd05019_3d28_4dbb_acd7_30e2d734b4ce.slice/crio-bd1dd2c59c1cd6e31c327b698fe8c88d79e8dff65f06dea4b3c89518e6f27d00 WatchSource:0}: Error finding container bd1dd2c59c1cd6e31c327b698fe8c88d79e8dff65f06dea4b3c89518e6f27d00: Status 404 returned error can't find the container with id bd1dd2c59c1cd6e31c327b698fe8c88d79e8dff65f06dea4b3c89518e6f27d00
Apr 22 14:21:40.644946 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:40.644910 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l" event={"ID":"0dd05019-3d28-4dbb-acd7-30e2d734b4ce","Type":"ContainerStarted","Data":"bd1dd2c59c1cd6e31c327b698fe8c88d79e8dff65f06dea4b3c89518e6f27d00"}
Apr 22 14:21:43.656150 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:43.656067 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l" event={"ID":"0dd05019-3d28-4dbb-acd7-30e2d734b4ce","Type":"ContainerStarted","Data":"73c2608e5579597bce7660246b576373177ffc0f3d2f4a3e4240c51c23285e1c"}
Apr 22 14:21:43.656573 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:43.656307 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l"
Apr 22 14:21:43.677195 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:43.677136 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l" podStartSLOduration=1.003479143 podStartE2EDuration="4.677120064s" podCreationTimestamp="2026-04-22 14:21:39 +0000 UTC" firstStartedPulling="2026-04-22 14:21:39.732446983 +0000 UTC m=+346.708412401" lastFinishedPulling="2026-04-22 14:21:43.406087904 +0000 UTC m=+350.382053322" observedRunningTime="2026-04-22 14:21:43.675073822 +0000 UTC m=+350.651039262" watchObservedRunningTime="2026-04-22 14:21:43.677120064 +0000 UTC m=+350.653085501"
Apr 22 14:21:43.977874 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:43.977840 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2qqgq"]
Apr 22 14:21:43.979815 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:43.979799 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:43.983818 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:43.983793 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-fsd6x\""
Apr 22 14:21:43.983982 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:43.983835 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 14:21:43.984401 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:43.984386 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 14:21:43.995764 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:43.995716 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2qqgq"]
Apr 22 14:21:44.074909 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.074826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxhv\" (UniqueName: \"kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-kube-api-access-wsxhv\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:44.075233 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.075213 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ac38b264-8c29-46b5-995a-922577acc638-cabundle0\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:44.075411 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.075382 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:44.176833 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.176802 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxhv\" (UniqueName: \"kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-kube-api-access-wsxhv\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:44.176833 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.176846 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ac38b264-8c29-46b5-995a-922577acc638-cabundle0\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:44.177096 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.176869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:44.177096 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.176960 2579 secret.go:281] references non-existent secret key: ca.crt
Apr 22 14:21:44.177096 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.176971 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 14:21:44.177096 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.176979 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2qqgq: references non-existent secret key: ca.crt
Apr 22 14:21:44.177096 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.177033 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates podName:ac38b264-8c29-46b5-995a-922577acc638 nodeName:}" failed. No retries permitted until 2026-04-22 14:21:44.677014134 +0000 UTC m=+351.652979555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates") pod "keda-operator-ffbb595cb-2qqgq" (UID: "ac38b264-8c29-46b5-995a-922577acc638") : references non-existent secret key: ca.crt
Apr 22 14:21:44.177433 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.177410 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ac38b264-8c29-46b5-995a-922577acc638-cabundle0\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:44.187524 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.187497 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxhv\" (UniqueName: \"kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-kube-api-access-wsxhv\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:44.378182 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.378100 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"]
Apr 22 14:21:44.380234 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.380213 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:44.383818 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.383791 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 22 14:21:44.417974 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.417943 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"]
Apr 22 14:21:44.479064 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.479021 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:44.479244 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.479173 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1d1b54a3-25b0-4229-b889-ee27113621b2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:44.479244 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.479207 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7kbb\" (UniqueName: \"kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-kube-api-access-f7kbb\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:44.580407 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.580367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:44.580618 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.580498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1d1b54a3-25b0-4229-b889-ee27113621b2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:44.580618 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.580522 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7kbb\" (UniqueName: \"kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-kube-api-access-f7kbb\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:44.580618 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.580529 2579 secret.go:281] references non-existent secret key: tls.crt
Apr 22 14:21:44.580618 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.580550 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 14:21:44.580618 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.580572 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx: references non-existent secret key: tls.crt
Apr 22 14:21:44.580846 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.580627 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates podName:1d1b54a3-25b0-4229-b889-ee27113621b2 nodeName:}" failed. No retries permitted until 2026-04-22 14:21:45.08060891 +0000 UTC m=+352.056574337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates") pod "keda-metrics-apiserver-7c9f485588-b9slx" (UID: "1d1b54a3-25b0-4229-b889-ee27113621b2") : references non-existent secret key: tls.crt
Apr 22 14:21:44.580978 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.580956 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1d1b54a3-25b0-4229-b889-ee27113621b2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:44.590288 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.590261 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7kbb\" (UniqueName: \"kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-kube-api-access-f7kbb\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:44.637735 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.637654 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6522t"]
Apr 22 14:21:44.639647 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.639631 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6522t"
Apr 22 14:21:44.642923 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.642899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 14:21:44.656986 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.656958 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6522t"]
Apr 22 14:21:44.681636 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.681598 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:44.681793 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.681750 2579 secret.go:281] references non-existent secret key: ca.crt
Apr 22 14:21:44.681793 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.681764 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 14:21:44.681793 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.681773 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2qqgq: references non-existent secret key: ca.crt
Apr 22 14:21:44.681892 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.681826 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates podName:ac38b264-8c29-46b5-995a-922577acc638 nodeName:}" failed. No retries permitted until 2026-04-22 14:21:45.681810758 +0000 UTC m=+352.657776190 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates") pod "keda-operator-ffbb595cb-2qqgq" (UID: "ac38b264-8c29-46b5-995a-922577acc638") : references non-existent secret key: ca.crt
Apr 22 14:21:44.782394 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.782353 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcc45\" (UniqueName: \"kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-kube-api-access-qcc45\") pod \"keda-admission-cf49989db-6522t\" (UID: \"60782240-fb2a-4de3-9153-07e4f076ddbf\") " pod="openshift-keda/keda-admission-cf49989db-6522t"
Apr 22 14:21:44.782718 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.782688 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-certificates\") pod \"keda-admission-cf49989db-6522t\" (UID: \"60782240-fb2a-4de3-9153-07e4f076ddbf\") " pod="openshift-keda/keda-admission-cf49989db-6522t"
Apr 22 14:21:44.884096 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.884050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcc45\" (UniqueName: \"kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-kube-api-access-qcc45\") pod \"keda-admission-cf49989db-6522t\" (UID: \"60782240-fb2a-4de3-9153-07e4f076ddbf\") " pod="openshift-keda/keda-admission-cf49989db-6522t"
Apr 22 14:21:44.884365 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.884130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-certificates\") pod \"keda-admission-cf49989db-6522t\" (UID: \"60782240-fb2a-4de3-9153-07e4f076ddbf\") " pod="openshift-keda/keda-admission-cf49989db-6522t"
Apr 22 14:21:44.884365 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.884246 2579 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 22 14:21:44.884365 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.884264 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-6522t: secret "keda-admission-webhooks-certs" not found
Apr 22 14:21:44.884365 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:44.884317 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-certificates podName:60782240-fb2a-4de3-9153-07e4f076ddbf nodeName:}" failed. No retries permitted until 2026-04-22 14:21:45.384300639 +0000 UTC m=+352.360266058 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-certificates") pod "keda-admission-cf49989db-6522t" (UID: "60782240-fb2a-4de3-9153-07e4f076ddbf") : secret "keda-admission-webhooks-certs" not found
Apr 22 14:21:44.893999 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:44.893933 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcc45\" (UniqueName: \"kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-kube-api-access-qcc45\") pod \"keda-admission-cf49989db-6522t\" (UID: \"60782240-fb2a-4de3-9153-07e4f076ddbf\") " pod="openshift-keda/keda-admission-cf49989db-6522t"
Apr 22 14:21:45.085915 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:45.085881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:45.086073 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.086032 2579 secret.go:281] references non-existent secret key: tls.crt
Apr 22 14:21:45.086073 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.086052 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 14:21:45.086073 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.086072 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx: references non-existent secret key: tls.crt
Apr 22 14:21:45.086205 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.086122 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates podName:1d1b54a3-25b0-4229-b889-ee27113621b2 nodeName:}" failed. No retries permitted until 2026-04-22 14:21:46.086107923 +0000 UTC m=+353.062073342 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates") pod "keda-metrics-apiserver-7c9f485588-b9slx" (UID: "1d1b54a3-25b0-4229-b889-ee27113621b2") : references non-existent secret key: tls.crt
Apr 22 14:21:45.388493 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:45.388439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-certificates\") pod \"keda-admission-cf49989db-6522t\" (UID: \"60782240-fb2a-4de3-9153-07e4f076ddbf\") " pod="openshift-keda/keda-admission-cf49989db-6522t"
Apr 22 14:21:45.388668 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.388582 2579 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 22 14:21:45.388668 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.388603 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-6522t: secret "keda-admission-webhooks-certs" not found
Apr 22 14:21:45.388668 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.388656 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-certificates podName:60782240-fb2a-4de3-9153-07e4f076ddbf nodeName:}" failed. No retries permitted until 2026-04-22 14:21:46.38863986 +0000 UTC m=+353.364605303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-certificates") pod "keda-admission-cf49989db-6522t" (UID: "60782240-fb2a-4de3-9153-07e4f076ddbf") : secret "keda-admission-webhooks-certs" not found
Apr 22 14:21:45.691987 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:45.691947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq"
Apr 22 14:21:45.692441 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.692119 2579 secret.go:281] references non-existent secret key: ca.crt
Apr 22 14:21:45.692441 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.692147 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 14:21:45.692441 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.692172 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2qqgq: references non-existent secret key: ca.crt
Apr 22 14:21:45.692441 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:45.692238 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates podName:ac38b264-8c29-46b5-995a-922577acc638 nodeName:}" failed. No retries permitted until 2026-04-22 14:21:47.692219573 +0000 UTC m=+354.668185000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates") pod "keda-operator-ffbb595cb-2qqgq" (UID: "ac38b264-8c29-46b5-995a-922577acc638") : references non-existent secret key: ca.crt
Apr 22 14:21:46.096591 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:46.096499 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"
Apr 22 14:21:46.096754 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:46.096609 2579 secret.go:281] references non-existent secret key: tls.crt
Apr 22 14:21:46.096754 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:46.096622 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 14:21:46.096754 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:46.096638 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx: references non-existent secret key: tls.crt
Apr 22 14:21:46.096754 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:21:46.096694 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates podName:1d1b54a3-25b0-4229-b889-ee27113621b2 nodeName:}" failed. No retries permitted until 2026-04-22 14:21:48.096682453 +0000 UTC m=+355.072647870 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates") pod "keda-metrics-apiserver-7c9f485588-b9slx" (UID: "1d1b54a3-25b0-4229-b889-ee27113621b2") : references non-existent secret key: tls.crt Apr 22 14:21:46.399160 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:46.399066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-certificates\") pod \"keda-admission-cf49989db-6522t\" (UID: \"60782240-fb2a-4de3-9153-07e4f076ddbf\") " pod="openshift-keda/keda-admission-cf49989db-6522t" Apr 22 14:21:46.401505 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:46.401474 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60782240-fb2a-4de3-9153-07e4f076ddbf-certificates\") pod \"keda-admission-cf49989db-6522t\" (UID: \"60782240-fb2a-4de3-9153-07e4f076ddbf\") " pod="openshift-keda/keda-admission-cf49989db-6522t" Apr 22 14:21:46.454191 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:46.454149 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6522t" Apr 22 14:21:46.578428 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:46.578391 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6522t"] Apr 22 14:21:46.582157 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:21:46.582130 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60782240_fb2a_4de3_9153_07e4f076ddbf.slice/crio-04ba5b9bf5875f4fd39552fc0663c9cbedcf0038b6ea7fd4d773e8718946fea5 WatchSource:0}: Error finding container 04ba5b9bf5875f4fd39552fc0663c9cbedcf0038b6ea7fd4d773e8718946fea5: Status 404 returned error can't find the container with id 04ba5b9bf5875f4fd39552fc0663c9cbedcf0038b6ea7fd4d773e8718946fea5 Apr 22 14:21:46.667010 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:46.666927 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6522t" event={"ID":"60782240-fb2a-4de3-9153-07e4f076ddbf","Type":"ContainerStarted","Data":"04ba5b9bf5875f4fd39552fc0663c9cbedcf0038b6ea7fd4d773e8718946fea5"} Apr 22 14:21:47.711788 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:47.711746 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq" Apr 22 14:21:47.714542 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:47.714510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ac38b264-8c29-46b5-995a-922577acc638-certificates\") pod \"keda-operator-ffbb595cb-2qqgq\" (UID: \"ac38b264-8c29-46b5-995a-922577acc638\") " pod="openshift-keda/keda-operator-ffbb595cb-2qqgq" Apr 22 
14:21:47.890259 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:47.890196 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2qqgq" Apr 22 14:21:48.015238 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.015212 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2qqgq"] Apr 22 14:21:48.017624 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:21:48.017596 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac38b264_8c29_46b5_995a_922577acc638.slice/crio-c24cfa5dd0641b0180aab30ab21d477b3e72a8de3badc0c2ebcea9ded79020d7 WatchSource:0}: Error finding container c24cfa5dd0641b0180aab30ab21d477b3e72a8de3badc0c2ebcea9ded79020d7: Status 404 returned error can't find the container with id c24cfa5dd0641b0180aab30ab21d477b3e72a8de3badc0c2ebcea9ded79020d7 Apr 22 14:21:48.115290 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.115251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx" Apr 22 14:21:48.117778 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.117750 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1d1b54a3-25b0-4229-b889-ee27113621b2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b9slx\" (UID: \"1d1b54a3-25b0-4229-b889-ee27113621b2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx" Apr 22 14:21:48.290883 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.290792 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx" Apr 22 14:21:48.412722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.412684 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx"] Apr 22 14:21:48.415057 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:21:48.415028 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d1b54a3_25b0_4229_b889_ee27113621b2.slice/crio-29daa360c7d0408fde3417c93a6130788708d6a33ed74ca8a3845ca8dcb1341f WatchSource:0}: Error finding container 29daa360c7d0408fde3417c93a6130788708d6a33ed74ca8a3845ca8dcb1341f: Status 404 returned error can't find the container with id 29daa360c7d0408fde3417c93a6130788708d6a33ed74ca8a3845ca8dcb1341f Apr 22 14:21:48.674869 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.674769 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx" event={"ID":"1d1b54a3-25b0-4229-b889-ee27113621b2","Type":"ContainerStarted","Data":"29daa360c7d0408fde3417c93a6130788708d6a33ed74ca8a3845ca8dcb1341f"} Apr 22 14:21:48.676039 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.676009 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6522t" event={"ID":"60782240-fb2a-4de3-9153-07e4f076ddbf","Type":"ContainerStarted","Data":"5bd4e4a6cc45d47b4f32c255238d493e4056fdcc9563fe92495b0619c92eb78a"} Apr 22 14:21:48.676182 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.676070 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6522t" Apr 22 14:21:48.677037 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.677017 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2qqgq" 
event={"ID":"ac38b264-8c29-46b5-995a-922577acc638","Type":"ContainerStarted","Data":"c24cfa5dd0641b0180aab30ab21d477b3e72a8de3badc0c2ebcea9ded79020d7"} Apr 22 14:21:48.695233 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:48.695182 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6522t" podStartSLOduration=3.458787238 podStartE2EDuration="4.695165277s" podCreationTimestamp="2026-04-22 14:21:44 +0000 UTC" firstStartedPulling="2026-04-22 14:21:46.583351484 +0000 UTC m=+353.559316902" lastFinishedPulling="2026-04-22 14:21:47.819729516 +0000 UTC m=+354.795694941" observedRunningTime="2026-04-22 14:21:48.692837238 +0000 UTC m=+355.668802678" watchObservedRunningTime="2026-04-22 14:21:48.695165277 +0000 UTC m=+355.671130708" Apr 22 14:21:52.694222 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:52.694180 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx" event={"ID":"1d1b54a3-25b0-4229-b889-ee27113621b2","Type":"ContainerStarted","Data":"c20d80aaefe7dc1dae12bac8cc7682d17f9060f525a6298a994fc47a11c77452"} Apr 22 14:21:52.694709 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:52.694256 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx" Apr 22 14:21:52.695430 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:52.695406 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2qqgq" event={"ID":"ac38b264-8c29-46b5-995a-922577acc638","Type":"ContainerStarted","Data":"a1eb964d3b4f5fad1ccf9c9f2290792ba7514a2d77c93291032842bed27b39e3"} Apr 22 14:21:52.695574 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:52.695559 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-2qqgq" Apr 22 14:21:52.715934 ip-10-0-130-255 kubenswrapper[2579]: I0422 
14:21:52.715874 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx" podStartSLOduration=5.198454934 podStartE2EDuration="8.715859468s" podCreationTimestamp="2026-04-22 14:21:44 +0000 UTC" firstStartedPulling="2026-04-22 14:21:48.4163561 +0000 UTC m=+355.392321518" lastFinishedPulling="2026-04-22 14:21:51.933760634 +0000 UTC m=+358.909726052" observedRunningTime="2026-04-22 14:21:52.713514859 +0000 UTC m=+359.689480312" watchObservedRunningTime="2026-04-22 14:21:52.715859468 +0000 UTC m=+359.691824907" Apr 22 14:21:52.738850 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:21:52.738798 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-2qqgq" podStartSLOduration=5.812748611 podStartE2EDuration="9.738781754s" podCreationTimestamp="2026-04-22 14:21:43 +0000 UTC" firstStartedPulling="2026-04-22 14:21:48.019532282 +0000 UTC m=+354.995497699" lastFinishedPulling="2026-04-22 14:21:51.945565413 +0000 UTC m=+358.921530842" observedRunningTime="2026-04-22 14:21:52.736606581 +0000 UTC m=+359.712572021" watchObservedRunningTime="2026-04-22 14:21:52.738781754 +0000 UTC m=+359.714747194" Apr 22 14:22:03.702945 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:03.702912 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b9slx" Apr 22 14:22:04.662717 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:04.662688 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96z9l" Apr 22 14:22:09.684918 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:09.684890 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6522t" Apr 22 14:22:13.700641 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:13.700612 2579 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-2qqgq" Apr 22 14:22:42.726664 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.726627 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-655c8696c9-49h4h"] Apr 22 14:22:42.729779 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.729761 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.759978 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.759942 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-655c8696c9-49h4h"] Apr 22 14:22:42.863561 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.863528 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-console-config\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.863754 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.863571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr2x6\" (UniqueName: \"kubernetes.io/projected/3d898a11-5438-403f-8954-683e2aa1a385-kube-api-access-jr2x6\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.863754 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.863629 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-service-ca\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.863754 
ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.863665 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d898a11-5438-403f-8954-683e2aa1a385-console-oauth-config\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.863754 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.863697 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-trusted-ca-bundle\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.863754 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.863747 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d898a11-5438-403f-8954-683e2aa1a385-console-serving-cert\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.863957 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.863805 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-oauth-serving-cert\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.964797 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.964765 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-trusted-ca-bundle\") 
pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.964963 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.964819 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d898a11-5438-403f-8954-683e2aa1a385-console-serving-cert\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.964963 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.964861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-oauth-serving-cert\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.964963 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.964907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-console-config\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.964963 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.964941 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr2x6\" (UniqueName: \"kubernetes.io/projected/3d898a11-5438-403f-8954-683e2aa1a385-kube-api-access-jr2x6\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.965155 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.964984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-service-ca\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.965155 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.965023 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d898a11-5438-403f-8954-683e2aa1a385-console-oauth-config\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.965722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.965693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-oauth-serving-cert\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.965868 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.965726 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-trusted-ca-bundle\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.965935 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.965883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-console-config\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.966381 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.966335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d898a11-5438-403f-8954-683e2aa1a385-service-ca\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.967835 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.967813 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d898a11-5438-403f-8954-683e2aa1a385-console-oauth-config\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.968107 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.968090 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d898a11-5438-403f-8954-683e2aa1a385-console-serving-cert\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:42.973757 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:42.973732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr2x6\" (UniqueName: \"kubernetes.io/projected/3d898a11-5438-403f-8954-683e2aa1a385-kube-api-access-jr2x6\") pod \"console-655c8696c9-49h4h\" (UID: \"3d898a11-5438-403f-8954-683e2aa1a385\") " pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:43.038870 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:43.038768 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:43.168871 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:43.168846 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-655c8696c9-49h4h"] Apr 22 14:22:43.171055 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:22:43.171024 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d898a11_5438_403f_8954_683e2aa1a385.slice/crio-07949e46d67dccbebdf7363c220fb9fda452453c41e3555b2fc98742637aa412 WatchSource:0}: Error finding container 07949e46d67dccbebdf7363c220fb9fda452453c41e3555b2fc98742637aa412: Status 404 returned error can't find the container with id 07949e46d67dccbebdf7363c220fb9fda452453c41e3555b2fc98742637aa412 Apr 22 14:22:43.851651 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:43.851612 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-655c8696c9-49h4h" event={"ID":"3d898a11-5438-403f-8954-683e2aa1a385","Type":"ContainerStarted","Data":"4024da382b6183eec4a09db5a7ad1cee9249234cd2b60d4ae6a23957c0acca74"} Apr 22 14:22:43.851651 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:43.851652 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-655c8696c9-49h4h" event={"ID":"3d898a11-5438-403f-8954-683e2aa1a385","Type":"ContainerStarted","Data":"07949e46d67dccbebdf7363c220fb9fda452453c41e3555b2fc98742637aa412"} Apr 22 14:22:53.039299 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.039256 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:53.039815 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.039407 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:53.043859 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.043837 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:53.076110 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.076053 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-655c8696c9-49h4h" podStartSLOduration=11.076037852 podStartE2EDuration="11.076037852s" podCreationTimestamp="2026-04-22 14:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:22:43.876060338 +0000 UTC m=+410.852025778" watchObservedRunningTime="2026-04-22 14:22:53.076037852 +0000 UTC m=+420.052003292" Apr 22 14:22:53.812313 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.812276 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-zl958"] Apr 22 14:22:53.816440 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.816414 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:53.819339 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.819315 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 14:22:53.819580 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.819559 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-w9lzk\"" Apr 22 14:22:53.819712 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.819697 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 14:22:53.819771 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.819697 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 14:22:53.830327 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.830299 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-zl958"] Apr 22 14:22:53.842043 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.842010 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-ncwcf"] Apr 22 14:22:53.847077 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.847052 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:22:53.849872 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.849848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vmlfk\"" Apr 22 14:22:53.850000 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.849981 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 14:22:53.859578 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.859551 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-ncwcf"] Apr 22 14:22:53.887721 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.887687 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-655c8696c9-49h4h" Apr 22 14:22:53.944948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.944908 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55468d84bd-l8v6j"] Apr 22 14:22:53.961720 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.961680 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djhz7\" (UniqueName: \"kubernetes.io/projected/e4b14417-7924-4bcc-9c8e-87a4156ca1ec-kube-api-access-djhz7\") pod \"seaweedfs-86cc847c5c-ncwcf\" (UID: \"e4b14417-7924-4bcc-9c8e-87a4156ca1ec\") " pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:22:53.961894 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.961727 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-cert\") pod \"kserve-controller-manager-66cf78b85b-zl958\" (UID: \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\") " pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:53.961894 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.961822 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shztb\" (UniqueName: \"kubernetes.io/projected/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-kube-api-access-shztb\") pod \"kserve-controller-manager-66cf78b85b-zl958\" (UID: \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\") " pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:53.962000 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:53.961905 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e4b14417-7924-4bcc-9c8e-87a4156ca1ec-data\") pod \"seaweedfs-86cc847c5c-ncwcf\" (UID: \"e4b14417-7924-4bcc-9c8e-87a4156ca1ec\") " pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:22:54.062439 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.062360 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djhz7\" (UniqueName: \"kubernetes.io/projected/e4b14417-7924-4bcc-9c8e-87a4156ca1ec-kube-api-access-djhz7\") pod \"seaweedfs-86cc847c5c-ncwcf\" (UID: \"e4b14417-7924-4bcc-9c8e-87a4156ca1ec\") " pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:22:54.062439 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.062396 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-cert\") pod \"kserve-controller-manager-66cf78b85b-zl958\" (UID: \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\") " pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:54.062439 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.062437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shztb\" (UniqueName: \"kubernetes.io/projected/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-kube-api-access-shztb\") pod \"kserve-controller-manager-66cf78b85b-zl958\" (UID: \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\") " 
pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:54.062932 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.062496 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e4b14417-7924-4bcc-9c8e-87a4156ca1ec-data\") pod \"seaweedfs-86cc847c5c-ncwcf\" (UID: \"e4b14417-7924-4bcc-9c8e-87a4156ca1ec\") " pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:22:54.062932 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:22:54.062567 2579 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 14:22:54.062932 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:22:54.062652 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-cert podName:3ed6e1f5-9714-4ad1-b910-2381a9997ff4 nodeName:}" failed. No retries permitted until 2026-04-22 14:22:54.562631151 +0000 UTC m=+421.538596583 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-cert") pod "kserve-controller-manager-66cf78b85b-zl958" (UID: "3ed6e1f5-9714-4ad1-b910-2381a9997ff4") : secret "kserve-webhook-server-cert" not found Apr 22 14:22:54.062932 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.062848 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e4b14417-7924-4bcc-9c8e-87a4156ca1ec-data\") pod \"seaweedfs-86cc847c5c-ncwcf\" (UID: \"e4b14417-7924-4bcc-9c8e-87a4156ca1ec\") " pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:22:54.077527 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.077497 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djhz7\" (UniqueName: \"kubernetes.io/projected/e4b14417-7924-4bcc-9c8e-87a4156ca1ec-kube-api-access-djhz7\") pod \"seaweedfs-86cc847c5c-ncwcf\" (UID: \"e4b14417-7924-4bcc-9c8e-87a4156ca1ec\") " pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:22:54.077807 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.077785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shztb\" (UniqueName: \"kubernetes.io/projected/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-kube-api-access-shztb\") pod \"kserve-controller-manager-66cf78b85b-zl958\" (UID: \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\") " pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:54.156501 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.156465 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:22:54.284919 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.284885 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-ncwcf"] Apr 22 14:22:54.289943 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:22:54.289911 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b14417_7924_4bcc_9c8e_87a4156ca1ec.slice/crio-e7439986f29e88c9ad0f85dc5f50db2139312b32c9613a37e3fd2ffe85cd06cc WatchSource:0}: Error finding container e7439986f29e88c9ad0f85dc5f50db2139312b32c9613a37e3fd2ffe85cd06cc: Status 404 returned error can't find the container with id e7439986f29e88c9ad0f85dc5f50db2139312b32c9613a37e3fd2ffe85cd06cc Apr 22 14:22:54.566020 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.565980 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-cert\") pod \"kserve-controller-manager-66cf78b85b-zl958\" (UID: \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\") " pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:54.568300 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.568281 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-cert\") pod \"kserve-controller-manager-66cf78b85b-zl958\" (UID: \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\") " pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:54.727111 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.727078 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:54.889446 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.889408 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-ncwcf" event={"ID":"e4b14417-7924-4bcc-9c8e-87a4156ca1ec","Type":"ContainerStarted","Data":"e7439986f29e88c9ad0f85dc5f50db2139312b32c9613a37e3fd2ffe85cd06cc"} Apr 22 14:22:54.990760 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:54.990719 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-zl958"] Apr 22 14:22:54.993825 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:22:54.993793 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed6e1f5_9714_4ad1_b910_2381a9997ff4.slice/crio-da36b09c26f25fb5b2378f301aa1bf5c8465b63f6b0c17e8c7bb557742a09d47 WatchSource:0}: Error finding container da36b09c26f25fb5b2378f301aa1bf5c8465b63f6b0c17e8c7bb557742a09d47: Status 404 returned error can't find the container with id da36b09c26f25fb5b2378f301aa1bf5c8465b63f6b0c17e8c7bb557742a09d47 Apr 22 14:22:55.896054 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:55.896001 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" event={"ID":"3ed6e1f5-9714-4ad1-b910-2381a9997ff4","Type":"ContainerStarted","Data":"da36b09c26f25fb5b2378f301aa1bf5c8465b63f6b0c17e8c7bb557742a09d47"} Apr 22 14:22:58.908106 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:58.908069 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-ncwcf" event={"ID":"e4b14417-7924-4bcc-9c8e-87a4156ca1ec","Type":"ContainerStarted","Data":"0c55a6ba0d9053f5e8f17a49977297fccdc4ae8c1add32dc93db2d3f8d869d44"} Apr 22 14:22:58.908572 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:58.908167 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:22:58.909419 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:58.909398 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" event={"ID":"3ed6e1f5-9714-4ad1-b910-2381a9997ff4","Type":"ContainerStarted","Data":"f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a"} Apr 22 14:22:58.909555 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:58.909447 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:22:58.937526 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:58.937475 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-ncwcf" podStartSLOduration=1.70577735 podStartE2EDuration="5.937444606s" podCreationTimestamp="2026-04-22 14:22:53 +0000 UTC" firstStartedPulling="2026-04-22 14:22:54.291319211 +0000 UTC m=+421.267284629" lastFinishedPulling="2026-04-22 14:22:58.522986457 +0000 UTC m=+425.498951885" observedRunningTime="2026-04-22 14:22:58.935435779 +0000 UTC m=+425.911401218" watchObservedRunningTime="2026-04-22 14:22:58.937444606 +0000 UTC m=+425.913410045" Apr 22 14:22:58.966944 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:22:58.966892 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" podStartSLOduration=2.496972459 podStartE2EDuration="5.966877601s" podCreationTimestamp="2026-04-22 14:22:53 +0000 UTC" firstStartedPulling="2026-04-22 14:22:54.9962397 +0000 UTC m=+421.972205119" lastFinishedPulling="2026-04-22 14:22:58.466144839 +0000 UTC m=+425.442110261" observedRunningTime="2026-04-22 14:22:58.962310864 +0000 UTC m=+425.938276305" watchObservedRunningTime="2026-04-22 14:22:58.966877601 +0000 UTC m=+425.942843040" Apr 22 14:23:04.917436 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:04.917401 2579 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-ncwcf" Apr 22 14:23:18.965137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:18.965073 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55468d84bd-l8v6j" podUID="4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" containerName="console" containerID="cri-o://5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe" gracePeriod=15 Apr 22 14:23:19.207121 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.207098 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55468d84bd-l8v6j_4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1/console/0.log" Apr 22 14:23:19.207249 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.207160 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55468d84bd-l8v6j" Apr 22 14:23:19.288770 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.288662 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-oauth-config\") pod \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " Apr 22 14:23:19.288770 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.288729 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-service-ca\") pod \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " Apr 22 14:23:19.288770 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.288762 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-config\") pod \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\" (UID: 
\"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " Apr 22 14:23:19.289062 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.288786 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-serving-cert\") pod \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " Apr 22 14:23:19.289062 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.288813 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfgqc\" (UniqueName: \"kubernetes.io/projected/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-kube-api-access-nfgqc\") pod \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " Apr 22 14:23:19.289062 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.288833 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-oauth-serving-cert\") pod \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " Apr 22 14:23:19.289062 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.288888 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-trusted-ca-bundle\") pod \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\" (UID: \"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1\") " Apr 22 14:23:19.289301 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.289235 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-config" (OuterVolumeSpecName: "console-config") pod "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" (UID: "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:23:19.289301 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.289256 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-service-ca" (OuterVolumeSpecName: "service-ca") pod "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" (UID: "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:23:19.289404 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.289346 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" (UID: "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:23:19.289404 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.289389 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" (UID: "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:23:19.291049 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.291025 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" (UID: "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:23:19.291161 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.291025 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-kube-api-access-nfgqc" (OuterVolumeSpecName: "kube-api-access-nfgqc") pod "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" (UID: "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1"). InnerVolumeSpecName "kube-api-access-nfgqc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:19.291161 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.291091 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" (UID: "4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:23:19.390229 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.390193 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:23:19.390229 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.390223 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:23:19.390229 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.390235 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nfgqc\" (UniqueName: \"kubernetes.io/projected/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-kube-api-access-nfgqc\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:23:19.390445 
ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.390245 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-oauth-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:23:19.390445 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.390255 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-trusted-ca-bundle\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:23:19.390445 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.390263 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-console-oauth-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:23:19.390445 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.390272 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1-service-ca\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:23:19.978928 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.978901 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55468d84bd-l8v6j_4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1/console/0.log" Apr 22 14:23:19.979382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.978944 2579 generic.go:358] "Generic (PLEG): container finished" podID="4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" containerID="5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe" exitCode=2 Apr 22 14:23:19.979382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.979014 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55468d84bd-l8v6j" Apr 22 14:23:19.979382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.979046 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55468d84bd-l8v6j" event={"ID":"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1","Type":"ContainerDied","Data":"5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe"} Apr 22 14:23:19.979382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.979090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55468d84bd-l8v6j" event={"ID":"4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1","Type":"ContainerDied","Data":"80d2cd78ce97d3b3bd5965669db8fd1115ad3c7baedfdaa02581ce8e0acbd9cf"} Apr 22 14:23:19.979382 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.979113 2579 scope.go:117] "RemoveContainer" containerID="5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe" Apr 22 14:23:19.987335 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.987313 2579 scope.go:117] "RemoveContainer" containerID="5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe" Apr 22 14:23:19.987596 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:23:19.987581 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe\": container with ID starting with 5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe not found: ID does not exist" containerID="5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe" Apr 22 14:23:19.987653 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:19.987608 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe"} err="failed to get container status \"5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe\": rpc error: code = 
NotFound desc = could not find container \"5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe\": container with ID starting with 5b52b7fbf986b59c905f1c30354f5dcbfc767065fdb6b9a7fd7a2b5a2eb82ebe not found: ID does not exist" Apr 22 14:23:20.001013 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:20.000982 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55468d84bd-l8v6j"] Apr 22 14:23:20.004968 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:20.004942 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55468d84bd-l8v6j"] Apr 22 14:23:21.517808 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:21.517775 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" path="/var/lib/kubelet/pods/4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1/volumes" Apr 22 14:23:29.439725 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.439687 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-zl958"] Apr 22 14:23:29.440161 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.439955 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" podUID="3ed6e1f5-9714-4ad1-b910-2381a9997ff4" containerName="manager" containerID="cri-o://f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a" gracePeriod=10 Apr 22 14:23:29.445115 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.445093 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:23:29.470572 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.470544 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-wbnkc"] Apr 22 14:23:29.470875 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.470863 2579 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" containerName="console" Apr 22 14:23:29.470925 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.470876 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" containerName="console" Apr 22 14:23:29.470959 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.470937 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4647b06a-ef4b-4a2d-8fd5-33a36ca1c7c1" containerName="console" Apr 22 14:23:29.473929 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.473911 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:23:29.485303 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.485277 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-wbnkc"] Apr 22 14:23:29.577913 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.577879 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f64bfa0-4d5f-49ea-9bc6-02620079ddc5-cert\") pod \"kserve-controller-manager-66cf78b85b-wbnkc\" (UID: \"0f64bfa0-4d5f-49ea-9bc6-02620079ddc5\") " pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:23:29.577913 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.577913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lmqp\" (UniqueName: \"kubernetes.io/projected/0f64bfa0-4d5f-49ea-9bc6-02620079ddc5-kube-api-access-5lmqp\") pod \"kserve-controller-manager-66cf78b85b-wbnkc\" (UID: \"0f64bfa0-4d5f-49ea-9bc6-02620079ddc5\") " pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:23:29.678703 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.678673 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/0f64bfa0-4d5f-49ea-9bc6-02620079ddc5-cert\") pod \"kserve-controller-manager-66cf78b85b-wbnkc\" (UID: \"0f64bfa0-4d5f-49ea-9bc6-02620079ddc5\") " pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:23:29.678851 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.678709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lmqp\" (UniqueName: \"kubernetes.io/projected/0f64bfa0-4d5f-49ea-9bc6-02620079ddc5-kube-api-access-5lmqp\") pod \"kserve-controller-manager-66cf78b85b-wbnkc\" (UID: \"0f64bfa0-4d5f-49ea-9bc6-02620079ddc5\") " pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:23:29.681019 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.680994 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f64bfa0-4d5f-49ea-9bc6-02620079ddc5-cert\") pod \"kserve-controller-manager-66cf78b85b-wbnkc\" (UID: \"0f64bfa0-4d5f-49ea-9bc6-02620079ddc5\") " pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:23:29.688563 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.688542 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lmqp\" (UniqueName: \"kubernetes.io/projected/0f64bfa0-4d5f-49ea-9bc6-02620079ddc5-kube-api-access-5lmqp\") pod \"kserve-controller-manager-66cf78b85b-wbnkc\" (UID: \"0f64bfa0-4d5f-49ea-9bc6-02620079ddc5\") " pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:23:29.697559 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.697504 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:23:29.779751 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.779720 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-cert\") pod \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\" (UID: \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\") " Apr 22 14:23:29.779927 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.779800 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shztb\" (UniqueName: \"kubernetes.io/projected/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-kube-api-access-shztb\") pod \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\" (UID: \"3ed6e1f5-9714-4ad1-b910-2381a9997ff4\") " Apr 22 14:23:29.781763 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.781736 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-cert" (OuterVolumeSpecName: "cert") pod "3ed6e1f5-9714-4ad1-b910-2381a9997ff4" (UID: "3ed6e1f5-9714-4ad1-b910-2381a9997ff4"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:23:29.781836 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.781806 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-kube-api-access-shztb" (OuterVolumeSpecName: "kube-api-access-shztb") pod "3ed6e1f5-9714-4ad1-b910-2381a9997ff4" (UID: "3ed6e1f5-9714-4ad1-b910-2381a9997ff4"). InnerVolumeSpecName "kube-api-access-shztb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:29.825177 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.825142 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:23:29.880892 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.880762 2579 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:23:29.880892 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.880793 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-shztb\" (UniqueName: \"kubernetes.io/projected/3ed6e1f5-9714-4ad1-b910-2381a9997ff4-kube-api-access-shztb\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Apr 22 14:23:29.944579 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:29.944543 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-wbnkc"] Apr 22 14:23:29.947545 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:23:29.947516 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f64bfa0_4d5f_49ea_9bc6_02620079ddc5.slice/crio-86ac9ce9dabafcb7e8d4334adeb62daa273c9b28df143f35c722a710a25c148c WatchSource:0}: Error finding container 86ac9ce9dabafcb7e8d4334adeb62daa273c9b28df143f35c722a710a25c148c: Status 404 returned error can't find the container with id 86ac9ce9dabafcb7e8d4334adeb62daa273c9b28df143f35c722a710a25c148c Apr 22 14:23:30.017073 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.017034 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" event={"ID":"0f64bfa0-4d5f-49ea-9bc6-02620079ddc5","Type":"ContainerStarted","Data":"86ac9ce9dabafcb7e8d4334adeb62daa273c9b28df143f35c722a710a25c148c"} Apr 22 14:23:30.018103 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.018080 2579 generic.go:358] "Generic (PLEG): container finished" podID="3ed6e1f5-9714-4ad1-b910-2381a9997ff4" 
containerID="f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a" exitCode=0 Apr 22 14:23:30.018220 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.018128 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" event={"ID":"3ed6e1f5-9714-4ad1-b910-2381a9997ff4","Type":"ContainerDied","Data":"f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a"} Apr 22 14:23:30.018220 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.018144 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" event={"ID":"3ed6e1f5-9714-4ad1-b910-2381a9997ff4","Type":"ContainerDied","Data":"da36b09c26f25fb5b2378f301aa1bf5c8465b63f6b0c17e8c7bb557742a09d47"} Apr 22 14:23:30.018220 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.018146 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-zl958" Apr 22 14:23:30.018220 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.018156 2579 scope.go:117] "RemoveContainer" containerID="f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a" Apr 22 14:23:30.028247 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.026465 2579 scope.go:117] "RemoveContainer" containerID="f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a" Apr 22 14:23:30.028247 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:23:30.026840 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a\": container with ID starting with f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a not found: ID does not exist" containerID="f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a" Apr 22 14:23:30.028247 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.026875 2579 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a"} err="failed to get container status \"f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a\": rpc error: code = NotFound desc = could not find container \"f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a\": container with ID starting with f2a4b818b683381ff02a8585b916dacc16384246dd4229ca9bccfc5ba4a72c1a not found: ID does not exist" Apr 22 14:23:30.039311 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.039285 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-zl958"] Apr 22 14:23:30.043007 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:30.042984 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-zl958"] Apr 22 14:23:31.023877 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:31.023835 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" event={"ID":"0f64bfa0-4d5f-49ea-9bc6-02620079ddc5","Type":"ContainerStarted","Data":"c983a9c17e6385fcb45af8385583c888d2f3f55599909be26c9c908039b7be5d"} Apr 22 14:23:31.024298 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:31.024107 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:23:31.055922 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:31.055867 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" podStartSLOduration=1.680334473 podStartE2EDuration="2.055851909s" podCreationTimestamp="2026-04-22 14:23:29 +0000 UTC" firstStartedPulling="2026-04-22 14:23:29.948814487 +0000 UTC m=+456.924779904" lastFinishedPulling="2026-04-22 14:23:30.324331922 +0000 UTC m=+457.300297340" observedRunningTime="2026-04-22 14:23:31.054835797 +0000 UTC 
m=+458.030801239" watchObservedRunningTime="2026-04-22 14:23:31.055851909 +0000 UTC m=+458.031817348" Apr 22 14:23:31.516940 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:23:31.516905 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed6e1f5-9714-4ad1-b910-2381a9997ff4" path="/var/lib/kubelet/pods/3ed6e1f5-9714-4ad1-b910-2381a9997ff4/volumes" Apr 22 14:24:02.032182 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:02.032151 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-wbnkc" Apr 22 14:24:03.029412 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.029380 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-64kfj"] Apr 22 14:24:03.029737 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.029724 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed6e1f5-9714-4ad1-b910-2381a9997ff4" containerName="manager" Apr 22 14:24:03.029786 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.029739 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed6e1f5-9714-4ad1-b910-2381a9997ff4" containerName="manager" Apr 22 14:24:03.029844 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.029808 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed6e1f5-9714-4ad1-b910-2381a9997ff4" containerName="manager" Apr 22 14:24:03.032735 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.032714 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:03.035780 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.035753 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 14:24:03.036762 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.036742 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-7nbtk\"" Apr 22 14:24:03.050276 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.050248 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-64kfj"] Apr 22 14:24:03.069756 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.069713 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f76e86-1dc2-4c14-9764-5ab70dd7d62f-tls-certs\") pod \"model-serving-api-86f7b4b499-64kfj\" (UID: \"a0f76e86-1dc2-4c14-9764-5ab70dd7d62f\") " pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:03.069929 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.069828 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8mpr\" (UniqueName: \"kubernetes.io/projected/a0f76e86-1dc2-4c14-9764-5ab70dd7d62f-kube-api-access-q8mpr\") pod \"model-serving-api-86f7b4b499-64kfj\" (UID: \"a0f76e86-1dc2-4c14-9764-5ab70dd7d62f\") " pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:03.081468 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.081413 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-5q956"] Apr 22 14:24:03.085121 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.085097 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5q956" Apr 22 14:24:03.093047 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.093021 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-hk49j\"" Apr 22 14:24:03.093753 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.093730 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 14:24:03.105110 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.105085 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5q956"] Apr 22 14:24:03.170783 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.170744 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ce42c1-27d0-4bf4-8dd5-28989480af95-cert\") pod \"odh-model-controller-696fc77849-5q956\" (UID: \"26ce42c1-27d0-4bf4-8dd5-28989480af95\") " pod="kserve/odh-model-controller-696fc77849-5q956" Apr 22 14:24:03.170962 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.170799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8mpr\" (UniqueName: \"kubernetes.io/projected/a0f76e86-1dc2-4c14-9764-5ab70dd7d62f-kube-api-access-q8mpr\") pod \"model-serving-api-86f7b4b499-64kfj\" (UID: \"a0f76e86-1dc2-4c14-9764-5ab70dd7d62f\") " pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:03.170962 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.170837 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kl74\" (UniqueName: \"kubernetes.io/projected/26ce42c1-27d0-4bf4-8dd5-28989480af95-kube-api-access-7kl74\") pod \"odh-model-controller-696fc77849-5q956\" (UID: \"26ce42c1-27d0-4bf4-8dd5-28989480af95\") " pod="kserve/odh-model-controller-696fc77849-5q956" Apr 
22 14:24:03.171103 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.170970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f76e86-1dc2-4c14-9764-5ab70dd7d62f-tls-certs\") pod \"model-serving-api-86f7b4b499-64kfj\" (UID: \"a0f76e86-1dc2-4c14-9764-5ab70dd7d62f\") " pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:03.173511 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.173426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f76e86-1dc2-4c14-9764-5ab70dd7d62f-tls-certs\") pod \"model-serving-api-86f7b4b499-64kfj\" (UID: \"a0f76e86-1dc2-4c14-9764-5ab70dd7d62f\") " pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:03.187071 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.187041 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8mpr\" (UniqueName: \"kubernetes.io/projected/a0f76e86-1dc2-4c14-9764-5ab70dd7d62f-kube-api-access-q8mpr\") pod \"model-serving-api-86f7b4b499-64kfj\" (UID: \"a0f76e86-1dc2-4c14-9764-5ab70dd7d62f\") " pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:03.271353 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.271318 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ce42c1-27d0-4bf4-8dd5-28989480af95-cert\") pod \"odh-model-controller-696fc77849-5q956\" (UID: \"26ce42c1-27d0-4bf4-8dd5-28989480af95\") " pod="kserve/odh-model-controller-696fc77849-5q956" Apr 22 14:24:03.271584 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.271378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kl74\" (UniqueName: \"kubernetes.io/projected/26ce42c1-27d0-4bf4-8dd5-28989480af95-kube-api-access-7kl74\") pod \"odh-model-controller-696fc77849-5q956\" (UID: 
\"26ce42c1-27d0-4bf4-8dd5-28989480af95\") " pod="kserve/odh-model-controller-696fc77849-5q956" Apr 22 14:24:03.274184 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.274151 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26ce42c1-27d0-4bf4-8dd5-28989480af95-cert\") pod \"odh-model-controller-696fc77849-5q956\" (UID: \"26ce42c1-27d0-4bf4-8dd5-28989480af95\") " pod="kserve/odh-model-controller-696fc77849-5q956" Apr 22 14:24:03.280977 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.280908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kl74\" (UniqueName: \"kubernetes.io/projected/26ce42c1-27d0-4bf4-8dd5-28989480af95-kube-api-access-7kl74\") pod \"odh-model-controller-696fc77849-5q956\" (UID: \"26ce42c1-27d0-4bf4-8dd5-28989480af95\") " pod="kserve/odh-model-controller-696fc77849-5q956" Apr 22 14:24:03.342844 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.342810 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:03.396365 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.396329 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5q956" Apr 22 14:24:03.487875 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.487838 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-64kfj"] Apr 22 14:24:03.491559 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:24:03.491524 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f76e86_1dc2_4c14_9764_5ab70dd7d62f.slice/crio-19b2b741f77957499f5ac759ffee9e0197c60d3e904509d4dd25661f613a6c82 WatchSource:0}: Error finding container 19b2b741f77957499f5ac759ffee9e0197c60d3e904509d4dd25661f613a6c82: Status 404 returned error can't find the container with id 19b2b741f77957499f5ac759ffee9e0197c60d3e904509d4dd25661f613a6c82 Apr 22 14:24:03.536761 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:03.536608 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5q956"] Apr 22 14:24:03.539224 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:24:03.539192 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26ce42c1_27d0_4bf4_8dd5_28989480af95.slice/crio-5c8a0c9d22832c0ae41bc5ae41194160acdf0bfbbb13e684d5eb1bbd5854f4c9 WatchSource:0}: Error finding container 5c8a0c9d22832c0ae41bc5ae41194160acdf0bfbbb13e684d5eb1bbd5854f4c9: Status 404 returned error can't find the container with id 5c8a0c9d22832c0ae41bc5ae41194160acdf0bfbbb13e684d5eb1bbd5854f4c9 Apr 22 14:24:04.132781 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:04.132739 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-64kfj" event={"ID":"a0f76e86-1dc2-4c14-9764-5ab70dd7d62f","Type":"ContainerStarted","Data":"19b2b741f77957499f5ac759ffee9e0197c60d3e904509d4dd25661f613a6c82"} Apr 22 14:24:04.133948 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:04.133913 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5q956" event={"ID":"26ce42c1-27d0-4bf4-8dd5-28989480af95","Type":"ContainerStarted","Data":"5c8a0c9d22832c0ae41bc5ae41194160acdf0bfbbb13e684d5eb1bbd5854f4c9"} Apr 22 14:24:07.145292 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:07.145248 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5q956" event={"ID":"26ce42c1-27d0-4bf4-8dd5-28989480af95","Type":"ContainerStarted","Data":"85e9c44a14bb8aee4cdd690b82df85677c261b35740d0681cc23bcef550dc2ee"} Apr 22 14:24:07.145799 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:07.145358 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-5q956" Apr 22 14:24:07.146605 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:07.146579 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-64kfj" event={"ID":"a0f76e86-1dc2-4c14-9764-5ab70dd7d62f","Type":"ContainerStarted","Data":"0f20d6b2384af7c4121fe7e8b571d714e7c2205bf6c2c56b84497bd904d57468"} Apr 22 14:24:07.146793 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:07.146679 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:07.170167 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:07.170114 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-5q956" podStartSLOduration=1.406034201 podStartE2EDuration="4.170099009s" podCreationTimestamp="2026-04-22 14:24:03 +0000 UTC" firstStartedPulling="2026-04-22 14:24:03.540559096 +0000 UTC m=+490.516524514" lastFinishedPulling="2026-04-22 14:24:06.304623897 +0000 UTC m=+493.280589322" observedRunningTime="2026-04-22 14:24:07.168344158 +0000 UTC m=+494.144309598" watchObservedRunningTime="2026-04-22 14:24:07.170099009 +0000 UTC m=+494.146064449" Apr 
22 14:24:07.193444 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:07.193396 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-64kfj" podStartSLOduration=1.429522487 podStartE2EDuration="4.193379811s" podCreationTimestamp="2026-04-22 14:24:03 +0000 UTC" firstStartedPulling="2026-04-22 14:24:03.493413307 +0000 UTC m=+490.469378725" lastFinishedPulling="2026-04-22 14:24:06.257270627 +0000 UTC m=+493.233236049" observedRunningTime="2026-04-22 14:24:07.192605433 +0000 UTC m=+494.168570873" watchObservedRunningTime="2026-04-22 14:24:07.193379811 +0000 UTC m=+494.169345251" Apr 22 14:24:18.152838 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:18.152808 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-5q956" Apr 22 14:24:18.154626 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:18.154609 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-64kfj" Apr 22 14:24:39.173033 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.172946 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw"] Apr 22 14:24:39.180127 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.180101 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" Apr 22 14:24:39.183118 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.183083 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-n4vmn\"" Apr 22 14:24:39.183724 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.183702 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw"] Apr 22 14:24:39.191050 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.191026 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" Apr 22 14:24:39.324446 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.324422 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw"] Apr 22 14:24:39.327093 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:24:39.327067 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9912495_2ac2_4d13_9bb0_3bcced8805ba.slice/crio-7a0f5f1586ca74485eb61422cdc391c8921a0235f30e183d34b8103bee5044c6 WatchSource:0}: Error finding container 7a0f5f1586ca74485eb61422cdc391c8921a0235f30e183d34b8103bee5044c6: Status 404 returned error can't find the container with id 7a0f5f1586ca74485eb61422cdc391c8921a0235f30e183d34b8103bee5044c6 Apr 22 14:24:39.410416 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.410375 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8"] Apr 22 14:24:39.413778 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.413753 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" Apr 22 14:24:39.427533 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.427467 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8"] Apr 22 14:24:39.589366 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.589330 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12046a98-409e-4b3f-acb6-380b2dfdac5a-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8\" (UID: \"12046a98-409e-4b3f-acb6-380b2dfdac5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" Apr 22 14:24:39.690396 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.690359 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12046a98-409e-4b3f-acb6-380b2dfdac5a-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8\" (UID: \"12046a98-409e-4b3f-acb6-380b2dfdac5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" Apr 22 14:24:39.690805 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.690782 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12046a98-409e-4b3f-acb6-380b2dfdac5a-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8\" (UID: \"12046a98-409e-4b3f-acb6-380b2dfdac5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" Apr 22 14:24:39.701659 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.701622 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj"] Apr 22 14:24:39.705092 ip-10-0-130-255 
kubenswrapper[2579]: I0422 14:24:39.705070 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" Apr 22 14:24:39.717395 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.717364 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj"] Apr 22 14:24:39.727385 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.727351 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" Apr 22 14:24:39.790998 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.790958 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj\" (UID: \"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" Apr 22 14:24:39.892267 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.892206 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj\" (UID: \"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" Apr 22 14:24:39.892738 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.892711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj\" (UID: \"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7\") 
" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" Apr 22 14:24:39.918004 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:39.917965 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8"] Apr 22 14:24:39.925584 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:24:39.925539 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12046a98_409e_4b3f_acb6_380b2dfdac5a.slice/crio-d502d3295e91195a4a08f23b272267884b6f0a928168424211e027c5ce31a5c2 WatchSource:0}: Error finding container d502d3295e91195a4a08f23b272267884b6f0a928168424211e027c5ce31a5c2: Status 404 returned error can't find the container with id d502d3295e91195a4a08f23b272267884b6f0a928168424211e027c5ce31a5c2 Apr 22 14:24:40.016165 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:40.016015 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" Apr 22 14:24:40.231586 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:40.231528 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj"] Apr 22 14:24:40.263917 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:40.263885 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" event={"ID":"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7","Type":"ContainerStarted","Data":"e1f05d7b7bc740c38c20cc61ea26b88aa33d95f21193cff889b417d8baf1ba09"} Apr 22 14:24:40.267280 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:40.267206 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" 
event={"ID":"12046a98-409e-4b3f-acb6-380b2dfdac5a","Type":"ContainerStarted","Data":"d502d3295e91195a4a08f23b272267884b6f0a928168424211e027c5ce31a5c2"} Apr 22 14:24:40.277354 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:40.277316 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" event={"ID":"b9912495-2ac2-4d13-9bb0-3bcced8805ba","Type":"ContainerStarted","Data":"7a0f5f1586ca74485eb61422cdc391c8921a0235f30e183d34b8103bee5044c6"} Apr 22 14:24:54.341809 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:54.341503 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" event={"ID":"b9912495-2ac2-4d13-9bb0-3bcced8805ba","Type":"ContainerStarted","Data":"8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6"} Apr 22 14:24:54.341809 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:54.341558 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" Apr 22 14:24:54.343107 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:54.343072 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" event={"ID":"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7","Type":"ContainerStarted","Data":"88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83"} Apr 22 14:24:54.343528 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:54.343446 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 22 14:24:54.344640 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:54.344615 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" event={"ID":"12046a98-409e-4b3f-acb6-380b2dfdac5a","Type":"ContainerStarted","Data":"c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3"} Apr 22 14:24:54.360023 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:54.359959 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" podStartSLOduration=0.629497927 podStartE2EDuration="15.359937547s" podCreationTimestamp="2026-04-22 14:24:39 +0000 UTC" firstStartedPulling="2026-04-22 14:24:39.328884685 +0000 UTC m=+526.304850104" lastFinishedPulling="2026-04-22 14:24:54.059324293 +0000 UTC m=+541.035289724" observedRunningTime="2026-04-22 14:24:54.358127461 +0000 UTC m=+541.334092905" watchObservedRunningTime="2026-04-22 14:24:54.359937547 +0000 UTC m=+541.335902989" Apr 22 14:24:55.348786 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:55.348743 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 22 14:24:58.360261 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:58.360226 2579 generic.go:358] "Generic (PLEG): container finished" podID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerID="88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83" exitCode=0 Apr 22 14:24:58.360717 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:58.360299 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" event={"ID":"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7","Type":"ContainerDied","Data":"88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83"} Apr 22 14:24:58.361626 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:58.361606 2579 
generic.go:358] "Generic (PLEG): container finished" podID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerID="c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3" exitCode=0 Apr 22 14:24:58.361730 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:24:58.361695 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" event={"ID":"12046a98-409e-4b3f-acb6-380b2dfdac5a","Type":"ContainerDied","Data":"c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3"} Apr 22 14:25:05.349203 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:05.349150 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 22 14:25:07.396469 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:07.396427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" event={"ID":"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7","Type":"ContainerStarted","Data":"e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7"} Apr 22 14:25:07.396922 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:07.396737 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" Apr 22 14:25:07.398160 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:07.398133 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" event={"ID":"12046a98-409e-4b3f-acb6-380b2dfdac5a","Type":"ContainerStarted","Data":"b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64"} Apr 22 14:25:07.398392 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:07.398355 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 14:25:07.398526 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:07.398393 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" Apr 22 14:25:07.399236 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:07.399215 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 22 14:25:07.418490 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:07.418417 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podStartSLOduration=1.83864798 podStartE2EDuration="28.418401464s" podCreationTimestamp="2026-04-22 14:24:39 +0000 UTC" firstStartedPulling="2026-04-22 14:24:40.258533664 +0000 UTC m=+527.234499088" lastFinishedPulling="2026-04-22 14:25:06.838287154 +0000 UTC m=+553.814252572" observedRunningTime="2026-04-22 14:25:07.416603999 +0000 UTC m=+554.392569453" watchObservedRunningTime="2026-04-22 14:25:07.418401464 +0000 UTC m=+554.394366904" Apr 22 14:25:07.436263 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:07.436212 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podStartSLOduration=1.5097918460000002 podStartE2EDuration="28.436195255s" podCreationTimestamp="2026-04-22 14:24:39 +0000 UTC" firstStartedPulling="2026-04-22 14:24:39.927429348 +0000 UTC m=+526.903394782" 
lastFinishedPulling="2026-04-22 14:25:06.85383277 +0000 UTC m=+553.829798191" observedRunningTime="2026-04-22 14:25:07.434709878 +0000 UTC m=+554.410675317" watchObservedRunningTime="2026-04-22 14:25:07.436195255 +0000 UTC m=+554.412160694"
Apr 22 14:25:08.402235 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:08.402192 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 22 14:25:08.402684 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:08.402193 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:25:15.349662 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:15.349611 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:25:18.403031 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:18.402980 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 22 14:25:18.403408 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:18.402980 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:25:25.349524 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:25.349478 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:25:28.403263 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:28.403216 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 22 14:25:28.403753 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:28.403221 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:25:35.349132 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:35.349072 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 22 14:25:38.402583 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:38.402532 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 22 14:25:38.403040 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:38.402532 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:25:45.350255 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:45.350212 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw"
Apr 22 14:25:48.402475 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:48.402407 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:25:48.402979 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:48.402415 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 22 14:25:53.421484 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:53.421439 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 14:25:53.421889 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:53.421616 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 14:25:58.402850 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:58.402806 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:25:58.403275 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:25:58.402812 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 22 14:26:08.402720 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:08.402634 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 22 14:26:08.403086 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:08.402634 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 22 14:26:09.565259 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:09.565224 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw"]
Apr 22 14:26:09.565658 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:09.565465 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container" containerID="cri-o://8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6" gracePeriod=30
Apr 22 14:26:09.661802 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:09.659202 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"]
Apr 22 14:26:09.672236 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:09.671612 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"
Apr 22 14:26:09.674044 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:09.674011 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"]
Apr 22 14:26:09.685343 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:09.685322 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"
Apr 22 14:26:09.850731 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:09.849768 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"]
Apr 22 14:26:09.853160 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:26:09.853115 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0149eb89_a3dc_4a58_8e88_3460aef15834.slice/crio-7ec217ec91b9c852e68a70b4b1695ec289b4c001e1d30b9a17645003d194611c WatchSource:0}: Error finding container 7ec217ec91b9c852e68a70b4b1695ec289b4c001e1d30b9a17645003d194611c: Status 404 returned error can't find the container with id 7ec217ec91b9c852e68a70b4b1695ec289b4c001e1d30b9a17645003d194611c
Apr 22 14:26:10.616123 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:10.616085 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" event={"ID":"0149eb89-a3dc-4a58-8e88-3460aef15834","Type":"ContainerStarted","Data":"5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b"}
Apr 22 14:26:10.616123 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:10.616122 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" event={"ID":"0149eb89-a3dc-4a58-8e88-3460aef15834","Type":"ContainerStarted","Data":"7ec217ec91b9c852e68a70b4b1695ec289b4c001e1d30b9a17645003d194611c"}
Apr 22 14:26:10.616637 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:10.616215 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"
Apr 22 14:26:10.617785 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:10.617755 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 14:26:10.636068 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:10.636010 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" podStartSLOduration=1.635995691 podStartE2EDuration="1.635995691s" podCreationTimestamp="2026-04-22 14:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:26:10.63322414 +0000 UTC m=+617.609189578" watchObservedRunningTime="2026-04-22 14:26:10.635995691 +0000 UTC m=+617.611961131"
Apr 22 14:26:11.620284 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:11.620241 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 14:26:12.910359 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:12.910335 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw"
Apr 22 14:26:13.627822 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:13.627785 2579 generic.go:358] "Generic (PLEG): container finished" podID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerID="8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6" exitCode=0
Apr 22 14:26:13.628005 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:13.627847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" event={"ID":"b9912495-2ac2-4d13-9bb0-3bcced8805ba","Type":"ContainerDied","Data":"8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6"}
Apr 22 14:26:13.628005 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:13.627855 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw"
Apr 22 14:26:13.628005 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:13.627881 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw" event={"ID":"b9912495-2ac2-4d13-9bb0-3bcced8805ba","Type":"ContainerDied","Data":"7a0f5f1586ca74485eb61422cdc391c8921a0235f30e183d34b8103bee5044c6"}
Apr 22 14:26:13.628005 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:13.627901 2579 scope.go:117] "RemoveContainer" containerID="8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6"
Apr 22 14:26:13.635935 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:13.635889 2579 scope.go:117] "RemoveContainer" containerID="8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6"
Apr 22 14:26:13.636225 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:26:13.636199 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6\": container with ID starting with 8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6 not found: ID does not exist" containerID="8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6"
Apr 22 14:26:13.636309 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:13.636239 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6"} err="failed to get container status \"8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6\": rpc error: code = NotFound desc = could not find container \"8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6\": container with ID starting with 8db438a2d78a787d18d5603a0889d1db4ffad744ef06a30bea8a7177519003b6 not found: ID does not exist"
Apr 22 14:26:13.646929 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:13.646886 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw"]
Apr 22 14:26:13.649543 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:13.649512 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfdf-predictor-6985558666-ml6vw"]
Apr 22 14:26:15.517666 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:15.517619 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" path="/var/lib/kubelet/pods/b9912495-2ac2-4d13-9bb0-3bcced8805ba/volumes"
Apr 22 14:26:18.403678 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:18.403642 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj"
Apr 22 14:26:18.406078 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:18.403716 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8"
Apr 22 14:26:21.621310 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:21.621266 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 14:26:31.620658 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:31.620611 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 14:26:41.620743 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:41.620687 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 14:26:49.512485 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.512435 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj"]
Apr 22 14:26:49.512981 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.512819 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" containerID="cri-o://e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7" gracePeriod=30
Apr 22 14:26:49.543105 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.543064 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"]
Apr 22 14:26:49.543655 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.543632 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container"
Apr 22 14:26:49.543655 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.543652 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container"
Apr 22 14:26:49.543868 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.543741 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9912495-2ac2-4d13-9bb0-3bcced8805ba" containerName="kserve-container"
Apr 22 14:26:49.548645 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.548620 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"
Apr 22 14:26:49.554681 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.554656 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"]
Apr 22 14:26:49.562173 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.562149 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"
Apr 22 14:26:49.572729 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.572511 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8"]
Apr 22 14:26:49.572937 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.572877 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" containerID="cri-o://b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64" gracePeriod=30
Apr 22 14:26:49.739045 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.739015 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"]
Apr 22 14:26:49.740249 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:26:49.740217 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c4c9fd_84c0_47d4_ab0a_204be014b8d0.slice/crio-87c5164ae63ff4a8951cc3fb56d99334441d5c5e0d26cb3d5fe87551c0271469 WatchSource:0}: Error finding container 87c5164ae63ff4a8951cc3fb56d99334441d5c5e0d26cb3d5fe87551c0271469: Status 404 returned error can't find the container with id 87c5164ae63ff4a8951cc3fb56d99334441d5c5e0d26cb3d5fe87551c0271469
Apr 22 14:26:49.742322 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.742295 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:26:49.755662 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:49.755628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" event={"ID":"71c4c9fd-84c0-47d4-ab0a-204be014b8d0","Type":"ContainerStarted","Data":"87c5164ae63ff4a8951cc3fb56d99334441d5c5e0d26cb3d5fe87551c0271469"}
Apr 22 14:26:50.760034 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:50.759994 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" event={"ID":"71c4c9fd-84c0-47d4-ab0a-204be014b8d0","Type":"ContainerStarted","Data":"ea02e36139b5997353e2b2709b185ca8a13e7cfc0f7b246e28d5ca266f63b234"}
Apr 22 14:26:50.760469 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:50.760154 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"
Apr 22 14:26:50.761260 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:50.761233 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 22 14:26:50.778734 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:50.778683 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" podStartSLOduration=1.778666505 podStartE2EDuration="1.778666505s" podCreationTimestamp="2026-04-22 14:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:26:50.777321639 +0000 UTC m=+657.753287082" watchObservedRunningTime="2026-04-22 14:26:50.778666505 +0000 UTC m=+657.754631939"
Apr 22 14:26:51.620637 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:51.620597 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 14:26:51.763127 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:51.763090 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 22 14:26:54.175476 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.175431 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj"
Apr 22 14:26:54.230183 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.230159 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8"
Apr 22 14:26:54.276680 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.276649 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12046a98-409e-4b3f-acb6-380b2dfdac5a-kserve-provision-location\") pod \"12046a98-409e-4b3f-acb6-380b2dfdac5a\" (UID: \"12046a98-409e-4b3f-acb6-380b2dfdac5a\") "
Apr 22 14:26:54.276841 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.276735 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7-kserve-provision-location\") pod \"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7\" (UID: \"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7\") "
Apr 22 14:26:54.277028 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.277008 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12046a98-409e-4b3f-acb6-380b2dfdac5a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12046a98-409e-4b3f-acb6-380b2dfdac5a" (UID: "12046a98-409e-4b3f-acb6-380b2dfdac5a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:26:54.277081 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.277024 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" (UID: "f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:26:54.377936 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.377876 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12046a98-409e-4b3f-acb6-380b2dfdac5a-kserve-provision-location\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:26:54.377936 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.377923 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7-kserve-provision-location\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Apr 22 14:26:54.774398 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.774348 2579 generic.go:358] "Generic (PLEG): container finished" podID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerID="e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7" exitCode=0
Apr 22 14:26:54.774628 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.774406 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" event={"ID":"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7","Type":"ContainerDied","Data":"e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7"}
Apr 22 14:26:54.774628 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.774421 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj"
Apr 22 14:26:54.774628 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.774488 2579 scope.go:117] "RemoveContainer" containerID="e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7"
Apr 22 14:26:54.774628 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.774445 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj" event={"ID":"f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7","Type":"ContainerDied","Data":"e1f05d7b7bc740c38c20cc61ea26b88aa33d95f21193cff889b417d8baf1ba09"}
Apr 22 14:26:54.776247 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.776221 2579 generic.go:358] "Generic (PLEG): container finished" podID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerID="b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64" exitCode=0
Apr 22 14:26:54.776381 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.776268 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8"
Apr 22 14:26:54.776381 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.776294 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" event={"ID":"12046a98-409e-4b3f-acb6-380b2dfdac5a","Type":"ContainerDied","Data":"b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64"}
Apr 22 14:26:54.776381 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.776323 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8" event={"ID":"12046a98-409e-4b3f-acb6-380b2dfdac5a","Type":"ContainerDied","Data":"d502d3295e91195a4a08f23b272267884b6f0a928168424211e027c5ce31a5c2"}
Apr 22 14:26:54.785411 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.785245 2579 scope.go:117] "RemoveContainer" containerID="88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83"
Apr 22 14:26:54.793421 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.793402 2579 scope.go:117] "RemoveContainer" containerID="e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7"
Apr 22 14:26:54.793675 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:26:54.793657 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7\": container with ID starting with e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7 not found: ID does not exist" containerID="e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7"
Apr 22 14:26:54.793744 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.793688 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7"} err="failed to get container status \"e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7\": rpc error: code = NotFound desc = could not find container \"e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7\": container with ID starting with e0956d8a9e4a531fd6786b2f44c57cad1f5c4a93c5887b04e84f225ab37595d7 not found: ID does not exist"
Apr 22 14:26:54.793744 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.793713 2579 scope.go:117] "RemoveContainer" containerID="88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83"
Apr 22 14:26:54.793947 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:26:54.793930 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83\": container with ID starting with 88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83 not found: ID does not exist" containerID="88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83"
Apr 22 14:26:54.793989 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.793954 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83"} err="failed to get container status \"88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83\": rpc error: code = NotFound desc = could not find container \"88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83\": container with ID starting with 88942df55ed62f528b819b8c4ec231067d9752e9810e7d2d318bec21e1e5ad83 not found: ID does not exist"
Apr 22 14:26:54.793989 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.793968 2579 scope.go:117] "RemoveContainer" containerID="b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64"
Apr 22 14:26:54.801598 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.801467 2579 scope.go:117] "RemoveContainer" containerID="c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3"
Apr 22 14:26:54.803422 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.803397 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj"]
Apr 22 14:26:54.807859 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.807839 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-56f57765b4-c8xfj"]
Apr 22 14:26:54.809298 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.809282 2579 scope.go:117] "RemoveContainer" containerID="b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64"
Apr 22 14:26:54.809591 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:26:54.809572 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64\": container with ID starting with b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64 not found: ID does not exist" containerID="b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64"
Apr 22 14:26:54.809668 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.809600 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64"} err="failed to get container status \"b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64\": rpc error: code = NotFound desc = could not find container \"b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64\": container with ID starting with b644b7332ce5e05056a5a22afb1da00edab1c6234d50ecadf0d6275cb30adb64 not found: ID does not exist"
Apr 22 14:26:54.809668 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.809619 2579 scope.go:117] "RemoveContainer" containerID="c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3"
Apr 22 14:26:54.809846 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:26:54.809832 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3\": container with ID starting with c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3 not found: ID does not exist" containerID="c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3"
Apr 22 14:26:54.809896 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.809850 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3"} err="failed to get container status \"c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3\": rpc error: code = NotFound desc = could not find container \"c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3\": container with ID starting with c30c09a34f960e61cf493702a410dc83af8de9a891e1fc02f66bce2c873866b3 not found: ID does not exist"
Apr 22 14:26:54.820725 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.820692 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8"]
Apr 22 14:26:54.822825 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:54.822804 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6d86c7594-9vjr8"]
Apr 22 14:26:55.517501 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:55.517437 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" path="/var/lib/kubelet/pods/12046a98-409e-4b3f-acb6-380b2dfdac5a/volumes"
Apr 22 14:26:55.517875 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:26:55.517847 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" path="/var/lib/kubelet/pods/f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7/volumes"
Apr 22 14:27:01.622274 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:27:01.622239 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"
Apr 22 14:27:01.763940 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:27:01.763895 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 22 14:27:11.764081 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:27:11.764025 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 22 14:27:21.763524 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:27:21.763479 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 22 14:27:31.763178 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:27:31.763129 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 22 14:27:41.764734 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:27:41.764644 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"
Apr 22 14:30:53.443212 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:30:53.443182 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 14:30:53.444400 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:30:53.444378 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 14:35:34.488238 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.488200 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"]
Apr 22 14:35:34.488711 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.488421 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" containerID="cri-o://5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b" gracePeriod=30
Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.640870 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm"]
Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641693 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="storage-initializer"
Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641716 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="storage-initializer"
Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641739 2579 cpu_manager.go:401] "RemoveStaleState:
containerMap: removing container" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641747 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641779 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="storage-initializer" Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641788 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="storage-initializer" Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641817 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641825 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641963 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="12046a98-409e-4b3f-acb6-380b2dfdac5a" containerName="kserve-container" Apr 22 14:35:34.644496 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.641978 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8a6759e-3a03-4c29-91aa-f6fc1e0cc7c7" containerName="kserve-container" Apr 22 14:35:34.645704 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.645675 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" Apr 22 14:35:34.660217 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.660189 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" Apr 22 14:35:34.679252 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.679218 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm"] Apr 22 14:35:34.814554 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.814520 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm"] Apr 22 14:35:34.818293 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:35:34.818259 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea302b2_3e21_4ca0_b1df_eab23c178bde.slice/crio-d6ffb5d1bc7080777dc0a6f728818b79e411f9e30b59cdf82a661f44840c21a4 WatchSource:0}: Error finding container d6ffb5d1bc7080777dc0a6f728818b79e411f9e30b59cdf82a661f44840c21a4: Status 404 returned error can't find the container with id d6ffb5d1bc7080777dc0a6f728818b79e411f9e30b59cdf82a661f44840c21a4 Apr 22 14:35:34.820430 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:34.820406 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:35:35.493042 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:35.493010 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" event={"ID":"5ea302b2-3e21-4ca0-b1df-eab23c178bde","Type":"ContainerStarted","Data":"15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7"} Apr 22 14:35:35.493042 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:35.493047 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" event={"ID":"5ea302b2-3e21-4ca0-b1df-eab23c178bde","Type":"ContainerStarted","Data":"d6ffb5d1bc7080777dc0a6f728818b79e411f9e30b59cdf82a661f44840c21a4"} Apr 22 14:35:35.493547 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:35.493257 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" Apr 22 14:35:35.494786 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:35.494756 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:35:35.509423 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:35.509368 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" podStartSLOduration=1.509355513 podStartE2EDuration="1.509355513s" podCreationTimestamp="2026-04-22 14:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:35:35.508274023 +0000 UTC m=+1182.484239464" watchObservedRunningTime="2026-04-22 14:35:35.509355513 +0000 UTC m=+1182.485320953" Apr 22 14:35:36.496271 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:36.496221 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:35:37.736822 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:37.736798 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" Apr 22 14:35:38.503376 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:38.503341 2579 generic.go:358] "Generic (PLEG): container finished" podID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerID="5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b" exitCode=0 Apr 22 14:35:38.503584 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:38.503394 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" event={"ID":"0149eb89-a3dc-4a58-8e88-3460aef15834","Type":"ContainerDied","Data":"5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b"} Apr 22 14:35:38.503584 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:38.503404 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" Apr 22 14:35:38.503584 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:38.503420 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75" event={"ID":"0149eb89-a3dc-4a58-8e88-3460aef15834","Type":"ContainerDied","Data":"7ec217ec91b9c852e68a70b4b1695ec289b4c001e1d30b9a17645003d194611c"} Apr 22 14:35:38.503584 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:38.503437 2579 scope.go:117] "RemoveContainer" containerID="5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b" Apr 22 14:35:38.512154 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:38.512137 2579 scope.go:117] "RemoveContainer" containerID="5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b" Apr 22 14:35:38.512420 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:35:38.512399 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b\": container with ID starting with 5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b not found: ID does not exist" containerID="5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b" Apr 22 14:35:38.512531 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:38.512433 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b"} err="failed to get container status \"5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b\": rpc error: code = NotFound desc = could not find container \"5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b\": container with ID starting with 5b8a9db436a29c6a08b71b8bdc94a34a39f2275115cc1618da2fc3e26938be4b not found: ID does not exist" Apr 22 14:35:38.524493 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:38.524467 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"] Apr 22 14:35:38.527675 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:38.527651 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-58a83-predictor-85b474db6d-lpp75"] Apr 22 14:35:39.517835 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:39.517803 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" path="/var/lib/kubelet/pods/0149eb89-a3dc-4a58-8e88-3460aef15834/volumes" Apr 22 14:35:46.497055 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:46.497007 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:35:53.465961 
ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:53.465929 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:35:53.468144 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:53.468120 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:35:56.496635 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:35:56.496588 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:36:06.496547 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:06.496507 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:36:14.402648 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.402613 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"] Apr 22 14:36:14.403214 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.402885 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" containerID="cri-o://ea02e36139b5997353e2b2709b185ca8a13e7cfc0f7b246e28d5ca266f63b234" gracePeriod=30 Apr 22 14:36:14.428834 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.428798 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9"] Apr 22 14:36:14.429252 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.429233 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" Apr 22 14:36:14.429316 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.429254 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" Apr 22 14:36:14.429369 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.429324 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0149eb89-a3dc-4a58-8e88-3460aef15834" containerName="kserve-container" Apr 22 14:36:14.432377 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.432358 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" Apr 22 14:36:14.441792 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.441764 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9"] Apr 22 14:36:14.443548 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.443529 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" Apr 22 14:36:14.598406 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.596147 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9"] Apr 22 14:36:14.598878 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:36:14.598844 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod980ba7c7_8cc3_4f43_8d02_7158fb43bfe3.slice/crio-dd0ebdec511d15338980eea98b76117e0cb3a1583b54fc769757ec7e12a9f081 WatchSource:0}: Error finding container dd0ebdec511d15338980eea98b76117e0cb3a1583b54fc769757ec7e12a9f081: Status 404 returned error can't find the container with id dd0ebdec511d15338980eea98b76117e0cb3a1583b54fc769757ec7e12a9f081 Apr 22 14:36:14.618010 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:14.617979 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" event={"ID":"980ba7c7-8cc3-4f43-8d02-7158fb43bfe3","Type":"ContainerStarted","Data":"dd0ebdec511d15338980eea98b76117e0cb3a1583b54fc769757ec7e12a9f081"} Apr 22 14:36:15.622630 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:15.622580 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" event={"ID":"980ba7c7-8cc3-4f43-8d02-7158fb43bfe3","Type":"ContainerStarted","Data":"9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd"} Apr 22 14:36:15.623035 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:15.622798 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" Apr 22 14:36:15.624158 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:15.624130 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:36:15.638933 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:15.638887 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" podStartSLOduration=1.638872096 podStartE2EDuration="1.638872096s" podCreationTimestamp="2026-04-22 14:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:36:15.637744521 +0000 UTC m=+1222.613709963" watchObservedRunningTime="2026-04-22 14:36:15.638872096 +0000 UTC m=+1222.614837535" Apr 22 14:36:16.496983 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:16.496937 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 22 14:36:16.627026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:16.626987 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:36:17.632622 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:17.632591 2579 generic.go:358] "Generic (PLEG): container finished" podID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerID="ea02e36139b5997353e2b2709b185ca8a13e7cfc0f7b246e28d5ca266f63b234" exitCode=0 Apr 22 14:36:17.632984 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:17.632663 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" event={"ID":"71c4c9fd-84c0-47d4-ab0a-204be014b8d0","Type":"ContainerDied","Data":"ea02e36139b5997353e2b2709b185ca8a13e7cfc0f7b246e28d5ca266f63b234"} Apr 22 14:36:17.756137 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:17.756113 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" Apr 22 14:36:18.637130 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:18.637103 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" Apr 22 14:36:18.637612 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:18.637107 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv" event={"ID":"71c4c9fd-84c0-47d4-ab0a-204be014b8d0","Type":"ContainerDied","Data":"87c5164ae63ff4a8951cc3fb56d99334441d5c5e0d26cb3d5fe87551c0271469"} Apr 22 14:36:18.637612 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:18.637215 2579 scope.go:117] "RemoveContainer" containerID="ea02e36139b5997353e2b2709b185ca8a13e7cfc0f7b246e28d5ca266f63b234" Apr 22 14:36:18.658957 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:18.658921 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"] Apr 22 14:36:18.660417 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:18.660397 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1091a-predictor-579b887b74-bdjvv"] Apr 22 14:36:19.517434 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:19.517403 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" path="/var/lib/kubelet/pods/71c4c9fd-84c0-47d4-ab0a-204be014b8d0/volumes" Apr 22 
14:36:26.497627 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:26.497586 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" Apr 22 14:36:26.627462 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:26.627410 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:36:36.627574 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:36.627530 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:36:46.627662 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:46.627574 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 22 14:36:54.834512 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:54.834478 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm"] Apr 22 14:36:54.834905 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:54.834736 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container" containerID="cri-o://15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7" gracePeriod=30 Apr 22 
14:36:54.846431 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:54.846399 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm"] Apr 22 14:36:54.847444 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:54.847415 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" Apr 22 14:36:54.847444 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:54.847443 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" Apr 22 14:36:54.847657 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:54.847642 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="71c4c9fd-84c0-47d4-ab0a-204be014b8d0" containerName="kserve-container" Apr 22 14:36:54.856016 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:54.855982 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" Apr 22 14:36:54.856753 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:54.856726 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm"] Apr 22 14:36:54.867667 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:54.867646 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" Apr 22 14:36:55.030503 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:55.030475 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm"] Apr 22 14:36:55.033816 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:36:55.033778 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda174bb8d_ae21_4fd2_80c9_7369a13331e6.slice/crio-7ae8dda905664210d3679795dc9e965e45a49d94d709015f2bf59fa49c4c108f WatchSource:0}: Error finding container 7ae8dda905664210d3679795dc9e965e45a49d94d709015f2bf59fa49c4c108f: Status 404 returned error can't find the container with id 7ae8dda905664210d3679795dc9e965e45a49d94d709015f2bf59fa49c4c108f Apr 22 14:36:55.757875 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:55.757834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" event={"ID":"a174bb8d-ae21-4fd2-80c9-7369a13331e6","Type":"ContainerStarted","Data":"0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260"} Apr 22 14:36:55.757875 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:55.757881 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" event={"ID":"a174bb8d-ae21-4fd2-80c9-7369a13331e6","Type":"ContainerStarted","Data":"7ae8dda905664210d3679795dc9e965e45a49d94d709015f2bf59fa49c4c108f"} Apr 22 14:36:55.758109 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:55.758045 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" Apr 22 14:36:55.759373 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:55.759347 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 14:36:55.773017 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:55.772961 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" podStartSLOduration=1.772943444 podStartE2EDuration="1.772943444s" podCreationTimestamp="2026-04-22 14:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:36:55.77252908 +0000 UTC m=+1262.748494523" watchObservedRunningTime="2026-04-22 14:36:55.772943444 +0000 UTC m=+1262.748908886"
Apr 22 14:36:56.496400 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:56.496358 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 22 14:36:56.627162 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:56.627121 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 22 14:36:56.761440 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:56.761356 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 14:36:58.177362 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.177339 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm"
Apr 22 14:36:58.769032 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.768996 2579 generic.go:358] "Generic (PLEG): container finished" podID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerID="15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7" exitCode=0
Apr 22 14:36:58.769275 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.769077 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm"
Apr 22 14:36:58.769275 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.769077 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" event={"ID":"5ea302b2-3e21-4ca0-b1df-eab23c178bde","Type":"ContainerDied","Data":"15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7"}
Apr 22 14:36:58.769275 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.769115 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm" event={"ID":"5ea302b2-3e21-4ca0-b1df-eab23c178bde","Type":"ContainerDied","Data":"d6ffb5d1bc7080777dc0a6f728818b79e411f9e30b59cdf82a661f44840c21a4"}
Apr 22 14:36:58.769275 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.769129 2579 scope.go:117] "RemoveContainer" containerID="15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7"
Apr 22 14:36:58.777522 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.777505 2579 scope.go:117] "RemoveContainer" containerID="15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7"
Apr 22 14:36:58.777764 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:36:58.777740 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7\": container with ID starting with 15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7 not found: ID does not exist" containerID="15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7"
Apr 22 14:36:58.777833 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.777778 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7"} err="failed to get container status \"15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7\": rpc error: code = NotFound desc = could not find container \"15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7\": container with ID starting with 15008bbf5fe7f120ce30ca6d4a9d71255bafe766e11f7bde55980a063a2ec3e7 not found: ID does not exist"
Apr 22 14:36:58.789575 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.789552 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm"]
Apr 22 14:36:58.793139 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:58.793118 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dee6f-predictor-7d88946f7d-b9mxm"]
Apr 22 14:36:59.517510 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:36:59.517475 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" path="/var/lib/kubelet/pods/5ea302b2-3e21-4ca0-b1df-eab23c178bde/volumes"
Apr 22 14:37:06.628271 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:06.628238 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9"
Apr 22 14:37:06.762054 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:06.762006 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 14:37:16.761473 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:16.761415 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 14:37:26.761863 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:26.761814 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 14:37:34.635010 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.634930 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9"]
Apr 22 14:37:34.635522 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.635255 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container" containerID="cri-o://9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd" gracePeriod=30
Apr 22 14:37:34.668234 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.668201 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"]
Apr 22 14:37:34.668698 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.668675 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container"
Apr 22 14:37:34.668698 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.668697 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container"
Apr 22 14:37:34.668877 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.668788 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ea302b2-3e21-4ca0-b1df-eab23c178bde" containerName="kserve-container"
Apr 22 14:37:34.672047 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.672022 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"
Apr 22 14:37:34.680348 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.680318 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"]
Apr 22 14:37:34.687611 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.687580 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"
Apr 22 14:37:34.842396 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.842358 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"]
Apr 22 14:37:34.847127 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:37:34.847088 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8774659b_7dd0_4ab9_9cce_0e1114f33e93.slice/crio-3e23e46e81b22d493e421d0d5f7f29caee8e5b3a8715c3e6814ae104862837c8 WatchSource:0}: Error finding container 3e23e46e81b22d493e421d0d5f7f29caee8e5b3a8715c3e6814ae104862837c8: Status 404 returned error can't find the container with id 3e23e46e81b22d493e421d0d5f7f29caee8e5b3a8715c3e6814ae104862837c8
Apr 22 14:37:34.888993 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:34.888948 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" event={"ID":"8774659b-7dd0-4ab9-9cce-0e1114f33e93","Type":"ContainerStarted","Data":"3e23e46e81b22d493e421d0d5f7f29caee8e5b3a8715c3e6814ae104862837c8"}
Apr 22 14:37:35.894019 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:35.893981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" event={"ID":"8774659b-7dd0-4ab9-9cce-0e1114f33e93","Type":"ContainerStarted","Data":"f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f"}
Apr 22 14:37:35.894413 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:35.894180 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"
Apr 22 14:37:35.895467 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:35.895424 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 14:37:35.911336 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:35.911280 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" podStartSLOduration=1.911263876 podStartE2EDuration="1.911263876s" podCreationTimestamp="2026-04-22 14:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:37:35.910403472 +0000 UTC m=+1302.886368925" watchObservedRunningTime="2026-04-22 14:37:35.911263876 +0000 UTC m=+1302.887229315"
Apr 22 14:37:36.628012 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:36.627967 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 22 14:37:36.762506 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:36.762438 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 14:37:36.897518 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:36.897401 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 14:37:38.586974 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.586949 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9"
Apr 22 14:37:38.904998 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.904956 2579 generic.go:358] "Generic (PLEG): container finished" podID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerID="9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd" exitCode=0
Apr 22 14:37:38.905186 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.905006 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" event={"ID":"980ba7c7-8cc3-4f43-8d02-7158fb43bfe3","Type":"ContainerDied","Data":"9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd"}
Apr 22 14:37:38.905186 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.905026 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9"
Apr 22 14:37:38.905186 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.905038 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9" event={"ID":"980ba7c7-8cc3-4f43-8d02-7158fb43bfe3","Type":"ContainerDied","Data":"dd0ebdec511d15338980eea98b76117e0cb3a1583b54fc769757ec7e12a9f081"}
Apr 22 14:37:38.905186 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.905058 2579 scope.go:117] "RemoveContainer" containerID="9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd"
Apr 22 14:37:38.913122 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.913099 2579 scope.go:117] "RemoveContainer" containerID="9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd"
Apr 22 14:37:38.913379 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:37:38.913356 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd\": container with ID starting with 9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd not found: ID does not exist" containerID="9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd"
Apr 22 14:37:38.913445 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.913392 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd"} err="failed to get container status \"9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd\": rpc error: code = NotFound desc = could not find container \"9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd\": container with ID starting with 9ea8d51175409795743d88ba0aae703db22f47905929a1fd68e243f1181595cd not found: ID does not exist"
Apr 22 14:37:38.925591 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.925562 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9"]
Apr 22 14:37:38.928776 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:38.928749 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-33e8e-predictor-595856cb86-2rfh9"]
Apr 22 14:37:39.517461 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:39.517412 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" path="/var/lib/kubelet/pods/980ba7c7-8cc3-4f43-8d02-7158fb43bfe3/volumes"
Apr 22 14:37:46.763331 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:46.763294 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm"
Apr 22 14:37:46.898099 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:46.898055 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 14:37:56.897543 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:37:56.897483 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 14:38:06.897976 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:38:06.897931 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 14:38:16.898328 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:38:16.898231 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 22 14:38:26.898276 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:38:26.898245 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"
Apr 22 14:40:53.496980 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:40:53.496952 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 14:40:53.504103 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:40:53.504063 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 14:45:53.526741 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:45:53.526638 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 14:45:53.531226 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:45:53.531203 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 14:46:19.948969 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:19.948926 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm"]
Apr 22 14:46:19.950044 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:19.949224 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container" containerID="cri-o://0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260" gracePeriod=30
Apr 22 14:46:20.030718 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.025662 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"]
Apr 22 14:46:20.030718 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.026043 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container"
Apr 22 14:46:20.030718 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.026055 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container"
Apr 22 14:46:20.030718 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.026107 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="980ba7c7-8cc3-4f43-8d02-7158fb43bfe3" containerName="kserve-container"
Apr 22 14:46:20.030718 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.029109 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"
Apr 22 14:46:20.041695 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.041668 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"
Apr 22 14:46:20.054407 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.054331 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"]
Apr 22 14:46:20.202501 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.202432 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"]
Apr 22 14:46:20.205195 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:46:20.205163 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30622015_c7f1_47a6_9b54_e7ed8a2820cd.slice/crio-f4638e08d1c077aae450b43932dc6404100d2a187eebe2b92a6eeb7739b7dce3 WatchSource:0}: Error finding container f4638e08d1c077aae450b43932dc6404100d2a187eebe2b92a6eeb7739b7dce3: Status 404 returned error can't find the container with id f4638e08d1c077aae450b43932dc6404100d2a187eebe2b92a6eeb7739b7dce3
Apr 22 14:46:20.207058 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.207040 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:46:20.617191 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.617102 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" event={"ID":"30622015-c7f1-47a6-9b54-e7ed8a2820cd","Type":"ContainerStarted","Data":"18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9"}
Apr 22 14:46:20.617191 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.617138 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" event={"ID":"30622015-c7f1-47a6-9b54-e7ed8a2820cd","Type":"ContainerStarted","Data":"f4638e08d1c077aae450b43932dc6404100d2a187eebe2b92a6eeb7739b7dce3"}
Apr 22 14:46:20.617427 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.617283 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"
Apr 22 14:46:20.618576 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.618543 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:46:20.633483 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:20.633419 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" podStartSLOduration=1.633403122 podStartE2EDuration="1.633403122s" podCreationTimestamp="2026-04-22 14:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:46:20.631769044 +0000 UTC m=+1827.607734484" watchObservedRunningTime="2026-04-22 14:46:20.633403122 +0000 UTC m=+1827.609368631"
Apr 22 14:46:21.620594 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:21.620554 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:46:23.196092 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.196069 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm"
Apr 22 14:46:23.629220 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.629182 2579 generic.go:358] "Generic (PLEG): container finished" podID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerID="0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260" exitCode=0
Apr 22 14:46:23.629389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.629241 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" event={"ID":"a174bb8d-ae21-4fd2-80c9-7369a13331e6","Type":"ContainerDied","Data":"0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260"}
Apr 22 14:46:23.629389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.629266 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm" event={"ID":"a174bb8d-ae21-4fd2-80c9-7369a13331e6","Type":"ContainerDied","Data":"7ae8dda905664210d3679795dc9e965e45a49d94d709015f2bf59fa49c4c108f"}
Apr 22 14:46:23.629389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.629265 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm"
Apr 22 14:46:23.629389 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.629279 2579 scope.go:117] "RemoveContainer" containerID="0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260"
Apr 22 14:46:23.637067 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.637053 2579 scope.go:117] "RemoveContainer" containerID="0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260"
Apr 22 14:46:23.637300 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:46:23.637282 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260\": container with ID starting with 0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260 not found: ID does not exist" containerID="0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260"
Apr 22 14:46:23.637360 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.637308 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260"} err="failed to get container status \"0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260\": rpc error: code = NotFound desc = could not find container \"0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260\": container with ID starting with 0e0dc7c669c3dff2e3d2e10b304989642e651c7fe37ec1a8d8be4346ccdf9260 not found: ID does not exist"
Apr 22 14:46:23.646753 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.646721 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm"]
Apr 22 14:46:23.648521 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:23.648499 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4547e-predictor-776ddf9848-mm7qm"]
Apr 22 14:46:25.517255 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:25.517210 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" path="/var/lib/kubelet/pods/a174bb8d-ae21-4fd2-80c9-7369a13331e6/volumes"
Apr 22 14:46:31.620988 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:31.620941 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:46:41.621176 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:41.621127 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:46:51.621529 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:51.621485 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:46:59.753498 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.752858 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz"]
Apr 22 14:46:59.753498 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.753397 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container"
Apr 22 14:46:59.753498 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.753417 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container"
Apr 22 14:46:59.754638 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.754247 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a174bb8d-ae21-4fd2-80c9-7369a13331e6" containerName="kserve-container"
Apr 22 14:46:59.758300 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.758277 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz"
Apr 22 14:46:59.768860 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.768064 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz"]
Apr 22 14:46:59.773863 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.773838 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz"
Apr 22 14:46:59.804713 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.804595 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"]
Apr 22 14:46:59.804962 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.804932 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" containerID="cri-o://f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f" gracePeriod=30
Apr 22 14:46:59.927479 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:46:59.927421 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz"]
Apr 22 14:46:59.930409 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:46:59.930370 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4772875d_82e0_4ea5_8a80_cc629fcd4679.slice/crio-9a0c7367c358e292409a2b4bd316fe8d18f8cd0b42328ab66d711288843fe440 WatchSource:0}: Error finding container 9a0c7367c358e292409a2b4bd316fe8d18f8cd0b42328ab66d711288843fe440: Status 404 returned error can't find the container with id 9a0c7367c358e292409a2b4bd316fe8d18f8cd0b42328ab66d711288843fe440
Apr 22 14:47:00.752022 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:00.751985 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" event={"ID":"4772875d-82e0-4ea5-8a80-cc629fcd4679","Type":"ContainerStarted","Data":"a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1"}
Apr 22 14:47:00.752022 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:00.752029 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" event={"ID":"4772875d-82e0-4ea5-8a80-cc629fcd4679","Type":"ContainerStarted","Data":"9a0c7367c358e292409a2b4bd316fe8d18f8cd0b42328ab66d711288843fe440"}
Apr 22 14:47:00.752337 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:00.752315 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz"
Apr 22 14:47:00.753711 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:00.753683 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 22 14:47:00.770541 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:00.770486 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" podStartSLOduration=1.770440937 podStartE2EDuration="1.770440937s" podCreationTimestamp="2026-04-22 14:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:47:00.77010821 +0000 UTC m=+1867.746073677" watchObservedRunningTime="2026-04-22 14:47:00.770440937 +0000 UTC m=+1867.746406379"
Apr 22 14:47:01.621156 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:01.621108 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 14:47:01.754929 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:01.754890 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 22 14:47:03.151187 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.151164 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"
Apr 22 14:47:03.762328 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.762291 2579 generic.go:358] "Generic (PLEG): container finished" podID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerID="f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f" exitCode=0
Apr 22 14:47:03.762540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.762359 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"
Apr 22 14:47:03.762540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.762367 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" event={"ID":"8774659b-7dd0-4ab9-9cce-0e1114f33e93","Type":"ContainerDied","Data":"f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f"}
Apr 22 14:47:03.762540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.762396 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5" event={"ID":"8774659b-7dd0-4ab9-9cce-0e1114f33e93","Type":"ContainerDied","Data":"3e23e46e81b22d493e421d0d5f7f29caee8e5b3a8715c3e6814ae104862837c8"}
Apr 22 14:47:03.762540 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.762414 2579 scope.go:117] "RemoveContainer" containerID="f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f"
Apr 22 14:47:03.770165 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.770143 2579 scope.go:117] "RemoveContainer" containerID="f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f"
Apr 22 14:47:03.770406 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:47:03.770388 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f\": container with ID starting with f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f not found: ID does not exist" containerID="f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f"
Apr 22 14:47:03.770488 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.770420 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f"} err="failed to get container status \"f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f\": rpc error: code = NotFound desc = could not find container \"f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f\": container with ID starting with f050ecf4d0e3d68922ddd1a1c4f1ab4385afcebbab7abc37cffb719cb835857f not found: ID does not exist"
Apr 22 14:47:03.790192 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.790086 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"]
Apr 22 14:47:03.799168 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:03.799141 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0eb25-predictor-bf4c6c446-wvzg5"]
Apr 22 14:47:05.517325 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:05.517277 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" path="/var/lib/kubelet/pods/8774659b-7dd0-4ab9-9cce-0e1114f33e93/volumes"
Apr 22 14:47:11.621283 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:11.621193 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"
Apr 22 14:47:11.755015 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:11.754972 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 22 14:47:21.755339 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:21.755294 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection 
refused" Apr 22 14:47:31.755567 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:31.755526 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 14:47:40.324095 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.324051 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r"] Apr 22 14:47:40.324684 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.324661 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" Apr 22 14:47:40.324769 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.324685 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" Apr 22 14:47:40.324836 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.324779 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8774659b-7dd0-4ab9-9cce-0e1114f33e93" containerName="kserve-container" Apr 22 14:47:40.327510 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.327487 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" Apr 22 14:47:40.337585 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.337562 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" Apr 22 14:47:40.342890 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.342861 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r"] Apr 22 14:47:40.375623 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.375587 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"] Apr 22 14:47:40.376180 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.376124 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container" containerID="cri-o://18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9" gracePeriod=30 Apr 22 14:47:40.509075 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.509045 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r"] Apr 22 14:47:40.511437 ip-10-0-130-255 kubenswrapper[2579]: W0422 14:47:40.511404 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8230180_72c1_4546_a36a_6c222a313b31.slice/crio-ccd87ac49cd3413ac13f45019eb0c8cfc6de0201a5ef3817fa17a56a6a2bffad WatchSource:0}: Error finding container ccd87ac49cd3413ac13f45019eb0c8cfc6de0201a5ef3817fa17a56a6a2bffad: Status 404 returned error can't find the container with id ccd87ac49cd3413ac13f45019eb0c8cfc6de0201a5ef3817fa17a56a6a2bffad Apr 22 14:47:40.886580 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.886542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" 
event={"ID":"a8230180-72c1-4546-a36a-6c222a313b31","Type":"ContainerStarted","Data":"57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536"} Apr 22 14:47:40.886580 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.886581 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" event={"ID":"a8230180-72c1-4546-a36a-6c222a313b31","Type":"ContainerStarted","Data":"ccd87ac49cd3413ac13f45019eb0c8cfc6de0201a5ef3817fa17a56a6a2bffad"} Apr 22 14:47:40.886783 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.886720 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" Apr 22 14:47:40.887970 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.887947 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 14:47:40.903436 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:40.903339 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" podStartSLOduration=0.903322989 podStartE2EDuration="903.322989ms" podCreationTimestamp="2026-04-22 14:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:47:40.902880209 +0000 UTC m=+1907.878845651" watchObservedRunningTime="2026-04-22 14:47:40.903322989 +0000 UTC m=+1907.879288423" Apr 22 14:47:41.621052 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:41.621003 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" 
podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 14:47:41.755338 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:41.755294 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 14:47:41.890474 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:41.890355 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 14:47:43.631407 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.631380 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" Apr 22 14:47:43.896967 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.896880 2579 generic.go:358] "Generic (PLEG): container finished" podID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerID="18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9" exitCode=0 Apr 22 14:47:43.896967 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.896938 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" Apr 22 14:47:43.896967 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.896952 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" event={"ID":"30622015-c7f1-47a6-9b54-e7ed8a2820cd","Type":"ContainerDied","Data":"18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9"} Apr 22 14:47:43.897223 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.896985 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj" event={"ID":"30622015-c7f1-47a6-9b54-e7ed8a2820cd","Type":"ContainerDied","Data":"f4638e08d1c077aae450b43932dc6404100d2a187eebe2b92a6eeb7739b7dce3"} Apr 22 14:47:43.897223 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.897004 2579 scope.go:117] "RemoveContainer" containerID="18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9" Apr 22 14:47:43.907163 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.906073 2579 scope.go:117] "RemoveContainer" containerID="18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9" Apr 22 14:47:43.907163 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:47:43.906412 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9\": container with ID starting with 18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9 not found: ID does not exist" containerID="18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9" Apr 22 14:47:43.907163 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.906445 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9"} err="failed to get container status 
\"18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9\": rpc error: code = NotFound desc = could not find container \"18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9\": container with ID starting with 18a43ca2fc48c528418a168fb8800c5c241dc9e22de1fc8a2fa14f1565ce48b9 not found: ID does not exist" Apr 22 14:47:43.922270 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.922240 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"] Apr 22 14:47:43.926634 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:43.926607 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f4200-predictor-6cc4586c58-bvqcj"] Apr 22 14:47:45.517690 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:45.517658 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" path="/var/lib/kubelet/pods/30622015-c7f1-47a6-9b54-e7ed8a2820cd/volumes" Apr 22 14:47:51.756576 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:51.756537 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" Apr 22 14:47:51.890527 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:47:51.890482 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 14:48:01.891404 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:48:01.891351 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection 
refused" Apr 22 14:48:11.890698 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:48:11.890642 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 14:48:21.890510 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:48:21.890439 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 14:48:31.891292 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:48:31.891261 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" Apr 22 14:50:53.552474 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:50:53.552343 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:50:53.557653 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:50:53.557632 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:55:53.577125 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:55:53.577020 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 14:55:53.583309 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:55:53.583288 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 
22 14:57:05.034772 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:05.034736 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r"] Apr 22 14:57:05.035257 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:05.034971 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container" containerID="cri-o://57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536" gracePeriod=30 Apr 22 14:57:08.293284 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.293258 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" Apr 22 14:57:08.764026 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.763990 2579 generic.go:358] "Generic (PLEG): container finished" podID="a8230180-72c1-4546-a36a-6c222a313b31" containerID="57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536" exitCode=0 Apr 22 14:57:08.764204 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.764058 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" Apr 22 14:57:08.764204 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.764084 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" event={"ID":"a8230180-72c1-4546-a36a-6c222a313b31","Type":"ContainerDied","Data":"57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536"} Apr 22 14:57:08.764204 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.764125 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r" event={"ID":"a8230180-72c1-4546-a36a-6c222a313b31","Type":"ContainerDied","Data":"ccd87ac49cd3413ac13f45019eb0c8cfc6de0201a5ef3817fa17a56a6a2bffad"} Apr 22 14:57:08.764204 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.764143 2579 scope.go:117] "RemoveContainer" containerID="57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536" Apr 22 14:57:08.772886 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.772865 2579 scope.go:117] "RemoveContainer" containerID="57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536" Apr 22 14:57:08.773185 ip-10-0-130-255 kubenswrapper[2579]: E0422 14:57:08.773165 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536\": container with ID starting with 57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536 not found: ID does not exist" containerID="57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536" Apr 22 14:57:08.773228 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.773198 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536"} err="failed to get container status 
\"57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536\": rpc error: code = NotFound desc = could not find container \"57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536\": container with ID starting with 57b80d4a770f915f2496679e1cc09112e71a1f6de295e6ec41ef7feca9370536 not found: ID does not exist" Apr 22 14:57:08.789463 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.789420 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r"] Apr 22 14:57:08.792722 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:08.792699 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-055dc-predictor-757d8d68d-tlq9r"] Apr 22 14:57:09.517249 ip-10-0-130-255 kubenswrapper[2579]: I0422 14:57:09.517205 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8230180-72c1-4546-a36a-6c222a313b31" path="/var/lib/kubelet/pods/a8230180-72c1-4546-a36a-6c222a313b31/volumes" Apr 22 15:00:53.599321 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:00:53.599213 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 15:00:53.606291 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:00:53.606269 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log" Apr 22 15:04:29.326795 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:29.326758 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz"] Apr 22 15:04:29.327287 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:29.327009 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" 
podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container" containerID="cri-o://a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1" gracePeriod=30 Apr 22 15:04:31.754952 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:31.754906 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 15:04:32.567663 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:32.567637 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" Apr 22 15:04:33.221035 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.220999 2579 generic.go:358] "Generic (PLEG): container finished" podID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerID="a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1" exitCode=0 Apr 22 15:04:33.221474 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.221066 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" Apr 22 15:04:33.221474 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.221084 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" event={"ID":"4772875d-82e0-4ea5-8a80-cc629fcd4679","Type":"ContainerDied","Data":"a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1"} Apr 22 15:04:33.221474 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.221121 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz" event={"ID":"4772875d-82e0-4ea5-8a80-cc629fcd4679","Type":"ContainerDied","Data":"9a0c7367c358e292409a2b4bd316fe8d18f8cd0b42328ab66d711288843fe440"} Apr 22 15:04:33.221474 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.221143 2579 scope.go:117] "RemoveContainer" containerID="a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1" Apr 22 15:04:33.229655 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.229632 2579 scope.go:117] "RemoveContainer" containerID="a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1" Apr 22 15:04:33.229950 ip-10-0-130-255 kubenswrapper[2579]: E0422 15:04:33.229928 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1\": container with ID starting with a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1 not found: ID does not exist" containerID="a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1" Apr 22 15:04:33.230014 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.229958 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1"} err="failed to get container status 
\"a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1\": rpc error: code = NotFound desc = could not find container \"a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1\": container with ID starting with a1eb316ff78ceb52fa4df7becc5929463091c4c42e50e4dd5c4dbbff9f9c11b1 not found: ID does not exist" Apr 22 15:04:33.244644 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.244610 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz"] Apr 22 15:04:33.248582 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.248553 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4e291-predictor-794cfddb48-kvpsz"] Apr 22 15:04:33.517762 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:04:33.517683 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" path="/var/lib/kubelet/pods/4772875d-82e0-4ea5-8a80-cc629fcd4679/volumes" Apr 22 15:05:01.403251 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:01.403218 2579 ???:1] "http: TLS handshake error from 10.0.130.255:60682: EOF" Apr 22 15:05:01.411603 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:01.411574 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hfm6z_a00d0fb2-d626-4537-b0c0-59d6dc7c1b06/global-pull-secret-syncer/0.log" Apr 22 15:05:01.712767 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:01.712732 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l652r_f9ffd782-8eeb-457e-afbf-6985dab9b00d/konnectivity-agent/0.log" Apr 22 15:05:01.819466 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:01.819414 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-255.ec2.internal_a7f3a71818e5c0ac42d79312a43ec6c3/haproxy/0.log" Apr 22 15:05:04.826013 ip-10-0-130-255 kubenswrapper[2579]: I0422 
15:05:04.825982 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31dc4779-192f-40e7-acf5-dcf542039bd6/alertmanager/0.log" Apr 22 15:05:04.855298 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:04.855267 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31dc4779-192f-40e7-acf5-dcf542039bd6/config-reloader/0.log" Apr 22 15:05:04.883644 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:04.883606 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31dc4779-192f-40e7-acf5-dcf542039bd6/kube-rbac-proxy-web/0.log" Apr 22 15:05:04.913959 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:04.913932 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31dc4779-192f-40e7-acf5-dcf542039bd6/kube-rbac-proxy/0.log" Apr 22 15:05:04.942388 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:04.942361 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31dc4779-192f-40e7-acf5-dcf542039bd6/kube-rbac-proxy-metric/0.log" Apr 22 15:05:04.968143 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:04.968118 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31dc4779-192f-40e7-acf5-dcf542039bd6/prom-label-proxy/0.log" Apr 22 15:05:04.998033 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:04.997999 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31dc4779-192f-40e7-acf5-dcf542039bd6/init-config-reloader/0.log" Apr 22 15:05:05.069865 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:05.069832 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9zhfn_36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f/kube-state-metrics/0.log" Apr 22 15:05:05.099543 ip-10-0-130-255 
kubenswrapper[2579]: I0422 15:05:05.099468 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9zhfn_36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f/kube-rbac-proxy-main/0.log"
Apr 22 15:05:05.125647 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:05.125616 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9zhfn_36e0bfc7-28c8-4bec-bcaa-5e1e03b1566f/kube-rbac-proxy-self/0.log"
Apr 22 15:05:05.162248 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:05.162220 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-67cfb96f85-ctd8m_2aaec84b-b5bb-45a2-9797-35532daa757c/metrics-server/0.log"
Apr 22 15:05:05.341686 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:05.341659 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nxqcw_c4ab8bf9-0890-4bcc-af6a-32e0e67eed93/node-exporter/0.log"
Apr 22 15:05:05.370885 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:05.370797 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nxqcw_c4ab8bf9-0890-4bcc-af6a-32e0e67eed93/kube-rbac-proxy/0.log"
Apr 22 15:05:05.398787 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:05.398696 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nxqcw_c4ab8bf9-0890-4bcc-af6a-32e0e67eed93/init-textfile/0.log"
Apr 22 15:05:05.946855 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:05.946818 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-46kdf_07ac2127-6b42-4a9f-88f5-cc04565a32dd/prometheus-operator/0.log"
Apr 22 15:05:05.975884 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:05.975858 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-46kdf_07ac2127-6b42-4a9f-88f5-cc04565a32dd/kube-rbac-proxy/0.log"
Apr 22 15:05:08.121534 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121500 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"]
Apr 22 15:05:08.121914 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121827 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container"
Apr 22 15:05:08.121914 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121838 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container"
Apr 22 15:05:08.121914 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121858 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container"
Apr 22 15:05:08.121914 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121864 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container"
Apr 22 15:05:08.121914 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121871 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container"
Apr 22 15:05:08.121914 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121877 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container"
Apr 22 15:05:08.122101 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121928 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4772875d-82e0-4ea5-8a80-cc629fcd4679" containerName="kserve-container"
Apr 22 15:05:08.122101 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121937 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="30622015-c7f1-47a6-9b54-e7ed8a2820cd" containerName="kserve-container"
Apr 22 15:05:08.122101 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.121947 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8230180-72c1-4546-a36a-6c222a313b31" containerName="kserve-container"
Apr 22 15:05:08.125106 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.125082 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.127789 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.127765 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zbnb6\"/\"kube-root-ca.crt\""
Apr 22 15:05:08.127925 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.127767 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zbnb6\"/\"default-dockercfg-g9xjx\""
Apr 22 15:05:08.128769 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.128752 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zbnb6\"/\"openshift-service-ca.crt\""
Apr 22 15:05:08.136577 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.136556 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"]
Apr 22 15:05:08.170253 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.170218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-proc\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.170420 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.170259 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-sys\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.170420 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.170279 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vkts\" (UniqueName: \"kubernetes.io/projected/6a50596f-82aa-45be-9c26-c50cc4e90d52-kube-api-access-9vkts\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.170420 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.170354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-lib-modules\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.170420 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.170408 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-podres\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.177353 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.177310 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-655c8696c9-49h4h_3d898a11-5438-403f-8954-683e2aa1a385/console/0.log"
Apr 22 15:05:08.231943 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.231867 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-4cwk9_17fffa76-898d-4138-9000-2a55d4ce4960/download-server/0.log"
Apr 22 15:05:08.271519 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.271482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-proc\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.271681 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.271534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-sys\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.271681 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.271562 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vkts\" (UniqueName: \"kubernetes.io/projected/6a50596f-82aa-45be-9c26-c50cc4e90d52-kube-api-access-9vkts\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.271681 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.271589 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-proc\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.271681 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.271606 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-lib-modules\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.271681 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.271624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-sys\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.271681 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.271644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-podres\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.271890 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.271742 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-lib-modules\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.271890 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.271751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6a50596f-82aa-45be-9c26-c50cc4e90d52-podres\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.280664 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.280636 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vkts\" (UniqueName: \"kubernetes.io/projected/6a50596f-82aa-45be-9c26-c50cc4e90d52-kube-api-access-9vkts\") pod \"perf-node-gather-daemonset-22ll6\" (UID: \"6a50596f-82aa-45be-9c26-c50cc4e90d52\") " pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.435520 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.435407 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:08.558530 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.558500 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"]
Apr 22 15:05:08.561170 ip-10-0-130-255 kubenswrapper[2579]: W0422 15:05:08.561143 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6a50596f_82aa_45be_9c26_c50cc4e90d52.slice/crio-eeb172de2a50b106a854349eabc124b78ee6a0202a68d253bb82e830bb50ddc5 WatchSource:0}: Error finding container eeb172de2a50b106a854349eabc124b78ee6a0202a68d253bb82e830bb50ddc5: Status 404 returned error can't find the container with id eeb172de2a50b106a854349eabc124b78ee6a0202a68d253bb82e830bb50ddc5
Apr 22 15:05:08.562757 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:08.562742 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:05:09.338785 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:09.338743 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6" event={"ID":"6a50596f-82aa-45be-9c26-c50cc4e90d52","Type":"ContainerStarted","Data":"3702eb4e1c18348bd500983273d10885b337bf21dae5dfb450bd131d83632d39"}
Apr 22 15:05:09.338785 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:09.338784 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6" event={"ID":"6a50596f-82aa-45be-9c26-c50cc4e90d52","Type":"ContainerStarted","Data":"eeb172de2a50b106a854349eabc124b78ee6a0202a68d253bb82e830bb50ddc5"}
Apr 22 15:05:09.339306 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:09.338837 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:09.358597 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:09.358547 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6" podStartSLOduration=1.358531232 podStartE2EDuration="1.358531232s" podCreationTimestamp="2026-04-22 15:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:05:09.356969231 +0000 UTC m=+2956.332934670" watchObservedRunningTime="2026-04-22 15:05:09.358531232 +0000 UTC m=+2956.334496672"
Apr 22 15:05:09.437442 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:09.437405 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bjv7t_7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69/dns/0.log"
Apr 22 15:05:09.468026 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:09.467993 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bjv7t_7ca0f0f5-3ecb-4a3e-ab6c-6cce9ed1ed69/kube-rbac-proxy/0.log"
Apr 22 15:05:09.618324 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:09.618228 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9gzdh_b6ddc308-4c95-4675-8dae-ecfb2c08991e/dns-node-resolver/0.log"
Apr 22 15:05:10.184152 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:10.184119 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7ff5958d8f-z6c4r_79d69b78-10e2-4e01-9575-73c2f9dcfc79/registry/0.log"
Apr 22 15:05:10.226467 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:10.226394 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2wcll_ac56b2b8-1a22-40f8-a9b6-5a85a4f1f547/node-ca/0.log"
Apr 22 15:05:11.574334 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:11.574296 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8dbtw_15b22104-ccf3-493a-b1a4-c4984e2ead3b/serve-healthcheck-canary/0.log"
Apr 22 15:05:12.212795 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:12.212763 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ws9fs_2a82bf3e-7d0c-4072-926d-b4ef0afaa57f/kube-rbac-proxy/0.log"
Apr 22 15:05:12.238662 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:12.238632 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ws9fs_2a82bf3e-7d0c-4072-926d-b4ef0afaa57f/exporter/0.log"
Apr 22 15:05:12.273370 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:12.273345 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ws9fs_2a82bf3e-7d0c-4072-926d-b4ef0afaa57f/extractor/0.log"
Apr 22 15:05:14.290728 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:14.290691 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-66cf78b85b-wbnkc_0f64bfa0-4d5f-49ea-9bc6-02620079ddc5/manager/0.log"
Apr 22 15:05:14.344748 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:14.344713 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-64kfj_a0f76e86-1dc2-4c14-9764-5ab70dd7d62f/server/0.log"
Apr 22 15:05:14.795186 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:14.795155 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-5q956_26ce42c1-27d0-4bf4-8dd5-28989480af95/manager/0.log"
Apr 22 15:05:14.861234 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:14.861204 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-ncwcf_e4b14417-7924-4bcc-9c8e-87a4156ca1ec/seaweedfs/0.log"
Apr 22 15:05:15.352151 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:15.352125 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zbnb6/perf-node-gather-daemonset-22ll6"
Apr 22 15:05:20.959031 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:20.958992 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-886zv_b22e95d1-d9a7-4282-b3bd-1e77a23c5db0/kube-multus/0.log"
Apr 22 15:05:21.000392 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:21.000362 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5v8nc_bfa3b731-8569-4e90-9bb7-4b1898b25baa/kube-multus-additional-cni-plugins/0.log"
Apr 22 15:05:21.039888 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:21.039861 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5v8nc_bfa3b731-8569-4e90-9bb7-4b1898b25baa/egress-router-binary-copy/0.log"
Apr 22 15:05:21.084125 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:21.084092 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5v8nc_bfa3b731-8569-4e90-9bb7-4b1898b25baa/cni-plugins/0.log"
Apr 22 15:05:21.123645 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:21.123608 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5v8nc_bfa3b731-8569-4e90-9bb7-4b1898b25baa/bond-cni-plugin/0.log"
Apr 22 15:05:21.163433 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:21.163398 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5v8nc_bfa3b731-8569-4e90-9bb7-4b1898b25baa/routeoverride-cni/0.log"
Apr 22 15:05:21.208338 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:21.208307 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5v8nc_bfa3b731-8569-4e90-9bb7-4b1898b25baa/whereabouts-cni-bincopy/0.log"
Apr 22 15:05:21.253398 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:21.253324 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5v8nc_bfa3b731-8569-4e90-9bb7-4b1898b25baa/whereabouts-cni/0.log"
Apr 22 15:05:21.905908 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:21.905874 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9f46g_11aa4ec9-8bc6-4d4d-b02e-6441451ad30d/network-metrics-daemon/0.log"
Apr 22 15:05:21.953879 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:21.953845 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9f46g_11aa4ec9-8bc6-4d4d-b02e-6441451ad30d/kube-rbac-proxy/0.log"
Apr 22 15:05:23.641043 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:23.641012 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-controller/0.log"
Apr 22 15:05:23.674404 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:23.674369 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/0.log"
Apr 22 15:05:23.699420 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:23.699383 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovn-acl-logging/1.log"
Apr 22 15:05:23.730867 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:23.730834 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/kube-rbac-proxy-node/0.log"
Apr 22 15:05:23.756646 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:23.756615 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 15:05:23.784253 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:23.784225 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/northd/0.log"
Apr 22 15:05:23.812954 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:23.812924 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/nbdb/0.log"
Apr 22 15:05:23.853393 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:23.853367 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/sbdb/0.log"
Apr 22 15:05:23.992053 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:23.992012 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ttkqb_6a4ee0b1-bb53-4f98-96c3-633c3ae1dd9f/ovnkube-controller/0.log"
Apr 22 15:05:25.232017 ip-10-0-130-255 kubenswrapper[2579]: I0422 15:05:25.231979 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2wdfd_ace85c9c-a18e-408d-b8d6-d5d2d99bc11a/network-check-target-container/0.log"