Apr 17 16:28:45.203343 ip-10-0-128-217 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:28:45.203354 ip-10-0-128-217 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:28:45.203362 ip-10-0-128-217 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:28:45.203592 ip-10-0-128-217 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:28:55.303324 ip-10-0-128-217 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:28:55.303343 ip-10-0-128-217 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4b9dc2d2522c44539d7b29466a060ff8 --
Apr 17 16:31:22.565209 ip-10-0-128-217 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:22.932569 ip-10-0-128-217 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:22.932569 ip-10-0-128-217 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:22.932569 ip-10-0-128-217 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:22.932569 ip-10-0-128-217 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:31:22.932569 ip-10-0-128-217 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:22.933987 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.933893 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:31:22.938115 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938099 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:22.938115 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938114 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938118 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938122 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938126 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938129 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938132 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938136 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938141 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938144 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938147 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938149 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938152 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938155 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938158 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938160 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938163 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938165 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938168 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938177 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938180 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:22.938185 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938182 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938185 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938188 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938193 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938197 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938200 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938203 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938206 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938209 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938212 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938215 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938218 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938220 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938223 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938227 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938230 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938232 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938235 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938237 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:22.938657 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938240 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938242 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938245 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938247 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938251 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938253 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938256 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938258 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938261 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938263 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938266 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938269 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938271 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938274 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938277 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938280 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938283 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938286 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938289 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:22.939171 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938291 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938294 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938297 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938299 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938302 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938305 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938307 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938310 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938313 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938316 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938319 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938322 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938324 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938327 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938330 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938332 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938334 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938337 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938339 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938342 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:22.939628 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938344 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938347 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938349 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938352 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938355 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938357 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938360 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938751 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938757 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938760 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938764 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938766 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938769 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938772 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938775 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938777 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938780 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938782 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938785 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938788 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:22.940117 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938790 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938794 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938796 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938799 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938801 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938804 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938806 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938808 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938811 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938814 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938816 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938818 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938834 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938837 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938840 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938842 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938846 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938851 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938854 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:22.940594 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938857 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938860 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938862 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938865 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938867 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938870 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938873 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938875 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938878 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938880 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938883 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938886 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938888 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938891 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938893 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938896 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938899 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938902 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938905 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938907 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:22.941069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938910 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938912 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938915 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938917 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938920 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938922 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938925 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938928 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938931 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938934 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938938 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938941 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938944 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938947 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938950 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938952 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938955 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938958 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938961 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938964 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:22.941568 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938966 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938969 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938972 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938975 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938978 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938981 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938983 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938986 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938989 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938992 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938994 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938997 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.938999 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.939002 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940277 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940290 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940298 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940303 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940308 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940312 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940317 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:31:22.942103 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940321 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940325 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940328 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940331 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940334 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940338 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940341 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940343 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940347 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940349 2573 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940352 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940355 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940359 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940363 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940366 2573 flags.go:64] FLAG: --config-dir=""
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940369 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940372 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940376 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940379 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940383 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940387 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940390 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940394 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940397 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940400 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:31:22.942611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940403 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940408 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940411 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940414 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940417 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940420 2573 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940423 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940428 2573 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940432 2573 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940435 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940438 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940441 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940444 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940448 2573
flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940451 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940454 2573 flags.go:64] FLAG: --eviction-soft="" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940457 2573 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940460 2573 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940463 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940466 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940471 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940474 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940477 2573 flags.go:64] FLAG: --feature-gates="" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940481 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940484 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:31:22.943270 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940487 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940491 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940494 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:31:22.943884 
ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940497 2573 flags.go:64] FLAG: --help="false" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940500 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-128-217.ec2.internal" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940504 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940507 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940510 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940514 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940517 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940520 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940523 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940526 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940528 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940531 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940534 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940537 2573 
flags.go:64] FLAG: --kube-reserved="" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940540 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940543 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940546 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940548 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940551 2573 flags.go:64] FLAG: --lock-file="" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940554 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940557 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:31:22.943884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940560 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940565 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940568 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940572 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940575 2573 flags.go:64] FLAG: --logging-format="text" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940578 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940581 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:31:22.944452 
ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940584 2573 flags.go:64] FLAG: --manifest-url="" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940587 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940591 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940594 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940599 2573 flags.go:64] FLAG: --max-pods="110" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940603 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940606 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940609 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940612 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940616 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940619 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940622 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940630 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940633 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 
16:31:22.940636 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940639 2573 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:31:22.944452 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940642 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940648 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940651 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940654 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940657 2573 flags.go:64] FLAG: --port="10250" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940660 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940663 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b5f75e4d80934dcf" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940666 2573 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940670 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940673 2573 flags.go:64] FLAG: --register-node="true" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940676 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940678 2573 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940683 2573 flags.go:64] FLAG: 
--registry-burst="10" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940686 2573 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940688 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940691 2573 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940695 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940698 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940701 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940703 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940706 2573 flags.go:64] FLAG: --runonce="false" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940709 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940712 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940715 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940718 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940721 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:31:22.945057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940725 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 
16:31:22.940728 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940731 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940734 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940737 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940739 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940743 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940746 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940749 2573 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940752 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940757 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940760 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940763 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940768 2573 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940771 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940773 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 
16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940776 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940779 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940782 2573 flags.go:64] FLAG: --v="2" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940787 2573 flags.go:64] FLAG: --version="false" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940791 2573 flags.go:64] FLAG: --vmodule="" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940795 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.940798 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940910 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:22.945684 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940914 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940917 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940920 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940923 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940926 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940930 
2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940933 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940936 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940938 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940941 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940943 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940947 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940949 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940952 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940954 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940957 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940960 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940962 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 
16:31:22.940965 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940967 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:22.946290 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940970 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940972 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940975 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940977 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940980 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940982 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940985 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940988 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940990 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940993 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.940996 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:22.946836 ip-10-0-128-217 
kubenswrapper[2573]: W0417 16:31:22.940998 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941001 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941003 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941006 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941008 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941011 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941013 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941016 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:22.946836 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941020 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941024 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941028 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941031 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941033 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941036 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941038 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941042 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941046 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941049 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941051 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941054 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941056 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941059 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941062 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941064 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941067 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941069 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941072 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:22.947308 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941075 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941078 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941081 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941083 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941086 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941089 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941091 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941094 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941096 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941099 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941101 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941104 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941106 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941113 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941116 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941118 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941121 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941124 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941126 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:22.948069 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941129 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:22.948761 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941131 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:22.948761 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941134 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:22.948761 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941137 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:22.948761 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941140 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:22.948761 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941143 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:22.948761 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941145 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:22.948761 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.941148 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:22.948761 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.941633 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:22.949185 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.949166 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 16:31:22.949185 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.949186 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 16:31:22.949247 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.949236 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:22.949247 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.949242 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:22.953350
ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.949799 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:22.953350 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.949802 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:22.953350 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.949805 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:22.953350 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:22.949808 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:22.953745 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.949813 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:22.953745 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.950507 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:31:22.954212 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.954198 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:31:22.954971 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.954960 2573 server.go:1019] "Starting client certificate rotation"
Apr 17 16:31:22.955079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.955061 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:22.955116 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.955106 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:22.976184 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.976165 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:22.978512 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.978492 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:22.993703 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.993679 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:31:22.999943 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:22.999914 2573 log.go:25] "Validated CRI v1 image API"
Apr 17 16:31:23.001737 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.001719 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:31:23.007964 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.006094 2573 fs.go:135] Filesystem UUIDs: map[12280818-3864-4752-8519-8f70543315e2:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 cee6b15e-fd85-4784-b4ca-9828e9e4ed49:/dev/nvme0n1p3]
Apr 17 16:31:23.007964 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.006842 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:31:23.008109 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.008002 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest"
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:23.013382 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.013269 2573 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:23.011517261 +0000 UTC m=+0.343018155 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099803 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ed98ec9a953578c25cdf08ad1680a SystemUUID:ec2ed98e-c9a9-5357-8c25-cdf08ad1680a BootID:4b9dc2d2-522c-4453-9d7b-29466a060ff8 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a3:af:93:32:a5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a3:af:93:32:a5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ee:f8:4e:e4:2c:79 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 
Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 16:31:23.013382 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.013376 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 16:31:23.013526 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.013464 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 16:31:23.013802 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.013783 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 16:31:23.013955 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.013805 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-128-217.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 16:31:23.014004 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.013965 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 16:31:23.014004 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.013974 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 16:31:23.014484 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.014473 
2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:23.015122 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.015112 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:23.015927 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.015916 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:23.016035 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.016026 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 16:31:23.018187 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.018177 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 17 16:31:23.018234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.018191 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 16:31:23.018234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.018202 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 16:31:23.018234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.018211 2573 kubelet.go:397] "Adding apiserver pod source" Apr 17 16:31:23.018234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.018220 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 16:31:23.019193 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.019181 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:23.019243 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.019201 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:23.022231 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.022213 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 16:31:23.023611 ip-10-0-128-217 
kubenswrapper[2573]: I0417 16:31:23.023598 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:31:23.025254 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025237 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vdkqz" Apr 17 16:31:23.025449 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025438 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:31:23.025490 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025455 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:31:23.025490 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025461 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:31:23.025490 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025469 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:31:23.025490 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025474 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:31:23.025490 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025480 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:31:23.025490 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025486 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:31:23.025490 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025492 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:31:23.025668 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025499 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:31:23.025668 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025505 2573 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:31:23.025668 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025520 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:31:23.025668 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.025530 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:31:23.026154 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.026145 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:31:23.026191 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.026155 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:31:23.029407 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.029389 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-217.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:31:23.029407 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.029383 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:31:23.029515 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.029465 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-217.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:31:23.029751 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.029739 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:31:23.029785 
ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.029775 2573 server.go:1295] "Started kubelet" Apr 17 16:31:23.029909 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.029857 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:31:23.029959 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.029907 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:31:23.030003 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.029992 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:31:23.030742 ip-10-0-128-217 systemd[1]: Started Kubernetes Kubelet. Apr 17 16:31:23.031330 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.031143 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:31:23.031954 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.031940 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:31:23.033674 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.033650 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vdkqz" Apr 17 16:31:23.037240 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.037183 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:23.037637 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.037615 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:31:23.037738 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.037710 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:31:23.038275 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038259 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:31:23.038275 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038262 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:31:23.038395 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038285 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:31:23.038395 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038324 2573 factory.go:55] Registering systemd factory Apr 17 16:31:23.038395 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038344 2573 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:31:23.038395 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038381 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:31:23.038532 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038397 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:31:23.038532 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.038494 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.038590 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038538 2573 factory.go:153] Registering CRI-O factory Apr 17 16:31:23.038590 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038550 2573 factory.go:223] Registration of the crio container factory successfully Apr 17 16:31:23.038648 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038593 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 
16:31:23.038648 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038616 2573 factory.go:103] Registering Raw factory Apr 17 16:31:23.038648 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.038629 2573 manager.go:1196] Started watching for new ooms in manager Apr 17 16:31:23.039610 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.039594 2573 manager.go:319] Starting recovery of all containers Apr 17 16:31:23.040206 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.040187 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-217.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 16:31:23.040275 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.040245 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 16:31:23.040513 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.039598 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-217.ec2.internal.18a731ef918bc5c1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-217.ec2.internal,UID:ip-10-0-128-217.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-217.ec2.internal,},FirstTimestamp:2026-04-17 16:31:23.029751233 +0000 UTC m=+0.361252126,LastTimestamp:2026-04-17 16:31:23.029751233 
+0000 UTC m=+0.361252126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-217.ec2.internal,}" Apr 17 16:31:23.049411 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.049258 2573 manager.go:324] Recovery completed Apr 17 16:31:23.053811 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.053798 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:23.056101 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.056084 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:23.056155 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.056120 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:23.056155 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.056137 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:23.056638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.056621 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:31:23.056638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.056635 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:31:23.056779 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.056653 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:23.058629 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.058615 2573 policy_none.go:49] "None policy: Start" Apr 17 16:31:23.058705 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.058646 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:31:23.058705 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.058659 2573 state_mem.go:35] "Initializing new in-memory state store" 
Apr 17 16:31:23.094600 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.094569 2573 manager.go:341] "Starting Device Plugin manager" Apr 17 16:31:23.094742 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.094607 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:31:23.094742 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.094618 2573 server.go:85] "Starting device plugin registration server" Apr 17 16:31:23.094943 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.094931 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:31:23.094999 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.094945 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:31:23.095048 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.095013 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:31:23.095098 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.095085 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:31:23.095098 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.095093 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:31:23.095630 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.095611 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 16:31:23.095747 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.095648 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.168915 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.168880 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 17 16:31:23.171051 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.170082 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:31:23.171051 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.170107 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:31:23.171051 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.170129 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:31:23.171051 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.170136 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:31:23.171051 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.170167 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:31:23.173228 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.173208 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:23.196042 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.195967 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:23.197132 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.197111 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:23.197234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.197146 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:23.197234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.197159 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:23.197234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.197186 2573 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.205215 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.205196 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.205301 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.205223 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-217.ec2.internal\": node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.216364 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.216344 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.271278 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.271236 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal"] Apr 17 16:31:23.271402 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.271341 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:23.273159 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.273140 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:23.273247 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.273172 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:23.273247 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.273183 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:23.275456 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.275440 2573 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 17 16:31:23.275601 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.275587 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.275635 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.275617 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:23.276229 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.276212 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:23.276306 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.276244 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:23.276306 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.276255 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:23.276306 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.276212 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:23.276463 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.276310 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:23.276463 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.276323 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:23.278562 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.278544 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.278635 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.278582 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:23.279255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.279241 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:23.279317 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.279274 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:23.279317 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.279291 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:23.300442 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.300421 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-217.ec2.internal\" not found" node="ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.304417 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.304401 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-217.ec2.internal\" not found" node="ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.317097 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.317064 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.340159 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.340134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.340234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.340162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.340234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.340180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57a323f22f4a50ec542cb175406e5b82-config\") pod \"kube-apiserver-proxy-ip-10-0-128-217.ec2.internal\" (UID: \"57a323f22f4a50ec542cb175406e5b82\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.417831 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.417806 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.440895 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.440872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57a323f22f4a50ec542cb175406e5b82-config\") pod \"kube-apiserver-proxy-ip-10-0-128-217.ec2.internal\" (UID: \"57a323f22f4a50ec542cb175406e5b82\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.440959 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.440904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.440959 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.440926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.441033 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.440959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.441033 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.440959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57a323f22f4a50ec542cb175406e5b82-config\") pod \"kube-apiserver-proxy-ip-10-0-128-217.ec2.internal\" (UID: \"57a323f22f4a50ec542cb175406e5b82\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.441033 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.440979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2807b2563fb554c003c51001f381c040-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal\" (UID: \"2807b2563fb554c003c51001f381c040\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.518136 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.518080 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.602733 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.602698 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.607407 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.607389 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 16:31:23.619131 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.619114 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.719703 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.719671 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.820272 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.820188 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.920710 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:23.920672 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:23.955201 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.955160 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 16:31:23.955867 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:23.955338 2573 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:31:24.021600 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:24.021558 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:24.037432 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.037402 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:24.037574 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.037437 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:23 +0000 UTC" deadline="2028-01-25 10:03:22.039137126 +0000 UTC" Apr 17 16:31:24.037574 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.037482 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15545h31m58.001658685s" Apr 17 16:31:24.037647 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.037627 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:24.047328 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.047308 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:24.070683 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.070633 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7sbbs" Apr 17 16:31:24.079864 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.079841 2573 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-7sbbs" Apr 17 16:31:24.122566 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:24.122532 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:24.175553 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:24.175515 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2807b2563fb554c003c51001f381c040.slice/crio-ebc8b32febd6c6171f2a904a4a0a081b3c5532450d32f32d35c013ebc650e812 WatchSource:0}: Error finding container ebc8b32febd6c6171f2a904a4a0a081b3c5532450d32f32d35c013ebc650e812: Status 404 returned error can't find the container with id ebc8b32febd6c6171f2a904a4a0a081b3c5532450d32f32d35c013ebc650e812 Apr 17 16:31:24.175997 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:24.175970 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a323f22f4a50ec542cb175406e5b82.slice/crio-fe2d679b820bd2e4622d73726f0b1ab5cdaee186e77883493a3823c4b1f6b4d9 WatchSource:0}: Error finding container fe2d679b820bd2e4622d73726f0b1ab5cdaee186e77883493a3823c4b1f6b4d9: Status 404 returned error can't find the container with id fe2d679b820bd2e4622d73726f0b1ab5cdaee186e77883493a3823c4b1f6b4d9 Apr 17 16:31:24.179575 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.179561 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:31:24.223083 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:24.223045 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-217.ec2.internal\" not found" Apr 17 16:31:24.308220 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.308195 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:24.338337 
ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.338269 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" Apr 17 16:31:24.349966 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.349947 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:31:24.351295 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.351283 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" Apr 17 16:31:24.358904 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.358891 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:31:24.622291 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.622204 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:24.918767 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:24.918686 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:25.019317 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.019277 2573 apiserver.go:52] "Watching apiserver" Apr 17 16:31:25.025939 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.025913 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 16:31:25.028063 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.028033 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-5kfdr","openshift-multus/multus-additional-cni-plugins-9j69g","openshift-multus/network-metrics-daemon-zkmq8","openshift-ovn-kubernetes/ovnkube-node-4s682","kube-system/konnectivity-agent-s2xcr","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt","openshift-cluster-node-tuning-operator/tuned-9dd68","openshift-dns/node-resolver-hqjjj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal","openshift-network-diagnostics/network-check-target-z2wfh","openshift-network-operator/iptables-alerter-n5p68","kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal","openshift-image-registry/node-ca-wtvtv"] Apr 17 16:31:25.030456 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.030433 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.032637 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.032610 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.032910 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.032890 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:25.032997 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.032963 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7wkvc\"" Apr 17 16:31:25.033183 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.033162 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:25.034654 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.034633 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 16:31:25.035118 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.035101 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 16:31:25.035277 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.035264 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 16:31:25.035369 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.035358 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 16:31:25.035499 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.035485 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pvr62\"" Apr 17 16:31:25.035682 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.035668 2573 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 16:31:25.035757 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.035742 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:25.035868 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.035815 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c" Apr 17 16:31:25.038110 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.038093 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.040310 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.040291 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 16:31:25.040402 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.040294 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-s2xcr" Apr 17 16:31:25.040731 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.040711 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 16:31:25.040885 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.040856 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 16:31:25.040948 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.040885 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 16:31:25.040999 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.040966 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 16:31:25.041264 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.041243 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jhkwc\"" Apr 17 16:31:25.041341 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.041277 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 16:31:25.042768 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.042620 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7mm8l\"" Apr 17 16:31:25.042768 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.042624 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 16:31:25.042768 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.042710 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 
16:31:25.042768 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.042718 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.045416 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.045087 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8nqx2\"" Apr 17 16:31:25.045416 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.045381 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 16:31:25.045719 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.045696 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 16:31:25.046347 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.045708 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 16:31:25.047640 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.047618 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.049217 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049197 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.049312 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-kubelet\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.049312 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049258 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-node-log\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.049312 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-cni-bin\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.049449 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-lib-modules\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.049449 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.049449 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:25.049449 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049412 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-run-openvswitch\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.049449 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049428 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d463d09-7ae3-4a07-b80e-6078d9f0801d-env-overrides\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.049449 ip-10-0-128-217 
kubenswrapper[2573]: I0417 16:31:25.049445 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlwqx\" (UniqueName: \"kubernetes.io/projected/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-kube-api-access-zlwqx\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049470 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-cnibin\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-cni-binary-copy\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f7ds\" (UniqueName: \"kubernetes.io/projected/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-kube-api-access-4f7ds\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049578 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-slash\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d463d09-7ae3-4a07-b80e-6078d9f0801d-ovnkube-config\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/04fe438d-c4b2-4123-8dce-24e40c4f8332-konnectivity-ca\") pod \"konnectivity-agent-s2xcr\" (UID: \"04fe438d-c4b2-4123-8dce-24e40c4f8332\") " pod="kube-system/konnectivity-agent-s2xcr" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-run-ovn\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049667 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-var-lib-kubelet\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049706 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-kubernetes\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.049753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049743 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-systemd\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-system-cni-dir\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-var-lib-openvswitch\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049813 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d463d09-7ae3-4a07-b80e-6078d9f0801d-ovnkube-script-lib\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049851 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-m4s7h\"" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-modprobe-d\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049871 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049891 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34116767-97f7-4597-bf99-9ab932940d12-etc-tuned\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049915 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34116767-97f7-4597-bf99-9ab932940d12-tmp\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-log-socket\") pod 
\"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-run-ovn-kubernetes\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d463d09-7ae3-4a07-b80e-6078d9f0801d-ovn-node-metrics-cert\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.049982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnpsm\" (UniqueName: \"kubernetes.io/projected/1d463d09-7ae3-4a07-b80e-6078d9f0801d-kube-api-access-dnpsm\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050009 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-sysconfig\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050040 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-sysctl-d\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050067 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-sys\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050095 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050109 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hqjjj"
Apr 17 16:31:25.050255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-run-netns\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050161 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-etc-openvswitch\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-cni-netd\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-sysctl-conf\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-host\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050241 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-systemd-units\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050262 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-run-systemd\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050284 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/04fe438d-c4b2-4123-8dce-24e40c4f8332-agent-certs\") pod \"konnectivity-agent-s2xcr\" (UID: \"04fe438d-c4b2-4123-8dce-24e40c4f8332\") " pod="kube-system/konnectivity-agent-s2xcr"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050333 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-run\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4ct5\" (UniqueName: \"kubernetes.io/projected/34116767-97f7-4597-bf99-9ab932940d12-kube-api-access-k4ct5\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.051006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.050401 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-os-release\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.052062 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.052041 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s6sps\""
Apr 17 16:31:25.052181 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.052138 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 16:31:25.052247 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.052221 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 16:31:25.053648 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.053596 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:25.053741 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.053705 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a"
Apr 17 16:31:25.055988 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.055971 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n5p68"
Apr 17 16:31:25.058327 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.058290 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wtvtv"
Apr 17 16:31:25.058425 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.058365 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:25.058425 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.058367 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:31:25.058791 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.058773 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-bsg2m\""
Apr 17 16:31:25.059381 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.059362 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:25.060421 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.060400 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:31:25.060769 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.060648 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:31:25.060769 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.060663 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:31:25.060769 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.060746 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ps24l\""
Apr 17 16:31:25.080560 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.080525 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:24 +0000 UTC" deadline="2027-12-25 19:35:07.285132188 +0000 UTC"
Apr 17 16:31:25.080560 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.080557 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14811h3m42.204578906s"
Apr 17 16:31:25.140180 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.140133 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 16:31:25.150884 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.150852 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-modprobe-d\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.150897 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-log-socket\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.150932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-cni-binary-copy\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr"
Apr 17 16:31:25.151057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.150961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/74a720b2-f2e9-4ae6-98cf-494d329dd9e7-iptables-alerter-script\") pod \"iptables-alerter-n5p68\" (UID: \"74a720b2-f2e9-4ae6-98cf-494d329dd9e7\") " pod="openshift-network-operator/iptables-alerter-n5p68"
Apr 17 16:31:25.151057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.150988 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdz8x\" (UniqueName: \"kubernetes.io/projected/74a720b2-f2e9-4ae6-98cf-494d329dd9e7-kube-api-access-xdz8x\") pod \"iptables-alerter-n5p68\" (UID: \"74a720b2-f2e9-4ae6-98cf-494d329dd9e7\") " pod="openshift-network-operator/iptables-alerter-n5p68"
Apr 17 16:31:25.151057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-sysconfig\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151057 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-modprobe-d\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-sysconfig\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151037 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-sysctl-d\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-log-socket\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-sys\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-run-netns\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151201 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-sys\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-cni-netd\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151141 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-sysctl-d\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-run-netns\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151238 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-os-release\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151280 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-cni-netd\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-sysctl-conf\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-host\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-host\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-systemd-units\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-sysctl-conf\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151518 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-systemd-units\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/04fe438d-c4b2-4123-8dce-24e40c4f8332-agent-certs\") pod \"konnectivity-agent-s2xcr\" (UID: \"04fe438d-c4b2-4123-8dce-24e40c4f8332\") " pod="kube-system/konnectivity-agent-s2xcr"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-socket-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-cnibin\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-run-netns\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-run\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4ct5\" (UniqueName: \"kubernetes.io/projected/34116767-97f7-4597-bf99-9ab932940d12-kube-api-access-k4ct5\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.151787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-run\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151844 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-os-release\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151919 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151975 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-os-release\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.151975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-kubelet\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-cni-bin\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-kubelet\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zrk6\" (UniqueName: \"kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6\") pod \"network-check-target-z2wfh\" (UID: \"10fbceca-37d7-4803-b22d-19039688034a\") " pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152064 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-cni-bin\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152078 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-device-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152179 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152221 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/66fec0a1-a09d-4a76-b857-2877ab654053-serviceca\") pod \"node-ca-wtvtv\" (UID: \"66fec0a1-a09d-4a76-b857-2877ab654053\") " pod="openshift-image-registry/node-ca-wtvtv"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152245 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152270 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-conf-dir\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr"
Apr 17 16:31:25.152431 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.152297 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a4e3fb1c-519b-4c02-9326-fd056001ad1b-hosts-file\") pod \"node-resolver-hqjjj\" (UID: \"a4e3fb1c-519b-4c02-9326-fd056001ad1b\") " pod="openshift-dns/node-resolver-hqjjj"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.152453 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs podName:0d9d52ff-d172-4b74-90ce-5ef0ac75662c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:25.652417944 +0000 UTC m=+2.983918844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs") pod "network-metrics-daemon-zkmq8" (UID: "0d9d52ff-d172-4b74-90ce-5ef0ac75662c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152492 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a4e3fb1c-519b-4c02-9326-fd056001ad1b-tmp-dir\") pod \"node-resolver-hqjjj\" (UID: \"a4e3fb1c-519b-4c02-9326-fd056001ad1b\") " pod="openshift-dns/node-resolver-hqjjj"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlwqx\" (UniqueName: \"kubernetes.io/projected/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-kube-api-access-zlwqx\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-cni-binary-copy\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d463d09-7ae3-4a07-b80e-6078d9f0801d-ovnkube-config\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/04fe438d-c4b2-4123-8dce-24e40c4f8332-konnectivity-ca\") pod \"konnectivity-agent-s2xcr\" (UID: \"04fe438d-c4b2-4123-8dce-24e40c4f8332\") " pod="kube-system/konnectivity-agent-s2xcr"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152628 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-sys-fs\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152655 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-var-lib-cni-bin\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-var-lib-kubelet\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-system-cni-dir\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74a720b2-f2e9-4ae6-98cf-494d329dd9e7-host-slash\") pod \"iptables-alerter-n5p68\" (UID: \"74a720b2-f2e9-4ae6-98cf-494d329dd9e7\") " pod="openshift-network-operator/iptables-alerter-n5p68"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152631 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152759 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-var-lib-kubelet\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152769 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34116767-97f7-4597-bf99-9ab932940d12-etc-tuned\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.153257 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-system-cni-dir\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g"
Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34116767-97f7-4597-bf99-9ab932940d12-tmp\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68"
Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152851 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-run-ovn-kubernetes\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d463d09-7ae3-4a07-b80e-6078d9f0801d-ovn-node-metrics-cert\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnpsm\" (UniqueName: \"kubernetes.io/projected/1d463d09-7ae3-4a07-b80e-6078d9f0801d-kube-api-access-dnpsm\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.152958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\"
(UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-var-lib-cni-multus\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153028 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66fec0a1-a09d-4a76-b857-2877ab654053-host\") pod \"node-ca-wtvtv\" (UID: \"66fec0a1-a09d-4a76-b857-2877ab654053\") " pod="openshift-image-registry/node-ca-wtvtv" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-run-ovn-kubernetes\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153087 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-etc-openvswitch\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153131 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-cni-binary-copy\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-etc-openvswitch\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/04fe438d-c4b2-4123-8dce-24e40c4f8332-konnectivity-ca\") pod \"konnectivity-agent-s2xcr\" (UID: \"04fe438d-c4b2-4123-8dce-24e40c4f8332\") " pod="kube-system/konnectivity-agent-s2xcr" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153254 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d463d09-7ae3-4a07-b80e-6078d9f0801d-ovnkube-config\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 
16:31:25.153234 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl8vt\" (UniqueName: \"kubernetes.io/projected/74c40262-7919-4fdb-bb29-ede49709d9a0-kube-api-access-sl8vt\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-run-multus-certs\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.154079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153345 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-run-systemd\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153371 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-run-k8s-cni-cncf-io\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-var-lib-kubelet\") pod \"multus-5kfdr\" (UID: 
\"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-run-systemd\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-node-log\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153519 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d463d09-7ae3-4a07-b80e-6078d9f0801d-env-overrides\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-node-log\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153546 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-registration-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: 
\"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-system-cni-dir\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-lib-modules\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-run-openvswitch\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jxq4\" (UniqueName: 
\"kubernetes.io/projected/66fec0a1-a09d-4a76-b857-2877ab654053-kube-api-access-9jxq4\") pod \"node-ca-wtvtv\" (UID: \"66fec0a1-a09d-4a76-b857-2877ab654053\") " pod="openshift-image-registry/node-ca-wtvtv" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-run-openvswitch\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-lib-modules\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctmgf\" (UniqueName: \"kubernetes.io/projected/a4e3fb1c-519b-4c02-9326-fd056001ad1b-kube-api-access-ctmgf\") pod \"node-resolver-hqjjj\" (UID: \"a4e3fb1c-519b-4c02-9326-fd056001ad1b\") " pod="openshift-dns/node-resolver-hqjjj" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-cnibin\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.154961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153874 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4f7ds\" (UniqueName: \"kubernetes.io/projected/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-kube-api-access-4f7ds\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153898 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-slash\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-socket-dir-parent\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153964 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-hostroot\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153989 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-daemon-config\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.153984 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-cnibin\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z6z6\" (UniqueName: \"kubernetes.io/projected/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-kube-api-access-5z6z6\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154041 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-run-ovn\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-etc-kubernetes\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-kubernetes\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 
16:31:25.154120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-systemd\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154148 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-var-lib-openvswitch\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d463d09-7ae3-4a07-b80e-6078d9f0801d-ovnkube-script-lib\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154181 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-run-ovn\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-cni-dir\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: 
I0417 16:31:25.154221 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-systemd\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154254 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34116767-97f7-4597-bf99-9ab932940d12-etc-kubernetes\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.155805 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-host-slash\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.156604 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d463d09-7ae3-4a07-b80e-6078d9f0801d-var-lib-openvswitch\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.156604 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d463d09-7ae3-4a07-b80e-6078d9f0801d-env-overrides\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.156604 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.154901 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d463d09-7ae3-4a07-b80e-6078d9f0801d-ovnkube-script-lib\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.156604 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.156120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34116767-97f7-4597-bf99-9ab932940d12-tmp\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.156604 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.156146 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34116767-97f7-4597-bf99-9ab932940d12-etc-tuned\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.156812 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.156668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d463d09-7ae3-4a07-b80e-6078d9f0801d-ovn-node-metrics-cert\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.163226 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.163199 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4ct5\" (UniqueName: \"kubernetes.io/projected/34116767-97f7-4597-bf99-9ab932940d12-kube-api-access-k4ct5\") pod \"tuned-9dd68\" (UID: \"34116767-97f7-4597-bf99-9ab932940d12\") " pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.164125 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.164084 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f7ds\" (UniqueName: \"kubernetes.io/projected/6e5430f1-c021-4f4d-bedc-fafa1ec4d260-kube-api-access-4f7ds\") pod \"multus-additional-cni-plugins-9j69g\" (UID: \"6e5430f1-c021-4f4d-bedc-fafa1ec4d260\") " pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.164263 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.164245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnpsm\" (UniqueName: \"kubernetes.io/projected/1d463d09-7ae3-4a07-b80e-6078d9f0801d-kube-api-access-dnpsm\") pod \"ovnkube-node-4s682\" (UID: \"1d463d09-7ae3-4a07-b80e-6078d9f0801d\") " pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.164607 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.164584 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlwqx\" (UniqueName: \"kubernetes.io/projected/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-kube-api-access-zlwqx\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:25.168144 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.168124 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/04fe438d-c4b2-4123-8dce-24e40c4f8332-agent-certs\") pod \"konnectivity-agent-s2xcr\" (UID: \"04fe438d-c4b2-4123-8dce-24e40c4f8332\") " pod="kube-system/konnectivity-agent-s2xcr" Apr 17 16:31:25.174676 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.174595 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" event={"ID":"2807b2563fb554c003c51001f381c040","Type":"ContainerStarted","Data":"ebc8b32febd6c6171f2a904a4a0a081b3c5532450d32f32d35c013ebc650e812"} Apr 17 16:31:25.175612 ip-10-0-128-217 kubenswrapper[2573]: 
I0417 16:31:25.175568 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" event={"ID":"57a323f22f4a50ec542cb175406e5b82","Type":"ContainerStarted","Data":"fe2d679b820bd2e4622d73726f0b1ab5cdaee186e77883493a3823c4b1f6b4d9"} Apr 17 16:31:25.255245 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-hostroot\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255245 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-daemon-config\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255450 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-hostroot\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255450 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255395 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5z6z6\" (UniqueName: \"kubernetes.io/projected/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-kube-api-access-5z6z6\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255537 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-etc-kubernetes\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255537 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-cni-dir\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255537 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-cni-binary-copy\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255671 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255543 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/74a720b2-f2e9-4ae6-98cf-494d329dd9e7-iptables-alerter-script\") pod \"iptables-alerter-n5p68\" (UID: \"74a720b2-f2e9-4ae6-98cf-494d329dd9e7\") " pod="openshift-network-operator/iptables-alerter-n5p68" Apr 17 16:31:25.255671 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdz8x\" (UniqueName: \"kubernetes.io/projected/74a720b2-f2e9-4ae6-98cf-494d329dd9e7-kube-api-access-xdz8x\") pod \"iptables-alerter-n5p68\" (UID: \"74a720b2-f2e9-4ae6-98cf-494d329dd9e7\") " pod="openshift-network-operator/iptables-alerter-n5p68" Apr 17 16:31:25.255671 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255607 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-os-release\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255671 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255640 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-socket-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.255671 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-cnibin\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255846 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-run-netns\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255846 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255731 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrk6\" (UniqueName: \"kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6\") pod \"network-check-target-z2wfh\" (UID: \"10fbceca-37d7-4803-b22d-19039688034a\") " pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:25.255846 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-device-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.255960 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/66fec0a1-a09d-4a76-b857-2877ab654053-serviceca\") pod \"node-ca-wtvtv\" (UID: \"66fec0a1-a09d-4a76-b857-2877ab654053\") " pod="openshift-image-registry/node-ca-wtvtv" Apr 17 16:31:25.255960 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.255960 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-os-release\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.255960 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-conf-dir\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256097 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-etc-kubernetes\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256097 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a4e3fb1c-519b-4c02-9326-fd056001ad1b-hosts-file\") pod \"node-resolver-hqjjj\" (UID: \"a4e3fb1c-519b-4c02-9326-fd056001ad1b\") " pod="openshift-dns/node-resolver-hqjjj" Apr 17 16:31:25.256097 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a4e3fb1c-519b-4c02-9326-fd056001ad1b-tmp-dir\") pod \"node-resolver-hqjjj\" (UID: \"a4e3fb1c-519b-4c02-9326-fd056001ad1b\") " pod="openshift-dns/node-resolver-hqjjj" Apr 17 16:31:25.256097 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-sys-fs\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.256097 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a4e3fb1c-519b-4c02-9326-fd056001ad1b-hosts-file\") pod \"node-resolver-hqjjj\" (UID: \"a4e3fb1c-519b-4c02-9326-fd056001ad1b\") " pod="openshift-dns/node-resolver-hqjjj" Apr 17 16:31:25.256307 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-var-lib-cni-bin\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256307 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74a720b2-f2e9-4ae6-98cf-494d329dd9e7-host-slash\") pod \"iptables-alerter-n5p68\" (UID: \"74a720b2-f2e9-4ae6-98cf-494d329dd9e7\") " pod="openshift-network-operator/iptables-alerter-n5p68" Apr 17 16:31:25.256307 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256135 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-device-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.256307 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-cnibin\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256307 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256179 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74a720b2-f2e9-4ae6-98cf-494d329dd9e7-host-slash\") pod \"iptables-alerter-n5p68\" (UID: \"74a720b2-f2e9-4ae6-98cf-494d329dd9e7\") " pod="openshift-network-operator/iptables-alerter-n5p68" Apr 17 16:31:25.256307 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.256307 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-socket-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-run-netns\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256297 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-conf-dir\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.255983 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-daemon-config\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-sys-fs\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-var-lib-cni-bin\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256401 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-cni-dir\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-var-lib-cni-multus\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256480 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66fec0a1-a09d-4a76-b857-2877ab654053-host\") pod \"node-ca-wtvtv\" (UID: \"66fec0a1-a09d-4a76-b857-2877ab654053\") " pod="openshift-image-registry/node-ca-wtvtv" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sl8vt\" (UniqueName: \"kubernetes.io/projected/74c40262-7919-4fdb-bb29-ede49709d9a0-kube-api-access-sl8vt\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256545 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a4e3fb1c-519b-4c02-9326-fd056001ad1b-tmp-dir\") pod \"node-resolver-hqjjj\" (UID: \"a4e3fb1c-519b-4c02-9326-fd056001ad1b\") " pod="openshift-dns/node-resolver-hqjjj" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256565 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-run-multus-certs\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256591 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66fec0a1-a09d-4a76-b857-2877ab654053-host\") pod \"node-ca-wtvtv\" (UID: \"66fec0a1-a09d-4a76-b857-2877ab654053\") " pod="openshift-image-registry/node-ca-wtvtv" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-run-k8s-cni-cncf-io\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-var-lib-kubelet\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.256638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256640 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-run-multus-certs\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256591 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/66fec0a1-a09d-4a76-b857-2877ab654053-serviceca\") pod \"node-ca-wtvtv\" (UID: \"66fec0a1-a09d-4a76-b857-2877ab654053\") " pod="openshift-image-registry/node-ca-wtvtv" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-registration-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-system-cni-dir\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256698 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256723 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jxq4\" (UniqueName: \"kubernetes.io/projected/66fec0a1-a09d-4a76-b857-2877ab654053-kube-api-access-9jxq4\") pod \"node-ca-wtvtv\" (UID: \"66fec0a1-a09d-4a76-b857-2877ab654053\") " pod="openshift-image-registry/node-ca-wtvtv" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-run-k8s-cni-cncf-io\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctmgf\" (UniqueName: \"kubernetes.io/projected/a4e3fb1c-519b-4c02-9326-fd056001ad1b-kube-api-access-ctmgf\") pod \"node-resolver-hqjjj\" (UID: \"a4e3fb1c-519b-4c02-9326-fd056001ad1b\") " pod="openshift-dns/node-resolver-hqjjj" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256696 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-var-lib-cni-multus\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74c40262-7919-4fdb-bb29-ede49709d9a0-registration-dir\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-socket-dir-parent\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-cni-binary-copy\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256724 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-host-var-lib-kubelet\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.256951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-multus-socket-dir-parent\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.257000 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-system-cni-dir\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.257297 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.257063 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/74a720b2-f2e9-4ae6-98cf-494d329dd9e7-iptables-alerter-script\") pod \"iptables-alerter-n5p68\" (UID: \"74a720b2-f2e9-4ae6-98cf-494d329dd9e7\") " pod="openshift-network-operator/iptables-alerter-n5p68" Apr 17 16:31:25.262081 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.262064 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:25.262121 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.262086 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:25.262121 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.262096 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6zrk6 for pod openshift-network-diagnostics/network-check-target-z2wfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:25.262201 ip-10-0-128-217 
kubenswrapper[2573]: E0417 16:31:25.262154 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6 podName:10fbceca-37d7-4803-b22d-19039688034a nodeName:}" failed. No retries permitted until 2026-04-17 16:31:25.762136006 +0000 UTC m=+3.093636911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6zrk6" (UniqueName: "kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6") pod "network-check-target-z2wfh" (UID: "10fbceca-37d7-4803-b22d-19039688034a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:25.264651 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.264627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jxq4\" (UniqueName: \"kubernetes.io/projected/66fec0a1-a09d-4a76-b857-2877ab654053-kube-api-access-9jxq4\") pod \"node-ca-wtvtv\" (UID: \"66fec0a1-a09d-4a76-b857-2877ab654053\") " pod="openshift-image-registry/node-ca-wtvtv" Apr 17 16:31:25.264790 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.264634 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctmgf\" (UniqueName: \"kubernetes.io/projected/a4e3fb1c-519b-4c02-9326-fd056001ad1b-kube-api-access-ctmgf\") pod \"node-resolver-hqjjj\" (UID: \"a4e3fb1c-519b-4c02-9326-fd056001ad1b\") " pod="openshift-dns/node-resolver-hqjjj" Apr 17 16:31:25.264885 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.264849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdz8x\" (UniqueName: \"kubernetes.io/projected/74a720b2-f2e9-4ae6-98cf-494d329dd9e7-kube-api-access-xdz8x\") pod \"iptables-alerter-n5p68\" (UID: \"74a720b2-f2e9-4ae6-98cf-494d329dd9e7\") " pod="openshift-network-operator/iptables-alerter-n5p68" Apr 17 
16:31:25.265539 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.265520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl8vt\" (UniqueName: \"kubernetes.io/projected/74c40262-7919-4fdb-bb29-ede49709d9a0-kube-api-access-sl8vt\") pod \"aws-ebs-csi-driver-node-cbqgt\" (UID: \"74c40262-7919-4fdb-bb29-ede49709d9a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.265539 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.265532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z6z6\" (UniqueName: \"kubernetes.io/projected/7238545f-7cf1-4c61-a1dc-f5a458a0c5ed-kube-api-access-5z6z6\") pod \"multus-5kfdr\" (UID: \"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed\") " pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.341684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.341652 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9dd68" Apr 17 16:31:25.349362 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.349329 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9j69g" Apr 17 16:31:25.360328 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.358378 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:31:25.364266 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.364241 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s2xcr" Apr 17 16:31:25.370866 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.370843 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" Apr 17 16:31:25.377523 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.377501 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5kfdr" Apr 17 16:31:25.384110 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.384088 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hqjjj" Apr 17 16:31:25.392650 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.392630 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n5p68" Apr 17 16:31:25.398166 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.398148 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wtvtv" Apr 17 16:31:25.660130 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.660095 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:25.660338 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.660288 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:25.660400 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.660363 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs podName:0d9d52ff-d172-4b74-90ce-5ef0ac75662c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:26.66034139 +0000 UTC m=+3.991842274 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs") pod "network-metrics-daemon-zkmq8" (UID: "0d9d52ff-d172-4b74-90ce-5ef0ac75662c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:25.836098 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:25.836022 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d463d09_7ae3_4a07_b80e_6078d9f0801d.slice/crio-04b3026162eed3de3e5ca3e1f112b642dfb24c1062ea5777ffa4f59f14d11330 WatchSource:0}: Error finding container 04b3026162eed3de3e5ca3e1f112b642dfb24c1062ea5777ffa4f59f14d11330: Status 404 returned error can't find the container with id 04b3026162eed3de3e5ca3e1f112b642dfb24c1062ea5777ffa4f59f14d11330 Apr 17 16:31:25.837435 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:25.837413 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66fec0a1_a09d_4a76_b857_2877ab654053.slice/crio-744983e4da3217f95bab1cd0941dc17546e07672df891e5c4b43d30056683b94 WatchSource:0}: Error finding container 744983e4da3217f95bab1cd0941dc17546e07672df891e5c4b43d30056683b94: Status 404 returned error can't find the container with id 744983e4da3217f95bab1cd0941dc17546e07672df891e5c4b43d30056683b94 Apr 17 16:31:25.838133 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:25.838086 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34116767_97f7_4597_bf99_9ab932940d12.slice/crio-0d493c26a9fac92704903ba1b68343fe514be39ae0c6b93186e3afff6bbdabe8 WatchSource:0}: Error finding container 0d493c26a9fac92704903ba1b68343fe514be39ae0c6b93186e3afff6bbdabe8: Status 404 returned error can't find the container with id 0d493c26a9fac92704903ba1b68343fe514be39ae0c6b93186e3afff6bbdabe8 Apr 17 16:31:25.841340 
ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:25.841314 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c40262_7919_4fdb_bb29_ede49709d9a0.slice/crio-9eda666f6dec2f4c1d4075aeef6f2a77abe944b047c2969b407276506a239ad9 WatchSource:0}: Error finding container 9eda666f6dec2f4c1d4075aeef6f2a77abe944b047c2969b407276506a239ad9: Status 404 returned error can't find the container with id 9eda666f6dec2f4c1d4075aeef6f2a77abe944b047c2969b407276506a239ad9 Apr 17 16:31:25.842107 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:25.842086 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04fe438d_c4b2_4123_8dce_24e40c4f8332.slice/crio-f3778cef5ac1beef07614f8d7bec95ef7855ac9ebaded36ed0e23e9eeb2f60ca WatchSource:0}: Error finding container f3778cef5ac1beef07614f8d7bec95ef7855ac9ebaded36ed0e23e9eeb2f60ca: Status 404 returned error can't find the container with id f3778cef5ac1beef07614f8d7bec95ef7855ac9ebaded36ed0e23e9eeb2f60ca Apr 17 16:31:25.843223 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:25.843047 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7238545f_7cf1_4c61_a1dc_f5a458a0c5ed.slice/crio-7e319e53de3d8028136a4f5033720374ef8bb58416f8e01229bb3d13820036c9 WatchSource:0}: Error finding container 7e319e53de3d8028136a4f5033720374ef8bb58416f8e01229bb3d13820036c9: Status 404 returned error can't find the container with id 7e319e53de3d8028136a4f5033720374ef8bb58416f8e01229bb3d13820036c9 Apr 17 16:31:25.843978 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:25.843934 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e5430f1_c021_4f4d_bedc_fafa1ec4d260.slice/crio-2efb0ccd1ca5cd35535f423b66243b8993c7ffda65234dc3e2a6429acc788217 WatchSource:0}: 
Error finding container 2efb0ccd1ca5cd35535f423b66243b8993c7ffda65234dc3e2a6429acc788217: Status 404 returned error can't find the container with id 2efb0ccd1ca5cd35535f423b66243b8993c7ffda65234dc3e2a6429acc788217 Apr 17 16:31:25.844748 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:25.844616 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74a720b2_f2e9_4ae6_98cf_494d329dd9e7.slice/crio-72c51543333accbe2c99cb67d14bc0da5f2e8c4df65193eddcea79bb52cadfd4 WatchSource:0}: Error finding container 72c51543333accbe2c99cb67d14bc0da5f2e8c4df65193eddcea79bb52cadfd4: Status 404 returned error can't find the container with id 72c51543333accbe2c99cb67d14bc0da5f2e8c4df65193eddcea79bb52cadfd4 Apr 17 16:31:25.845634 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:25.845612 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4e3fb1c_519b_4c02_9326_fd056001ad1b.slice/crio-2df9cb5a527c201cb9e33c21d0021ed264bd9e74e2a4fcd5e407b5e045b2396a WatchSource:0}: Error finding container 2df9cb5a527c201cb9e33c21d0021ed264bd9e74e2a4fcd5e407b5e045b2396a: Status 404 returned error can't find the container with id 2df9cb5a527c201cb9e33c21d0021ed264bd9e74e2a4fcd5e407b5e045b2396a Apr 17 16:31:25.861472 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:25.861448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrk6\" (UniqueName: \"kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6\") pod \"network-check-target-z2wfh\" (UID: \"10fbceca-37d7-4803-b22d-19039688034a\") " pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:25.861617 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.861601 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:25.861764 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.861620 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:25.861814 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.861768 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6zrk6 for pod openshift-network-diagnostics/network-check-target-z2wfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:25.861869 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:25.861814 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6 podName:10fbceca-37d7-4803-b22d-19039688034a nodeName:}" failed. No retries permitted until 2026-04-17 16:31:26.861799301 +0000 UTC m=+4.193300198 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6zrk6" (UniqueName: "kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6") pod "network-check-target-z2wfh" (UID: "10fbceca-37d7-4803-b22d-19039688034a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:26.081385 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.081347 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:24 +0000 UTC" deadline="2027-12-13 01:41:54.778637765 +0000 UTC" Apr 17 16:31:26.081385 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.081378 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14505h10m28.69726215s" Apr 17 16:31:26.179275 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.179199 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" event={"ID":"57a323f22f4a50ec542cb175406e5b82","Type":"ContainerStarted","Data":"a1852ff9534579fb557101b95f7185fdffd67ea06178a0a02f6917d2b58207c1"} Apr 17 16:31:26.180207 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.180180 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerStarted","Data":"2efb0ccd1ca5cd35535f423b66243b8993c7ffda65234dc3e2a6429acc788217"} Apr 17 16:31:26.181771 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.181740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5kfdr" event={"ID":"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed","Type":"ContainerStarted","Data":"7e319e53de3d8028136a4f5033720374ef8bb58416f8e01229bb3d13820036c9"} Apr 17 16:31:26.182775 ip-10-0-128-217 
kubenswrapper[2573]: I0417 16:31:26.182754 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s2xcr" event={"ID":"04fe438d-c4b2-4123-8dce-24e40c4f8332","Type":"ContainerStarted","Data":"f3778cef5ac1beef07614f8d7bec95ef7855ac9ebaded36ed0e23e9eeb2f60ca"} Apr 17 16:31:26.183797 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.183765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" event={"ID":"74c40262-7919-4fdb-bb29-ede49709d9a0","Type":"ContainerStarted","Data":"9eda666f6dec2f4c1d4075aeef6f2a77abe944b047c2969b407276506a239ad9"} Apr 17 16:31:26.184840 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.184792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hqjjj" event={"ID":"a4e3fb1c-519b-4c02-9326-fd056001ad1b","Type":"ContainerStarted","Data":"2df9cb5a527c201cb9e33c21d0021ed264bd9e74e2a4fcd5e407b5e045b2396a"} Apr 17 16:31:26.185950 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.185924 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n5p68" event={"ID":"74a720b2-f2e9-4ae6-98cf-494d329dd9e7","Type":"ContainerStarted","Data":"72c51543333accbe2c99cb67d14bc0da5f2e8c4df65193eddcea79bb52cadfd4"} Apr 17 16:31:26.186985 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.186946 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9dd68" event={"ID":"34116767-97f7-4597-bf99-9ab932940d12","Type":"ContainerStarted","Data":"0d493c26a9fac92704903ba1b68343fe514be39ae0c6b93186e3afff6bbdabe8"} Apr 17 16:31:26.188055 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.188028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wtvtv" 
event={"ID":"66fec0a1-a09d-4a76-b857-2877ab654053","Type":"ContainerStarted","Data":"744983e4da3217f95bab1cd0941dc17546e07672df891e5c4b43d30056683b94"} Apr 17 16:31:26.189094 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.189068 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerStarted","Data":"04b3026162eed3de3e5ca3e1f112b642dfb24c1062ea5777ffa4f59f14d11330"} Apr 17 16:31:26.193597 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.193559 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-217.ec2.internal" podStartSLOduration=2.193548119 podStartE2EDuration="2.193548119s" podCreationTimestamp="2026-04-17 16:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:26.193449819 +0000 UTC m=+3.524950746" watchObservedRunningTime="2026-04-17 16:31:26.193548119 +0000 UTC m=+3.525049022" Apr 17 16:31:26.669881 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.669837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:26.670053 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:26.669998 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:26.670114 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:26.670061 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs podName:0d9d52ff-d172-4b74-90ce-5ef0ac75662c 
nodeName:}" failed. No retries permitted until 2026-04-17 16:31:28.670040659 +0000 UTC m=+6.001541543 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs") pod "network-metrics-daemon-zkmq8" (UID: "0d9d52ff-d172-4b74-90ce-5ef0ac75662c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:26.871458 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:26.871370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrk6\" (UniqueName: \"kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6\") pod \"network-check-target-z2wfh\" (UID: \"10fbceca-37d7-4803-b22d-19039688034a\") " pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:26.871628 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:26.871599 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:26.871628 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:26.871621 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:26.871746 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:26.871635 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6zrk6 for pod openshift-network-diagnostics/network-check-target-z2wfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:26.871746 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:26.871693 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6 
podName:10fbceca-37d7-4803-b22d-19039688034a nodeName:}" failed. No retries permitted until 2026-04-17 16:31:28.871674166 +0000 UTC m=+6.203175054 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6zrk6" (UniqueName: "kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6") pod "network-check-target-z2wfh" (UID: "10fbceca-37d7-4803-b22d-19039688034a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:27.171930 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:27.171354 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:27.171930 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:27.171480 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a" Apr 17 16:31:27.172438 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:27.172407 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:27.172543 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:27.172518 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c" Apr 17 16:31:27.202928 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:27.202897 2573 generic.go:358] "Generic (PLEG): container finished" podID="2807b2563fb554c003c51001f381c040" containerID="b4b8ad23e332b5aff7a25832de260e2efe9eea98564acf45df1081479e557f7f" exitCode=0 Apr 17 16:31:27.203089 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:27.203064 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" event={"ID":"2807b2563fb554c003c51001f381c040","Type":"ContainerDied","Data":"b4b8ad23e332b5aff7a25832de260e2efe9eea98564acf45df1081479e557f7f"} Apr 17 16:31:28.219535 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:28.219495 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" event={"ID":"2807b2563fb554c003c51001f381c040","Type":"ContainerStarted","Data":"483fcc0a4b433f0a724f19d52297ab858e227c0bc72ca6903298b1a066f38706"} Apr 17 16:31:28.690731 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:28.690695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:28.690925 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:28.690855 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:28.691001 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:28.690928 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs 
podName:0d9d52ff-d172-4b74-90ce-5ef0ac75662c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:32.690909159 +0000 UTC m=+10.022410042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs") pod "network-metrics-daemon-zkmq8" (UID: "0d9d52ff-d172-4b74-90ce-5ef0ac75662c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:28.892594 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:28.892547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrk6\" (UniqueName: \"kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6\") pod \"network-check-target-z2wfh\" (UID: \"10fbceca-37d7-4803-b22d-19039688034a\") " pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:28.892775 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:28.892734 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:28.892775 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:28.892753 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:28.892775 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:28.892765 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6zrk6 for pod openshift-network-diagnostics/network-check-target-z2wfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:28.892967 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:28.892836 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6 podName:10fbceca-37d7-4803-b22d-19039688034a nodeName:}" failed. No retries permitted until 2026-04-17 16:31:32.892804265 +0000 UTC m=+10.224305145 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6zrk6" (UniqueName: "kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6") pod "network-check-target-z2wfh" (UID: "10fbceca-37d7-4803-b22d-19039688034a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:29.171285 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:29.170993 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:29.171285 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:29.171037 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:29.171285 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:29.171129 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a" Apr 17 16:31:29.171556 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:29.171306 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c" Apr 17 16:31:31.172471 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:31.171383 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:31.172471 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:31.171383 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:31.172471 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:31.171549 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c" Apr 17 16:31:31.172471 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:31.171629 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a" Apr 17 16:31:32.722595 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:32.722545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:32.723108 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:32.722719 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:32.723108 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:32.722802 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs podName:0d9d52ff-d172-4b74-90ce-5ef0ac75662c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:40.722781085 +0000 UTC m=+18.054281970 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs") pod "network-metrics-daemon-zkmq8" (UID: "0d9d52ff-d172-4b74-90ce-5ef0ac75662c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:32.924420 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:32.924379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrk6\" (UniqueName: \"kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6\") pod \"network-check-target-z2wfh\" (UID: \"10fbceca-37d7-4803-b22d-19039688034a\") " pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:32.924607 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:32.924551 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:32.924607 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:32.924576 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:32.924607 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:32.924589 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6zrk6 for pod openshift-network-diagnostics/network-check-target-z2wfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:32.924758 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:32.924654 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6 podName:10fbceca-37d7-4803-b22d-19039688034a nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:40.924635314 +0000 UTC m=+18.256136216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6zrk6" (UniqueName: "kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6") pod "network-check-target-z2wfh" (UID: "10fbceca-37d7-4803-b22d-19039688034a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:33.172689 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:33.172055 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:33.172689 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:33.172172 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c" Apr 17 16:31:33.172689 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:33.172567 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:33.172689 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:33.172644 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a" Apr 17 16:31:35.171327 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:35.171240 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:35.171724 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:35.171354 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a" Apr 17 16:31:35.171724 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:35.171398 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:35.171724 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:35.171466 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c" Apr 17 16:31:37.171056 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:37.171018 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:37.171569 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:37.171074 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:37.171569 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:37.171151 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a" Apr 17 16:31:37.171569 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:37.171297 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c" Apr 17 16:31:39.170964 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:39.170927 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:39.171387 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:39.170971 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:39.171387 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:39.171062 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a" Apr 17 16:31:39.171387 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:39.171192 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c" Apr 17 16:31:40.784345 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:40.784306 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:40.784735 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:40.784480 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:40.784735 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:40.784582 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs podName:0d9d52ff-d172-4b74-90ce-5ef0ac75662c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:56.784559532 +0000 UTC m=+34.116060426 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs") pod "network-metrics-daemon-zkmq8" (UID: "0d9d52ff-d172-4b74-90ce-5ef0ac75662c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:40.988986 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:40.987400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrk6\" (UniqueName: \"kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6\") pod \"network-check-target-z2wfh\" (UID: \"10fbceca-37d7-4803-b22d-19039688034a\") " pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:40.988986 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:40.987651 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:40.988986 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:40.987672 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:40.988986 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:40.987685 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6zrk6 for pod openshift-network-diagnostics/network-check-target-z2wfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:40.988986 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:40.987759 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6 podName:10fbceca-37d7-4803-b22d-19039688034a nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:56.987740655 +0000 UTC m=+34.319241551 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6zrk6" (UniqueName: "kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6") pod "network-check-target-z2wfh" (UID: "10fbceca-37d7-4803-b22d-19039688034a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:41.171171 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:41.171082 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:41.171171 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:41.171125 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:41.171380 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:41.171232 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c"
Apr 17 16:31:41.171380 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:41.171308 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a"
Apr 17 16:31:43.173286 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.173049 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:43.173670 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:43.173399 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c"
Apr 17 16:31:43.173670 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.173146 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:43.173670 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:43.173491 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a"
Apr 17 16:31:43.244776 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.244745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5kfdr" event={"ID":"7238545f-7cf1-4c61-a1dc-f5a458a0c5ed","Type":"ContainerStarted","Data":"460553ec57ebf8601dd33ab0f26281809b7389062127b9def9e7b6b0fcb0e509"}
Apr 17 16:31:43.245837 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.245799 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s2xcr" event={"ID":"04fe438d-c4b2-4123-8dce-24e40c4f8332","Type":"ContainerStarted","Data":"bda39a73ad6e5fb947734f5a56465da0c264d40f02edd647a7727fa0ed8a2836"}
Apr 17 16:31:43.246935 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.246848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" event={"ID":"74c40262-7919-4fdb-bb29-ede49709d9a0","Type":"ContainerStarted","Data":"2f1b635e3e0b1a41f06d533c0239794a3494fd345118966fc53611e017e626d7"}
Apr 17 16:31:43.247676 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.247658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hqjjj" event={"ID":"a4e3fb1c-519b-4c02-9326-fd056001ad1b","Type":"ContainerStarted","Data":"f8437be4936a3aa5f8a8d29e5bd4e6ce0fa934ada6e2cac4e6fed8cc182b28ce"}
Apr 17 16:31:43.248522 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.248499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9dd68" event={"ID":"34116767-97f7-4597-bf99-9ab932940d12","Type":"ContainerStarted","Data":"36ddb4ba454752d3ef6094faa93bfd77a330d10175512c40e3096717b172034e"}
Apr 17 16:31:43.249582 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.249560 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerStarted","Data":"0df3f95e2cc186e627a488c6bcb14a45d3b1a6ffafdb1722c73f181130ba4848"}
Apr 17 16:31:43.250438 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.250407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerStarted","Data":"66f64224c7c494415b91520da471d56219f78ce3f0d4eeb800c23465483ec785"}
Apr 17 16:31:43.267200 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.267153 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5kfdr" podStartSLOduration=3.206247578 podStartE2EDuration="20.267138133s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:25.846774226 +0000 UTC m=+3.178275106" lastFinishedPulling="2026-04-17 16:31:42.907664761 +0000 UTC m=+20.239165661" observedRunningTime="2026-04-17 16:31:43.267119733 +0000 UTC m=+20.598620636" watchObservedRunningTime="2026-04-17 16:31:43.267138133 +0000 UTC m=+20.598639037"
Apr 17 16:31:43.267778 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.267711 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-217.ec2.internal" podStartSLOduration=19.267703454 podStartE2EDuration="19.267703454s" podCreationTimestamp="2026-04-17 16:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:28.235795004 +0000 UTC m=+5.567295908" watchObservedRunningTime="2026-04-17 16:31:43.267703454 +0000 UTC m=+20.599204356"
Apr 17 16:31:43.320174 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.320135 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9dd68" podStartSLOduration=3.266247362 podStartE2EDuration="20.320121818s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:25.840460515 +0000 UTC m=+3.171961396" lastFinishedPulling="2026-04-17 16:31:42.894334964 +0000 UTC m=+20.225835852" observedRunningTime="2026-04-17 16:31:43.290469499 +0000 UTC m=+20.621970402" watchObservedRunningTime="2026-04-17 16:31:43.320121818 +0000 UTC m=+20.651622720"
Apr 17 16:31:43.349250 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:43.349213 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-s2xcr" podStartSLOduration=11.431515059 podStartE2EDuration="20.349189982s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:25.844186944 +0000 UTC m=+3.175687829" lastFinishedPulling="2026-04-17 16:31:34.761861871 +0000 UTC m=+12.093362752" observedRunningTime="2026-04-17 16:31:43.348559366 +0000 UTC m=+20.680060268" watchObservedRunningTime="2026-04-17 16:31:43.349189982 +0000 UTC m=+20.680690884"
Apr 17 16:31:44.255461 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.255239 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:31:44.256125 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.255724 2573 generic.go:358] "Generic (PLEG): container finished" podID="1d463d09-7ae3-4a07-b80e-6078d9f0801d" containerID="672db1263c30db7a5ce1c780aed6ed6c21eb8c9f36ec965201280ff96f941058" exitCode=1
Apr 17 16:31:44.256125 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.255794 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerStarted","Data":"e7a82b4cdd58a887c3367140afeee530747cfa957b19fcab7d7c0b0dce1f1e44"}
Apr 17 16:31:44.256125 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.255840 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerStarted","Data":"9c9e37981aa837e10ea5873eac712ae1afe9dc3407c8cc80813614bd80cf7c5f"}
Apr 17 16:31:44.256125 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.255850 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerStarted","Data":"eda8942114cd40a63859d9d26a1c00ce15a367f33084b01ae74674bd743ad633"}
Apr 17 16:31:44.256125 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.255858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerStarted","Data":"292b9e7fd6913009bd614637d12ccb1cc54d2265f53d70f45850f75bb85900c8"}
Apr 17 16:31:44.256125 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.255871 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerDied","Data":"672db1263c30db7a5ce1c780aed6ed6c21eb8c9f36ec965201280ff96f941058"}
Apr 17 16:31:44.256992 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.256970 2573 generic.go:358] "Generic (PLEG): container finished" podID="6e5430f1-c021-4f4d-bedc-fafa1ec4d260" containerID="66f64224c7c494415b91520da471d56219f78ce3f0d4eeb800c23465483ec785" exitCode=0
Apr 17 16:31:44.257100 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.257048 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerDied","Data":"66f64224c7c494415b91520da471d56219f78ce3f0d4eeb800c23465483ec785"}
Apr 17 16:31:44.258268 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.258237 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wtvtv" event={"ID":"66fec0a1-a09d-4a76-b857-2877ab654053","Type":"ContainerStarted","Data":"bb65130634aeb56177d2b27679c22a5a8073a32c98027aa7a1368b16ebfdc360"}
Apr 17 16:31:44.300460 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.300411 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hqjjj" podStartSLOduration=4.2567922639999995 podStartE2EDuration="21.300396805s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:25.847999331 +0000 UTC m=+3.179500212" lastFinishedPulling="2026-04-17 16:31:42.891603857 +0000 UTC m=+20.223104753" observedRunningTime="2026-04-17 16:31:44.299996284 +0000 UTC m=+21.631497188" watchObservedRunningTime="2026-04-17 16:31:44.300396805 +0000 UTC m=+21.631897706"
Apr 17 16:31:44.589783 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:44.589760 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 16:31:45.040779 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.040740 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-s2xcr"
Apr 17 16:31:45.041686 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.041496 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-s2xcr"
Apr 17 16:31:45.057333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.057278 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wtvtv" podStartSLOduration=5.005156037 podStartE2EDuration="22.057258207s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:25.839503219 +0000 UTC m=+3.171004099" lastFinishedPulling="2026-04-17 16:31:42.891605374 +0000 UTC m=+20.223106269" observedRunningTime="2026-04-17 16:31:44.315714805 +0000 UTC m=+21.647215707" watchObservedRunningTime="2026-04-17 16:31:45.057258207 +0000 UTC m=+22.388759111"
Apr 17 16:31:45.106918 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.106800 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:44.589777451Z","UUID":"3dd8483c-179a-49c0-ae12-e1a086bc32fe","Handler":null,"Name":"","Endpoint":""}
Apr 17 16:31:45.108852 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.108815 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 16:31:45.108852 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.108860 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 16:31:45.171006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.170972 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:45.171006 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.170986 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:45.171240 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:45.171093 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a"
Apr 17 16:31:45.171240 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:45.171227 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c"
Apr 17 16:31:45.263105 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.263071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" event={"ID":"74c40262-7919-4fdb-bb29-ede49709d9a0","Type":"ContainerStarted","Data":"72886139ab8483fef906fc4e1cd703d6f24da0c6b80fe9e6f463ed262012205b"}
Apr 17 16:31:45.264925 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.264682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n5p68" event={"ID":"74a720b2-f2e9-4ae6-98cf-494d329dd9e7","Type":"ContainerStarted","Data":"6ea96963c2b06a85fcd855d1bc9d61fcfd54d48d8a2ed9ba64332b9ee0756773"}
Apr 17 16:31:45.265052 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.265018 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-s2xcr"
Apr 17 16:31:45.265728 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.265712 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-s2xcr"
Apr 17 16:31:45.296539 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:45.296442 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-n5p68" podStartSLOduration=5.303553745 podStartE2EDuration="22.296424937s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:25.847379195 +0000 UTC m=+3.178880083" lastFinishedPulling="2026-04-17 16:31:42.84025038 +0000 UTC m=+20.171751275" observedRunningTime="2026-04-17 16:31:45.296042727 +0000 UTC m=+22.627543630" watchObservedRunningTime="2026-04-17 16:31:45.296424937 +0000 UTC m=+22.627925840"
Apr 17 16:31:46.269806 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:46.269604 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:31:46.270310 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:46.270155 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerStarted","Data":"ed4973d7f903f8c07a5f81f86808c3fbd59ae871ca565403732dca7425f3ca62"}
Apr 17 16:31:46.272166 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:46.272133 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" event={"ID":"74c40262-7919-4fdb-bb29-ede49709d9a0","Type":"ContainerStarted","Data":"6d827551f0f99d1642912d29df5a30c3bb72b0bdd15bdddde2c5638a7242aa41"}
Apr 17 16:31:46.292363 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:46.292312 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cbqgt" podStartSLOduration=3.170810749 podStartE2EDuration="23.292298466s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:25.843025557 +0000 UTC m=+3.174526451" lastFinishedPulling="2026-04-17 16:31:45.964513288 +0000 UTC m=+23.296014168" observedRunningTime="2026-04-17 16:31:46.29185572 +0000 UTC m=+23.623356623" watchObservedRunningTime="2026-04-17 16:31:46.292298466 +0000 UTC m=+23.623799368"
Apr 17 16:31:47.171294 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:47.171260 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:47.171475 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:47.171415 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a"
Apr 17 16:31:47.171475 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:47.171458 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:47.171583 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:47.171556 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c"
Apr 17 16:31:49.171510 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.171278 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:49.172185 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.171341 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:49.172185 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:49.171530 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a"
Apr 17 16:31:49.172185 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:49.171598 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c"
Apr 17 16:31:49.278952 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.278916 2573 generic.go:358] "Generic (PLEG): container finished" podID="6e5430f1-c021-4f4d-bedc-fafa1ec4d260" containerID="39a1d2d969e6441c20c7ea659e76a47fce16f0b70177a111d35c897f513053b3" exitCode=0
Apr 17 16:31:49.279108 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.278990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerDied","Data":"39a1d2d969e6441c20c7ea659e76a47fce16f0b70177a111d35c897f513053b3"}
Apr 17 16:31:49.282063 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.282043 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:31:49.282407 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.282383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerStarted","Data":"250aca70a06cd6f0515fa6244235d8f17e327747181cadcf0683a24d58c6a13f"}
Apr 17 16:31:49.282768 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.282748 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:49.282867 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.282775 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:49.282969 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.282957 2573 scope.go:117] "RemoveContainer" containerID="672db1263c30db7a5ce1c780aed6ed6c21eb8c9f36ec965201280ff96f941058"
Apr 17 16:31:49.298709 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:49.298688 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:50.286258 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.286168 2573 generic.go:358] "Generic (PLEG): container finished" podID="6e5430f1-c021-4f4d-bedc-fafa1ec4d260" containerID="2ad7c1f9c282d3ee1239ea9538474d6049c4ba95b95bf593a6c2eb248352f9a3" exitCode=0
Apr 17 16:31:50.286663 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.286249 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerDied","Data":"2ad7c1f9c282d3ee1239ea9538474d6049c4ba95b95bf593a6c2eb248352f9a3"}
Apr 17 16:31:50.289908 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.289890 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:31:50.290243 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.290220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" event={"ID":"1d463d09-7ae3-4a07-b80e-6078d9f0801d","Type":"ContainerStarted","Data":"657238cd91a967d96916ffe95be7064f9d7cdf6ae0e7f8eb50a0a8a3c700f02a"}
Apr 17 16:31:50.290517 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.290496 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:50.305273 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.305246 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4s682"
Apr 17 16:31:50.335723 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.335676 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" podStartSLOduration=10.238249201 podStartE2EDuration="27.335660491s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:25.838683446 +0000 UTC m=+3.170184331" lastFinishedPulling="2026-04-17 16:31:42.936094723 +0000 UTC m=+20.267595621" observedRunningTime="2026-04-17 16:31:50.334965505 +0000 UTC m=+27.666466407" watchObservedRunningTime="2026-04-17 16:31:50.335660491 +0000 UTC m=+27.667161393"
Apr 17 16:31:50.586760 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.586681 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zkmq8"]
Apr 17 16:31:50.586942 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.586848 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:50.587002 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:50.586963 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c"
Apr 17 16:31:50.589380 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.589356 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z2wfh"]
Apr 17 16:31:50.589470 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:50.589450 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:50.589538 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:50.589519 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a"
Apr 17 16:31:51.294902 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:51.294625 2573 generic.go:358] "Generic (PLEG): container finished" podID="6e5430f1-c021-4f4d-bedc-fafa1ec4d260" containerID="6933cb868e78cd2c2a02f1c24b11578a32075fa37475c79c2dfba2ecb4695546" exitCode=0
Apr 17 16:31:51.295271 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:51.294676 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerDied","Data":"6933cb868e78cd2c2a02f1c24b11578a32075fa37475c79c2dfba2ecb4695546"}
Apr 17 16:31:52.171170 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:52.171136 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:52.171339 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:52.171143 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:52.171339 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:52.171273 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c"
Apr 17 16:31:52.171449 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:52.171349 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a"
Apr 17 16:31:54.170819 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:54.170730 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:54.171611 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:54.170737 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:54.171611 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:54.170870 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a"
Apr 17 16:31:54.171611 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:54.170969 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c"
Apr 17 16:31:56.034860 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.034775 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-217.ec2.internal" event="NodeReady"
Apr 17 16:31:56.035453 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.034922 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 16:31:56.092305 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.092276 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jdmrz"]
Apr 17 16:31:56.096731 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.096702 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zb6kc"]
Apr 17 16:31:56.096917 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.096892 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jdmrz"
Apr 17 16:31:56.099814 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.099551 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zb6kc"
Apr 17 16:31:56.099814 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.099602 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 16:31:56.099814 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.099619 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 16:31:56.099814 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.099618 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f96dz\""
Apr 17 16:31:56.101588 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.101567 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 16:31:56.101861 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.101841 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-txqpq\""
Apr 17 16:31:56.101983 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.101885 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 16:31:56.101983 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.101929 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 16:31:56.106114 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.105656 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jdmrz"]
Apr 17 16:31:56.108817 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.108795 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zb6kc"]
Apr 17 16:31:56.171209 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.171176 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8"
Apr 17 16:31:56.171374 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.171176 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh"
Apr 17 16:31:56.173816 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.173792 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:31:56.173816 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.173806 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:31:56.174000 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.173905 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bq7nv\""
Apr 17 16:31:56.174000 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.173910 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:31:56.174000 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.173917 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z6rg9\""
Apr 17 16:31:56.207596 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.207569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vrll\" (UniqueName: \"kubernetes.io/projected/20e90cc0-23dd-4714-b716-64ca208935e2-kube-api-access-9vrll\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc"
Apr 17 16:31:56.207704 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.207613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-tmp-dir\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz"
Apr 17 16:31:56.207704 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.207646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-config-volume\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz"
Apr 17 16:31:56.207704 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.207678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc"
Apr 17 16:31:56.207855 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.207775 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz"
Apr 17 16:31:56.207855 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.207800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnm4l\" (UniqueName: \"kubernetes.io/projected/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-kube-api-access-xnm4l\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz"
Apr 17 16:31:56.308424 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.308350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz"
Apr 17 16:31:56.308424 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.308406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnm4l\" (UniqueName: \"kubernetes.io/projected/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-kube-api-access-xnm4l\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz"
Apr 17 16:31:56.308653 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.308427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrll\" (UniqueName: \"kubernetes.io/projected/20e90cc0-23dd-4714-b716-64ca208935e2-kube-api-access-9vrll\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc"
Apr 17 16:31:56.308653 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.308448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-tmp-dir\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz"
Apr 17 16:31:56.308653 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.308474 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-config-volume\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz"
Apr 17 16:31:56.308653 ip-10-0-128-217 kubenswrapper[2573]: I0417
16:31:56.308503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:31:56.308653 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.308517 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:56.308653 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.308593 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls podName:fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:56.808572656 +0000 UTC m=+34.140073536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls") pod "dns-default-jdmrz" (UID: "fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c") : secret "dns-default-metrics-tls" not found Apr 17 16:31:56.308653 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.308591 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:56.308653 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.308628 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert podName:20e90cc0-23dd-4714-b716-64ca208935e2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:56.808619574 +0000 UTC m=+34.140120456 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert") pod "ingress-canary-zb6kc" (UID: "20e90cc0-23dd-4714-b716-64ca208935e2") : secret "canary-serving-cert" not found Apr 17 16:31:56.309055 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.308859 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-tmp-dir\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:31:56.309146 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.309130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-config-volume\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:31:56.321411 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.321389 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnm4l\" (UniqueName: \"kubernetes.io/projected/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-kube-api-access-xnm4l\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:31:56.321540 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.321507 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrll\" (UniqueName: \"kubernetes.io/projected/20e90cc0-23dd-4714-b716-64ca208935e2-kube-api-access-9vrll\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:31:56.812058 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.812022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:31:56.812218 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.812074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:31:56.812218 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.812095 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:31:56.812218 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.812175 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:56.812218 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.812177 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:31:56.812218 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.812219 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls podName:fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:57.812206429 +0000 UTC m=+35.143707309 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls") pod "dns-default-jdmrz" (UID: "fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c") : secret "dns-default-metrics-tls" not found Apr 17 16:31:56.812376 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.812179 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:56.812376 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.812231 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs podName:0d9d52ff-d172-4b74-90ce-5ef0ac75662c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:28.812225768 +0000 UTC m=+66.143726647 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs") pod "network-metrics-daemon-zkmq8" (UID: "0d9d52ff-d172-4b74-90ce-5ef0ac75662c") : secret "metrics-daemon-secret" not found Apr 17 16:31:56.812376 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:56.812253 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert podName:20e90cc0-23dd-4714-b716-64ca208935e2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:57.812236539 +0000 UTC m=+35.143737419 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert") pod "ingress-canary-zb6kc" (UID: "20e90cc0-23dd-4714-b716-64ca208935e2") : secret "canary-serving-cert" not found Apr 17 16:31:56.913222 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.913188 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9"] Apr 17 16:31:56.936621 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.936591 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9"] Apr 17 16:31:56.936763 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.936699 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:56.939184 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.939159 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 16:31:56.939354 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.939335 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 16:31:56.939439 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.939411 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 16:31:56.940165 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.940124 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 16:31:56.940287 ip-10-0-128-217 
kubenswrapper[2573]: I0417 16:31:56.940194 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 16:31:56.940287 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.940233 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 16:31:56.940380 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:56.940364 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 16:31:57.013458 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.013416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrk6\" (UniqueName: \"kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6\") pod \"network-check-target-z2wfh\" (UID: \"10fbceca-37d7-4803-b22d-19039688034a\") " pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:57.015929 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.015911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zrk6\" (UniqueName: \"kubernetes.io/projected/10fbceca-37d7-4803-b22d-19039688034a-kube-api-access-6zrk6\") pod \"network-check-target-z2wfh\" (UID: \"10fbceca-37d7-4803-b22d-19039688034a\") " pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:57.086619 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.086602 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:31:57.114089 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.114056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.114237 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.114097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prn7k\" (UniqueName: \"kubernetes.io/projected/26f8780b-28b6-4679-8d3f-03a3a62e0358-kube-api-access-prn7k\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.114237 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.114125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/26f8780b-28b6-4679-8d3f-03a3a62e0358-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.114347 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.114243 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-ca\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.114347 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.114281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-hub\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.114347 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.114310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.215150 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.214941 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.215262 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.215192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prn7k\" (UniqueName: \"kubernetes.io/projected/26f8780b-28b6-4679-8d3f-03a3a62e0358-kube-api-access-prn7k\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.215262 
ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.215234 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/26f8780b-28b6-4679-8d3f-03a3a62e0358-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.215371 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.215315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-ca\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.215371 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.215340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-hub\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.215371 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.215362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.216189 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.216133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/26f8780b-28b6-4679-8d3f-03a3a62e0358-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.218853 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.218812 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.218968 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.218851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-ca\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.218968 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.218912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-hub\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.218968 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.218911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/26f8780b-28b6-4679-8d3f-03a3a62e0358-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.223230 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.223209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prn7k\" (UniqueName: \"kubernetes.io/projected/26f8780b-28b6-4679-8d3f-03a3a62e0358-kube-api-access-prn7k\") pod \"cluster-proxy-proxy-agent-bbcddcf55-khcj9\" (UID: \"26f8780b-28b6-4679-8d3f-03a3a62e0358\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.253893 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.253865 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z2wfh"] Apr 17 16:31:57.254059 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.253906 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:31:57.258858 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:57.258814 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10fbceca_37d7_4803_b22d_19039688034a.slice/crio-839d5ffe683491110f8ceac40ae16cd131f78c507ddbdade5a6bd6650e96c7c7 WatchSource:0}: Error finding container 839d5ffe683491110f8ceac40ae16cd131f78c507ddbdade5a6bd6650e96c7c7: Status 404 returned error can't find the container with id 839d5ffe683491110f8ceac40ae16cd131f78c507ddbdade5a6bd6650e96c7c7 Apr 17 16:31:57.310520 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.310484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z2wfh" event={"ID":"10fbceca-37d7-4803-b22d-19039688034a","Type":"ContainerStarted","Data":"839d5ffe683491110f8ceac40ae16cd131f78c507ddbdade5a6bd6650e96c7c7"} Apr 17 16:31:57.313807 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.313770 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerStarted","Data":"7939162174bd8069928d2dc3a1d5333d1eef641957dd06263e5558d40689ceba"} Apr 17 16:31:57.383808 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.383774 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9"] Apr 17 16:31:57.387842 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:31:57.387799 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26f8780b_28b6_4679_8d3f_03a3a62e0358.slice/crio-640ba54749702707c7218afec2f18d20ebd3727d6e01e9218943c1264c495cec WatchSource:0}: Error finding container 640ba54749702707c7218afec2f18d20ebd3727d6e01e9218943c1264c495cec: Status 404 returned error can't find the container with id 640ba54749702707c7218afec2f18d20ebd3727d6e01e9218943c1264c495cec Apr 17 16:31:57.821226 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.821190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:31:57.821401 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:57.821252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:31:57.821401 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:57.821333 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 17 16:31:57.821401 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:57.821392 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert podName:20e90cc0-23dd-4714-b716-64ca208935e2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:59.821376398 +0000 UTC m=+37.152877278 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert") pod "ingress-canary-zb6kc" (UID: "20e90cc0-23dd-4714-b716-64ca208935e2") : secret "canary-serving-cert" not found Apr 17 16:31:57.821523 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:57.821339 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:57.821523 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:57.821473 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls podName:fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:59.821459404 +0000 UTC m=+37.152960290 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls") pod "dns-default-jdmrz" (UID: "fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c") : secret "dns-default-metrics-tls" not found Apr 17 16:31:58.319586 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:58.319543 2573 generic.go:358] "Generic (PLEG): container finished" podID="6e5430f1-c021-4f4d-bedc-fafa1ec4d260" containerID="7939162174bd8069928d2dc3a1d5333d1eef641957dd06263e5558d40689ceba" exitCode=0 Apr 17 16:31:58.320108 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:58.319641 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerDied","Data":"7939162174bd8069928d2dc3a1d5333d1eef641957dd06263e5558d40689ceba"} Apr 17 16:31:58.321648 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:58.321142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" event={"ID":"26f8780b-28b6-4679-8d3f-03a3a62e0358","Type":"ContainerStarted","Data":"640ba54749702707c7218afec2f18d20ebd3727d6e01e9218943c1264c495cec"} Apr 17 16:31:59.326641 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:59.326423 2573 generic.go:358] "Generic (PLEG): container finished" podID="6e5430f1-c021-4f4d-bedc-fafa1ec4d260" containerID="f3de506c458628f57a5822c06298ff4aa24acc46ce72e57ccc81eb9564ef7067" exitCode=0 Apr 17 16:31:59.326641 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:59.326624 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerDied","Data":"f3de506c458628f57a5822c06298ff4aa24acc46ce72e57ccc81eb9564ef7067"} Apr 17 16:31:59.837431 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:59.837395 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:31:59.837667 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:31:59.837471 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:31:59.837667 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:59.837523 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:59.837667 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:59.837593 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:59.837667 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:59.837610 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert podName:20e90cc0-23dd-4714-b716-64ca208935e2 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:03.837591928 +0000 UTC m=+41.169092816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert") pod "ingress-canary-zb6kc" (UID: "20e90cc0-23dd-4714-b716-64ca208935e2") : secret "canary-serving-cert" not found Apr 17 16:31:59.837667 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:31:59.837637 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls podName:fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:03.83762499 +0000 UTC m=+41.169125869 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls") pod "dns-default-jdmrz" (UID: "fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c") : secret "dns-default-metrics-tls" not found Apr 17 16:32:01.331907 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:01.331872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" event={"ID":"26f8780b-28b6-4679-8d3f-03a3a62e0358","Type":"ContainerStarted","Data":"38f4007c0e9186c76be10b44ee4ac6a84b7caad0d5e53c3ae0117f8bac97b432"} Apr 17 16:32:01.334446 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:01.334422 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9j69g" event={"ID":"6e5430f1-c021-4f4d-bedc-fafa1ec4d260","Type":"ContainerStarted","Data":"7da048dfd265f90820493c017671843b44ed0af0802717c4ed287862c6f7b722"} Apr 17 16:32:01.361077 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:01.361026 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9j69g" podStartSLOduration=7.121755264 podStartE2EDuration="38.361012198s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:25.847294207 +0000 UTC m=+3.178795093" lastFinishedPulling="2026-04-17 16:31:57.086551129 +0000 UTC m=+34.418052027" observedRunningTime="2026-04-17 16:32:01.359541311 +0000 UTC m=+38.691042213" watchObservedRunningTime="2026-04-17 16:32:01.361012198 +0000 UTC m=+38.692513099" Apr 17 16:32:03.869330 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:03.869291 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod 
\"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:32:03.869715 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:03.869348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:32:03.869715 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:03.869436 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:03.869715 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:03.869440 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:03.869715 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:03.869486 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls podName:fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:11.869472743 +0000 UTC m=+49.200973623 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls") pod "dns-default-jdmrz" (UID: "fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c") : secret "dns-default-metrics-tls" not found Apr 17 16:32:03.869715 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:03.869499 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert podName:20e90cc0-23dd-4714-b716-64ca208935e2 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:11.869493514 +0000 UTC m=+49.200994395 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert") pod "ingress-canary-zb6kc" (UID: "20e90cc0-23dd-4714-b716-64ca208935e2") : secret "canary-serving-cert" not found Apr 17 16:32:04.341963 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:04.341927 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" event={"ID":"26f8780b-28b6-4679-8d3f-03a3a62e0358","Type":"ContainerStarted","Data":"936df703ac6a14987576cbba4b11353ef39612012e8df3a231cabed3caf6b711"} Apr 17 16:32:04.341963 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:04.341968 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" event={"ID":"26f8780b-28b6-4679-8d3f-03a3a62e0358","Type":"ContainerStarted","Data":"ce422af89e42d0e42505bd5a6f4d1995b367ae0a6b72a5052bec5d565e189fc0"} Apr 17 16:32:04.364071 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:04.363993 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" podStartSLOduration=2.418330052 podStartE2EDuration="8.363979109s" podCreationTimestamp="2026-04-17 16:31:56 +0000 UTC" firstStartedPulling="2026-04-17 16:31:57.389975571 +0000 UTC m=+34.721476455" lastFinishedPulling="2026-04-17 16:32:03.335624628 +0000 UTC m=+40.667125512" observedRunningTime="2026-04-17 16:32:04.363268832 +0000 UTC m=+41.694769728" watchObservedRunningTime="2026-04-17 16:32:04.363979109 +0000 UTC m=+41.695480040" Apr 17 16:32:07.786270 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:07.786224 2573 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob 
sha256:74a542d7b83fa2e0529c9b7eaa7b5ec994b842d89812cda7e6ac3ce130b53843: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb" Apr 17 16:32:07.786639 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:07.786424 2573 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:network-check-target-container,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb,Command:[cluster-network-check-target],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:K8S_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{15728640 0} {} 15Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6zrk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000560000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-check-target-z2wfh_openshift-network-diagnostics(10fbceca-37d7-4803-b22d-19039688034a): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:74a542d7b83fa2e0529c9b7eaa7b5ec994b842d89812cda7e6ac3ce130b53843: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 16:32:07.787617 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:07.787583 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-check-target-container\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:74a542d7b83fa2e0529c9b7eaa7b5ec994b842d89812cda7e6ac3ce130b53843: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a" Apr 17 16:32:08.350731 ip-10-0-128-217 kubenswrapper[2573]: E0417 
16:32:08.350703 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-check-target-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:321285dca5a2f9b911e21badd1e51ce49841ddc45c5c859b3a29f7982d7376cb\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:74a542d7b83fa2e0529c9b7eaa7b5ec994b842d89812cda7e6ac3ce130b53843: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-network-diagnostics/network-check-target-z2wfh" podUID="10fbceca-37d7-4803-b22d-19039688034a" Apr 17 16:32:11.926625 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:11.926580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:32:11.927046 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:11.926641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:32:11.927046 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:11.926722 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:11.927046 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:11.926726 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:11.927046 ip-10-0-128-217 kubenswrapper[2573]: E0417 
16:32:11.926786 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls podName:fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:27.926770841 +0000 UTC m=+65.258271721 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls") pod "dns-default-jdmrz" (UID: "fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c") : secret "dns-default-metrics-tls" not found Apr 17 16:32:11.927046 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:11.926800 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert podName:20e90cc0-23dd-4714-b716-64ca208935e2 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:27.926793452 +0000 UTC m=+65.258294332 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert") pod "ingress-canary-zb6kc" (UID: "20e90cc0-23dd-4714-b716-64ca208935e2") : secret "canary-serving-cert" not found Apr 17 16:32:22.308015 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:22.307986 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4s682" Apr 17 16:32:25.386300 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:25.386260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z2wfh" event={"ID":"10fbceca-37d7-4803-b22d-19039688034a","Type":"ContainerStarted","Data":"f29eb40349d88eb1714a0e39240ca72795826117cbb37392449b9a9925a78d11"} Apr 17 16:32:25.386765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:25.386472 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 
16:32:25.403635 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:25.403591 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-z2wfh" podStartSLOduration=34.825762462 podStartE2EDuration="1m2.403579837s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:31:57.261248519 +0000 UTC m=+34.592749399" lastFinishedPulling="2026-04-17 16:32:24.839065891 +0000 UTC m=+62.170566774" observedRunningTime="2026-04-17 16:32:25.402849951 +0000 UTC m=+62.734350854" watchObservedRunningTime="2026-04-17 16:32:25.403579837 +0000 UTC m=+62.735080739" Apr 17 16:32:27.931588 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:27.931528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:32:27.931588 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:27.931600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:32:27.932051 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:27.931695 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:27.932051 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:27.931700 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:27.932051 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:27.931758 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert podName:20e90cc0-23dd-4714-b716-64ca208935e2 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:59.931743914 +0000 UTC m=+97.263244794 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert") pod "ingress-canary-zb6kc" (UID: "20e90cc0-23dd-4714-b716-64ca208935e2") : secret "canary-serving-cert" not found Apr 17 16:32:27.932051 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:27.931771 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls podName:fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:59.931765393 +0000 UTC m=+97.263266274 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls") pod "dns-default-jdmrz" (UID: "fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c") : secret "dns-default-metrics-tls" not found Apr 17 16:32:28.837702 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:28.837644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:32:28.837993 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:28.837794 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:32:28.837993 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:28.837876 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs 
podName:0d9d52ff-d172-4b74-90ce-5ef0ac75662c nodeName:}" failed. No retries permitted until 2026-04-17 16:33:32.83786089 +0000 UTC m=+130.169361770 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs") pod "network-metrics-daemon-zkmq8" (UID: "0d9d52ff-d172-4b74-90ce-5ef0ac75662c") : secret "metrics-daemon-secret" not found Apr 17 16:32:56.390701 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:56.390572 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-z2wfh" Apr 17 16:32:59.962775 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:59.962745 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:32:59.963252 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:32:59.962795 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:32:59.963252 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:59.962902 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:59.963252 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:59.962905 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:59.963252 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:59.962961 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls podName:fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c nodeName:}" failed. No retries permitted until 2026-04-17 16:34:03.96294836 +0000 UTC m=+161.294449241 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls") pod "dns-default-jdmrz" (UID: "fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c") : secret "dns-default-metrics-tls" not found Apr 17 16:32:59.963252 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:32:59.962973 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert podName:20e90cc0-23dd-4714-b716-64ca208935e2 nodeName:}" failed. No retries permitted until 2026-04-17 16:34:03.962968111 +0000 UTC m=+161.294468990 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert") pod "ingress-canary-zb6kc" (UID: "20e90cc0-23dd-4714-b716-64ca208935e2") : secret "canary-serving-cert" not found Apr 17 16:33:32.894850 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:32.894789 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:33:32.895352 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:33:32.894937 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:33:32.895352 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:33:32.895004 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs 
podName:0d9d52ff-d172-4b74-90ce-5ef0ac75662c nodeName:}" failed. No retries permitted until 2026-04-17 16:35:34.894989473 +0000 UTC m=+252.226490352 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs") pod "network-metrics-daemon-zkmq8" (UID: "0d9d52ff-d172-4b74-90ce-5ef0ac75662c") : secret "metrics-daemon-secret" not found Apr 17 16:33:43.369607 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.369577 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb"] Apr 17 16:33:43.371426 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.371412 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb" Apr 17 16:33:43.373905 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.373873 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 16:33:43.373905 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.373901 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 16:33:43.374580 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.374565 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-gnjxd\"" Apr 17 16:33:43.380116 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.380093 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb"] Apr 17 16:33:43.472412 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.472376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-sprt2\" (UniqueName: \"kubernetes.io/projected/d1b9bfc8-7736-49f6-8463-ca6a7796d051-kube-api-access-sprt2\") pod \"migrator-74bb7799d9-zkjjb\" (UID: \"d1b9bfc8-7736-49f6-8463-ca6a7796d051\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb" Apr 17 16:33:43.572961 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.572923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sprt2\" (UniqueName: \"kubernetes.io/projected/d1b9bfc8-7736-49f6-8463-ca6a7796d051-kube-api-access-sprt2\") pod \"migrator-74bb7799d9-zkjjb\" (UID: \"d1b9bfc8-7736-49f6-8463-ca6a7796d051\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb" Apr 17 16:33:43.584019 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.583993 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sprt2\" (UniqueName: \"kubernetes.io/projected/d1b9bfc8-7736-49f6-8463-ca6a7796d051-kube-api-access-sprt2\") pod \"migrator-74bb7799d9-zkjjb\" (UID: \"d1b9bfc8-7736-49f6-8463-ca6a7796d051\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb" Apr 17 16:33:43.680934 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.680851 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb" Apr 17 16:33:43.791636 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:43.791609 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb"] Apr 17 16:33:43.794953 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:33:43.794923 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b9bfc8_7736_49f6_8463_ca6a7796d051.slice/crio-a7692e78db7ca63bb163c99fc534ec26b68b5b37427788131d3089851e42067c WatchSource:0}: Error finding container a7692e78db7ca63bb163c99fc534ec26b68b5b37427788131d3089851e42067c: Status 404 returned error can't find the container with id a7692e78db7ca63bb163c99fc534ec26b68b5b37427788131d3089851e42067c Apr 17 16:33:44.542499 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:44.542463 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb" event={"ID":"d1b9bfc8-7736-49f6-8463-ca6a7796d051","Type":"ContainerStarted","Data":"a7692e78db7ca63bb163c99fc534ec26b68b5b37427788131d3089851e42067c"} Apr 17 16:33:45.549384 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:45.549342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb" event={"ID":"d1b9bfc8-7736-49f6-8463-ca6a7796d051","Type":"ContainerStarted","Data":"e5f867cde993260bda7767facf118f02b178b0a0aac12ef517f831e7b542d275"} Apr 17 16:33:45.549384 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:45.549386 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb" event={"ID":"d1b9bfc8-7736-49f6-8463-ca6a7796d051","Type":"ContainerStarted","Data":"9ea4b08c8748629d9b711ec02429e6706f40bfb187f1e45434e60cf8f4f547d1"} Apr 17 16:33:45.565634 ip-10-0-128-217 
kubenswrapper[2573]: I0417 16:33:45.565580 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkjjb" podStartSLOduration=1.594669264 podStartE2EDuration="2.565564791s" podCreationTimestamp="2026-04-17 16:33:43 +0000 UTC" firstStartedPulling="2026-04-17 16:33:43.796632029 +0000 UTC m=+141.128132909" lastFinishedPulling="2026-04-17 16:33:44.767527539 +0000 UTC m=+142.099028436" observedRunningTime="2026-04-17 16:33:45.564887891 +0000 UTC m=+142.896388794" watchObservedRunningTime="2026-04-17 16:33:45.565564791 +0000 UTC m=+142.897065704" Apr 17 16:33:46.592677 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:46.592650 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hqjjj_a4e3fb1c-519b-4c02-9326-fd056001ad1b/dns-node-resolver/0.log" Apr 17 16:33:47.998476 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:47.998448 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wtvtv_66fec0a1-a09d-4a76-b857-2877ab654053/node-ca/0.log" Apr 17 16:33:48.998975 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:48.998945 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zkjjb_d1b9bfc8-7736-49f6-8463-ca6a7796d051/migrator/0.log" Apr 17 16:33:49.194766 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:49.194739 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zkjjb_d1b9bfc8-7736-49f6-8463-ca6a7796d051/graceful-termination/0.log" Apr 17 16:33:59.110350 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:33:59.110296 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jdmrz" podUID="fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c" Apr 17 
16:33:59.117271 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:33:59.117246 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zb6kc" podUID="20e90cc0-23dd-4714-b716-64ca208935e2" Apr 17 16:33:59.181384 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:33:59.181349 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zkmq8" podUID="0d9d52ff-d172-4b74-90ce-5ef0ac75662c" Apr 17 16:33:59.585496 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:59.585465 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:33:59.585669 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:33:59.585466 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jdmrz" Apr 17 16:34:04.009488 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.009432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:34:04.009930 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.009525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:34:04.011895 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.011869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c-metrics-tls\") pod \"dns-default-jdmrz\" (UID: \"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c\") " pod="openshift-dns/dns-default-jdmrz" Apr 17 16:34:04.012007 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.011990 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e90cc0-23dd-4714-b716-64ca208935e2-cert\") pod \"ingress-canary-zb6kc\" (UID: \"20e90cc0-23dd-4714-b716-64ca208935e2\") " pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:34:04.096126 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.096083 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f96dz\"" Apr 17 16:34:04.097136 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.097118 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-txqpq\"" 
Apr 17 16:34:04.106108 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.106075 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jdmrz" Apr 17 16:34:04.106108 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.106095 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zb6kc" Apr 17 16:34:04.232384 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.232353 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jdmrz"] Apr 17 16:34:04.236138 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:34:04.236112 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed429ae_fb2e_4ed3_9dc4_7d01dec6bb2c.slice/crio-5fb458d5a02f962012d9b3c40b0f8edc2345e83244397ccc2914d0e389d94b56 WatchSource:0}: Error finding container 5fb458d5a02f962012d9b3c40b0f8edc2345e83244397ccc2914d0e389d94b56: Status 404 returned error can't find the container with id 5fb458d5a02f962012d9b3c40b0f8edc2345e83244397ccc2914d0e389d94b56 Apr 17 16:34:04.249743 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.249710 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zb6kc"] Apr 17 16:34:04.253358 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:34:04.253332 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e90cc0_23dd_4714_b716_64ca208935e2.slice/crio-7975d83b08856f53769df1bc1ec51efa67aa74eed456e675fc8b16c90c907cbb WatchSource:0}: Error finding container 7975d83b08856f53769df1bc1ec51efa67aa74eed456e675fc8b16c90c907cbb: Status 404 returned error can't find the container with id 7975d83b08856f53769df1bc1ec51efa67aa74eed456e675fc8b16c90c907cbb Apr 17 16:34:04.598891 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.598773 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zb6kc" event={"ID":"20e90cc0-23dd-4714-b716-64ca208935e2","Type":"ContainerStarted","Data":"7975d83b08856f53769df1bc1ec51efa67aa74eed456e675fc8b16c90c907cbb"} Apr 17 16:34:04.600599 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:04.600555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jdmrz" event={"ID":"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c","Type":"ContainerStarted","Data":"5fb458d5a02f962012d9b3c40b0f8edc2345e83244397ccc2914d0e389d94b56"} Apr 17 16:34:06.606954 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:06.606916 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jdmrz" event={"ID":"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c","Type":"ContainerStarted","Data":"3f2400ce1e4030f5d281e8a27b30a608d98370412f2fa82a880ff706e66574c7"} Apr 17 16:34:06.606954 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:06.606954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jdmrz" event={"ID":"fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c","Type":"ContainerStarted","Data":"4e52cb84fcd795c34a1e452d79e9f3c50ddb3d16a0f0f504dfbc9d5d801797ff"} Apr 17 16:34:06.607427 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:06.607103 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jdmrz" Apr 17 16:34:06.608154 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:06.608124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zb6kc" event={"ID":"20e90cc0-23dd-4714-b716-64ca208935e2","Type":"ContainerStarted","Data":"b204523d182db9c1c23b5c164a88d05024f692a8827b09f5c4172dcac3e14f6b"} Apr 17 16:34:06.627737 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:06.627701 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jdmrz" podStartSLOduration=128.925780327 
podStartE2EDuration="2m10.627689781s" podCreationTimestamp="2026-04-17 16:31:56 +0000 UTC" firstStartedPulling="2026-04-17 16:34:04.237961878 +0000 UTC m=+161.569462758" lastFinishedPulling="2026-04-17 16:34:05.939871326 +0000 UTC m=+163.271372212" observedRunningTime="2026-04-17 16:34:06.627383962 +0000 UTC m=+163.958884863" watchObservedRunningTime="2026-04-17 16:34:06.627689781 +0000 UTC m=+163.959190683" Apr 17 16:34:09.232659 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.232614 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zb6kc" podStartSLOduration=131.544868363 podStartE2EDuration="2m13.232597477s" podCreationTimestamp="2026-04-17 16:31:56 +0000 UTC" firstStartedPulling="2026-04-17 16:34:04.255420121 +0000 UTC m=+161.586921003" lastFinishedPulling="2026-04-17 16:34:05.943149233 +0000 UTC m=+163.274650117" observedRunningTime="2026-04-17 16:34:06.648307676 +0000 UTC m=+163.979808589" watchObservedRunningTime="2026-04-17 16:34:09.232597477 +0000 UTC m=+166.564098407" Apr 17 16:34:09.233180 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.233163 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7b9dc"] Apr 17 16:34:09.236072 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.236058 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.239750 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.239725 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:34:09.239885 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.239756 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:34:09.239885 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.239733 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:34:09.239885 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.239733 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:34:09.239885 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.239772 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-m9k6s\"" Apr 17 16:34:09.259291 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.259260 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7b9dc"] Apr 17 16:34:09.278292 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.278259 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6598b846c9-zgtcv"] Apr 17 16:34:09.280996 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.280977 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.283926 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.283898 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 16:34:09.284065 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.284044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 16:34:09.285162 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.285093 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nd5dg\"" Apr 17 16:34:09.285472 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.285373 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 16:34:09.289577 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.289558 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 16:34:09.295392 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.295368 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6598b846c9-zgtcv"] Apr 17 16:34:09.345321 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/11a75458-c09d-4d9f-8d73-5085ca8421a0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.345321 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345322 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-kswsv\" (UniqueName: \"kubernetes.io/projected/11a75458-c09d-4d9f-8d73-5085ca8421a0-kube-api-access-kswsv\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.345543 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bfb79489-ea31-453d-8166-9a00d7bf66e1-image-registry-private-configuration\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.345543 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345363 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bfb79489-ea31-453d-8166-9a00d7bf66e1-bound-sa-token\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.345543 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bfb79489-ea31-453d-8166-9a00d7bf66e1-registry-tls\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.345543 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/11a75458-c09d-4d9f-8d73-5085ca8421a0-crio-socket\") pod 
\"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.345543 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bfb79489-ea31-453d-8166-9a00d7bf66e1-registry-certificates\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.345543 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfb79489-ea31-453d-8166-9a00d7bf66e1-trusted-ca\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.345543 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345542 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11a75458-c09d-4d9f-8d73-5085ca8421a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.345741 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345561 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/11a75458-c09d-4d9f-8d73-5085ca8421a0-data-volume\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.345741 ip-10-0-128-217 
kubenswrapper[2573]: I0417 16:34:09.345578 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bfb79489-ea31-453d-8166-9a00d7bf66e1-ca-trust-extracted\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.345741 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345619 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx8vj\" (UniqueName: \"kubernetes.io/projected/bfb79489-ea31-453d-8166-9a00d7bf66e1-kube-api-access-rx8vj\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.345741 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.345646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bfb79489-ea31-453d-8166-9a00d7bf66e1-installation-pull-secrets\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.446758 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.446721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bfb79489-ea31-453d-8166-9a00d7bf66e1-registry-tls\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.446758 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.446760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/11a75458-c09d-4d9f-8d73-5085ca8421a0-crio-socket\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.446991 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.446781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bfb79489-ea31-453d-8166-9a00d7bf66e1-registry-certificates\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.446991 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.446802 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfb79489-ea31-453d-8166-9a00d7bf66e1-trusted-ca\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.446991 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.446881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/11a75458-c09d-4d9f-8d73-5085ca8421a0-crio-socket\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.447098 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11a75458-c09d-4d9f-8d73-5085ca8421a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.447098 ip-10-0-128-217 
kubenswrapper[2573]: I0417 16:34:09.447048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/11a75458-c09d-4d9f-8d73-5085ca8421a0-data-volume\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.447098 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447072 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bfb79489-ea31-453d-8166-9a00d7bf66e1-ca-trust-extracted\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.447210 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8vj\" (UniqueName: \"kubernetes.io/projected/bfb79489-ea31-453d-8166-9a00d7bf66e1-kube-api-access-rx8vj\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.447210 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bfb79489-ea31-453d-8166-9a00d7bf66e1-installation-pull-secrets\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.447293 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/11a75458-c09d-4d9f-8d73-5085ca8421a0-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.447293 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kswsv\" (UniqueName: \"kubernetes.io/projected/11a75458-c09d-4d9f-8d73-5085ca8421a0-kube-api-access-kswsv\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.447386 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bfb79489-ea31-453d-8166-9a00d7bf66e1-image-registry-private-configuration\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.447386 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bfb79489-ea31-453d-8166-9a00d7bf66e1-bound-sa-token\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.447468 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447439 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/11a75458-c09d-4d9f-8d73-5085ca8421a0-data-volume\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.448090 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447916 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfb79489-ea31-453d-8166-9a00d7bf66e1-trusted-ca\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.448090 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447916 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bfb79489-ea31-453d-8166-9a00d7bf66e1-registry-certificates\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.448090 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.447957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/11a75458-c09d-4d9f-8d73-5085ca8421a0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.448090 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.448084 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bfb79489-ea31-453d-8166-9a00d7bf66e1-ca-trust-extracted\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.449579 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.449553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bfb79489-ea31-453d-8166-9a00d7bf66e1-registry-tls\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " 
pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.449677 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.449611 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11a75458-c09d-4d9f-8d73-5085ca8421a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.449753 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.449730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bfb79489-ea31-453d-8166-9a00d7bf66e1-image-registry-private-configuration\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.450102 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.450083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bfb79489-ea31-453d-8166-9a00d7bf66e1-installation-pull-secrets\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.456974 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.456948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kswsv\" (UniqueName: \"kubernetes.io/projected/11a75458-c09d-4d9f-8d73-5085ca8421a0-kube-api-access-kswsv\") pod \"insights-runtime-extractor-7b9dc\" (UID: \"11a75458-c09d-4d9f-8d73-5085ca8421a0\") " pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.457221 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.457195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bfb79489-ea31-453d-8166-9a00d7bf66e1-bound-sa-token\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.457565 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.457549 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8vj\" (UniqueName: \"kubernetes.io/projected/bfb79489-ea31-453d-8166-9a00d7bf66e1-kube-api-access-rx8vj\") pod \"image-registry-6598b846c9-zgtcv\" (UID: \"bfb79489-ea31-453d-8166-9a00d7bf66e1\") " pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.545748 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.545660 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7b9dc" Apr 17 16:34:09.593951 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.593897 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:09.669636 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.669600 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7b9dc"] Apr 17 16:34:09.672806 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:34:09.672775 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a75458_c09d_4d9f_8d73_5085ca8421a0.slice/crio-97da3c73a026f62dd5f92ae78d9a454077541c9c45936d417b105c91fae2c8c5 WatchSource:0}: Error finding container 97da3c73a026f62dd5f92ae78d9a454077541c9c45936d417b105c91fae2c8c5: Status 404 returned error can't find the container with id 97da3c73a026f62dd5f92ae78d9a454077541c9c45936d417b105c91fae2c8c5 Apr 17 16:34:09.724751 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:09.724719 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6598b846c9-zgtcv"] Apr 17 16:34:09.728872 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:34:09.728846 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfb79489_ea31_453d_8166_9a00d7bf66e1.slice/crio-98535017c312c0bc04a28eb1783f601e6939cd0ad0653c4ba1b0f88555d8aaa4 WatchSource:0}: Error finding container 98535017c312c0bc04a28eb1783f601e6939cd0ad0653c4ba1b0f88555d8aaa4: Status 404 returned error can't find the container with id 98535017c312c0bc04a28eb1783f601e6939cd0ad0653c4ba1b0f88555d8aaa4 Apr 17 16:34:10.618402 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:10.618310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" event={"ID":"bfb79489-ea31-453d-8166-9a00d7bf66e1","Type":"ContainerStarted","Data":"b0bdb3f54eae864db71a2dd5f265003d8df85aab79175acb37d27bca77bac16b"} Apr 17 16:34:10.618402 ip-10-0-128-217 
kubenswrapper[2573]: I0417 16:34:10.618351 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" event={"ID":"bfb79489-ea31-453d-8166-9a00d7bf66e1","Type":"ContainerStarted","Data":"98535017c312c0bc04a28eb1783f601e6939cd0ad0653c4ba1b0f88555d8aaa4"} Apr 17 16:34:10.618900 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:10.618409 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:10.619966 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:10.619947 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7b9dc" event={"ID":"11a75458-c09d-4d9f-8d73-5085ca8421a0","Type":"ContainerStarted","Data":"c990bb449a98abf66b5ff5ac00a560ce63bee22911eb9ac11627cb91b652c13f"} Apr 17 16:34:10.620044 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:10.619969 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7b9dc" event={"ID":"11a75458-c09d-4d9f-8d73-5085ca8421a0","Type":"ContainerStarted","Data":"3ba3538ae4f67144362a3aca4ee3db1343c9571d5a0df5ca7a414a28c3f2c3dc"} Apr 17 16:34:10.620044 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:10.619977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7b9dc" event={"ID":"11a75458-c09d-4d9f-8d73-5085ca8421a0","Type":"ContainerStarted","Data":"97da3c73a026f62dd5f92ae78d9a454077541c9c45936d417b105c91fae2c8c5"} Apr 17 16:34:10.641250 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:10.641193 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" podStartSLOduration=1.6411801339999998 podStartE2EDuration="1.641180134s" podCreationTimestamp="2026-04-17 16:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:34:10.640964427 +0000 UTC m=+167.972465330" watchObservedRunningTime="2026-04-17 16:34:10.641180134 +0000 UTC m=+167.972681027" Apr 17 16:34:12.626976 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:12.626943 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7b9dc" event={"ID":"11a75458-c09d-4d9f-8d73-5085ca8421a0","Type":"ContainerStarted","Data":"92ab4061027fc675981b6a03bdbb25f4c49717277dd879486be596eeac2f0689"} Apr 17 16:34:12.646206 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:12.646153 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7b9dc" podStartSLOduration=1.6028809050000001 podStartE2EDuration="3.646138986s" podCreationTimestamp="2026-04-17 16:34:09 +0000 UTC" firstStartedPulling="2026-04-17 16:34:09.735309273 +0000 UTC m=+167.066810153" lastFinishedPulling="2026-04-17 16:34:11.778567352 +0000 UTC m=+169.110068234" observedRunningTime="2026-04-17 16:34:12.645674606 +0000 UTC m=+169.977175508" watchObservedRunningTime="2026-04-17 16:34:12.646138986 +0000 UTC m=+169.977639887" Apr 17 16:34:14.170620 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:14.170583 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:34:16.612949 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:16.612921 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jdmrz" Apr 17 16:34:19.471045 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.471009 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-sv26b"] Apr 17 16:34:19.475554 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.475537 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b" Apr 17 16:34:19.477891 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.477870 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 16:34:19.477891 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.477883 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-nnfpx\"" Apr 17 16:34:19.478014 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.477883 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 16:34:19.478784 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.478770 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:34:19.478844 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.478807 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:34:19.478883 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.478867 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:34:19.484232 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.484210 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-sv26b"] Apr 17 16:34:19.518841 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.518793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d717c3f3-1354-407d-bd66-ab20ed5781e7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.518985 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.518845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d717c3f3-1354-407d-bd66-ab20ed5781e7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.518985 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.518874 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45x6q\" (UniqueName: \"kubernetes.io/projected/d717c3f3-1354-407d-bd66-ab20ed5781e7-kube-api-access-45x6q\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.518985 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.518917 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d717c3f3-1354-407d-bd66-ab20ed5781e7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.619773 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.619743 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d717c3f3-1354-407d-bd66-ab20ed5781e7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.619773 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.619774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d717c3f3-1354-407d-bd66-ab20ed5781e7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.619971 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.619791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45x6q\" (UniqueName: \"kubernetes.io/projected/d717c3f3-1354-407d-bd66-ab20ed5781e7-kube-api-access-45x6q\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.619971 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.619931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d717c3f3-1354-407d-bd66-ab20ed5781e7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.620585 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.620561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d717c3f3-1354-407d-bd66-ab20ed5781e7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.622149 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.622128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d717c3f3-1354-407d-bd66-ab20ed5781e7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.622255 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.622195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d717c3f3-1354-407d-bd66-ab20ed5781e7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.627679 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.627657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45x6q\" (UniqueName: \"kubernetes.io/projected/d717c3f3-1354-407d-bd66-ab20ed5781e7-kube-api-access-45x6q\") pod \"prometheus-operator-5676c8c784-sv26b\" (UID: \"d717c3f3-1354-407d-bd66-ab20ed5781e7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.784747 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.784658 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b"
Apr 17 16:34:19.903180 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:19.903149 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-sv26b"]
Apr 17 16:34:19.906503 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:34:19.906474 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd717c3f3_1354_407d_bd66_ab20ed5781e7.slice/crio-709956bbd7c59e95869e18b0a31877680766955f1d5486854d9f14f0a6e1945f WatchSource:0}: Error finding container 709956bbd7c59e95869e18b0a31877680766955f1d5486854d9f14f0a6e1945f: Status 404 returned error can't find the container with id 709956bbd7c59e95869e18b0a31877680766955f1d5486854d9f14f0a6e1945f
Apr 17 16:34:20.648061 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:20.648010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b" event={"ID":"d717c3f3-1354-407d-bd66-ab20ed5781e7","Type":"ContainerStarted","Data":"709956bbd7c59e95869e18b0a31877680766955f1d5486854d9f14f0a6e1945f"}
Apr 17 16:34:21.651735 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:21.651699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b" event={"ID":"d717c3f3-1354-407d-bd66-ab20ed5781e7","Type":"ContainerStarted","Data":"2761fa827fa31d482dd2d0b4627b5c1cc8b9566a345f2e2305275223468b36a1"}
Apr 17 16:34:21.651735 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:21.651736 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b" event={"ID":"d717c3f3-1354-407d-bd66-ab20ed5781e7","Type":"ContainerStarted","Data":"f53801ac9d8e5e6bad2509ab470587137607819a95f5f6ac45a411686c09076c"}
Apr 17 16:34:21.669079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:21.669031 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-sv26b" podStartSLOduration=1.53505504 podStartE2EDuration="2.66901495s" podCreationTimestamp="2026-04-17 16:34:19 +0000 UTC" firstStartedPulling="2026-04-17 16:34:19.908925347 +0000 UTC m=+177.240426227" lastFinishedPulling="2026-04-17 16:34:21.042885246 +0000 UTC m=+178.374386137" observedRunningTime="2026-04-17 16:34:21.668145062 +0000 UTC m=+178.999645970" watchObservedRunningTime="2026-04-17 16:34:21.66901495 +0000 UTC m=+179.000515853"
Apr 17 16:34:23.940195 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:23.938716 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bbsgj"]
Apr 17 16:34:23.943558 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:23.943530 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:23.948367 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:23.948346 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hr7mt\""
Apr 17 16:34:23.948502 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:23.948435 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 16:34:23.948683 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:23.948666 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 16:34:23.948928 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:23.948909 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 16:34:24.054400 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.054362 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/37301673-ea1b-4db5-8fb2-107b9ee330de-root\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.054560 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.054413 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnb7p\" (UniqueName: \"kubernetes.io/projected/37301673-ea1b-4db5-8fb2-107b9ee330de-kube-api-access-tnb7p\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.054560 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.054454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-tls\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.054560 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.054471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.054560 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.054527 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-accelerators-collector-config\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.054686 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.054579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37301673-ea1b-4db5-8fb2-107b9ee330de-metrics-client-ca\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.054686 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.054608 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-textfile\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.054686 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.054633 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37301673-ea1b-4db5-8fb2-107b9ee330de-sys\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.054686 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.054648 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-wtmp\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.155551 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-tls\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.155551 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.155787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-accelerators-collector-config\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.155787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37301673-ea1b-4db5-8fb2-107b9ee330de-metrics-client-ca\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.155787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-textfile\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.155787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37301673-ea1b-4db5-8fb2-107b9ee330de-sys\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.155787 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:34:24.155692 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 16:34:24.155787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155709 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-wtmp\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.155787 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:34:24.155757 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-tls podName:37301673-ea1b-4db5-8fb2-107b9ee330de nodeName:}" failed. No retries permitted until 2026-04-17 16:34:24.655738489 +0000 UTC m=+181.987239369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-tls") pod "node-exporter-bbsgj" (UID: "37301673-ea1b-4db5-8fb2-107b9ee330de") : secret "node-exporter-tls" not found
Apr 17 16:34:24.155787 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/37301673-ea1b-4db5-8fb2-107b9ee330de-root\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.156173 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155816 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37301673-ea1b-4db5-8fb2-107b9ee330de-sys\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.156173 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnb7p\" (UniqueName: \"kubernetes.io/projected/37301673-ea1b-4db5-8fb2-107b9ee330de-kube-api-access-tnb7p\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.156173 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-wtmp\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.156173 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.155949 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/37301673-ea1b-4db5-8fb2-107b9ee330de-root\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.156348 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.156318 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-textfile\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.156535 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.156515 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-accelerators-collector-config\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.156608 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.156594 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/37301673-ea1b-4db5-8fb2-107b9ee330de-metrics-client-ca\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.158220 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.158198 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.174105 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.174076 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnb7p\" (UniqueName: \"kubernetes.io/projected/37301673-ea1b-4db5-8fb2-107b9ee330de-kube-api-access-tnb7p\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.659496 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:24.659463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-tls\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:24.659673 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:34:24.659600 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 16:34:24.659673 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:34:24.659654 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-tls podName:37301673-ea1b-4db5-8fb2-107b9ee330de nodeName:}" failed. No retries permitted until 2026-04-17 16:34:25.659639632 +0000 UTC m=+182.991140513 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-tls") pod "node-exporter-bbsgj" (UID: "37301673-ea1b-4db5-8fb2-107b9ee330de") : secret "node-exporter-tls" not found
Apr 17 16:34:25.668077 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:25.668041 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-tls\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:25.670498 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:25.670474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/37301673-ea1b-4db5-8fb2-107b9ee330de-node-exporter-tls\") pod \"node-exporter-bbsgj\" (UID: \"37301673-ea1b-4db5-8fb2-107b9ee330de\") " pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:25.752668 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:25.752635 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bbsgj"
Apr 17 16:34:25.761396 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:34:25.761365 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37301673_ea1b_4db5_8fb2_107b9ee330de.slice/crio-06dfa29e3eff5426e4eb4a793772d1a7e9f01192bc73795bcd20c8e8c382ecb9 WatchSource:0}: Error finding container 06dfa29e3eff5426e4eb4a793772d1a7e9f01192bc73795bcd20c8e8c382ecb9: Status 404 returned error can't find the container with id 06dfa29e3eff5426e4eb4a793772d1a7e9f01192bc73795bcd20c8e8c382ecb9
Apr 17 16:34:26.052551 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.052514 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5895cb458-4gd9v"]
Apr 17 16:34:26.057251 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.057218 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.068534 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.068505 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 16:34:26.069186 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.068895 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-ffp7f\""
Apr 17 16:34:26.069186 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.068801 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-90tpa2q0unrjj\""
Apr 17 16:34:26.069716 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.069697 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 16:34:26.069863 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.069755 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 16:34:26.069941 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.069911 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 16:34:26.070098 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.070058 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 16:34:26.092481 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.092444 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5895cb458-4gd9v"]
Apr 17 16:34:26.171689 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.171658 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-grpc-tls\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.171891 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.171696 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.171891 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.171727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01879656-f233-4f71-af88-26fb36ea40f9-metrics-client-ca\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.171891 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.171781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-tls\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.171891 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.171812 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.171891 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.171888 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.172058 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.171913 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sxzv\" (UniqueName: \"kubernetes.io/projected/01879656-f233-4f71-af88-26fb36ea40f9-kube-api-access-8sxzv\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.172058 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.171958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.272926 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.272882 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-tls\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.273100 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.272938 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.273100 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.273002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.273100 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.273039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sxzv\" (UniqueName: \"kubernetes.io/projected/01879656-f233-4f71-af88-26fb36ea40f9-kube-api-access-8sxzv\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.273100 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.273089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.273308 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.273124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-grpc-tls\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.273308 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.273153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.273308 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.273192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01879656-f233-4f71-af88-26fb36ea40f9-metrics-client-ca\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.274469 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.274432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01879656-f233-4f71-af88-26fb36ea40f9-metrics-client-ca\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.276369 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.276341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.276504 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.276346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.276504 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.276492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-tls\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.276759 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.276735 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.276855 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.276812 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.277273 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.277249 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/01879656-f233-4f71-af88-26fb36ea40f9-secret-grpc-tls\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.296261 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.296227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sxzv\" (UniqueName: \"kubernetes.io/projected/01879656-f233-4f71-af88-26fb36ea40f9-kube-api-access-8sxzv\") pod \"thanos-querier-5895cb458-4gd9v\" (UID: \"01879656-f233-4f71-af88-26fb36ea40f9\") " pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.366472 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.366388 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v"
Apr 17 16:34:26.632866 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.632773 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5895cb458-4gd9v"]
Apr 17 16:34:26.635893 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:34:26.635863 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01879656_f233_4f71_af88_26fb36ea40f9.slice/crio-d1e67c9ce7df95b7245df7c133e59e49bedd8e09bd3df57d44169a24ceeabd27 WatchSource:0}: Error finding container d1e67c9ce7df95b7245df7c133e59e49bedd8e09bd3df57d44169a24ceeabd27: Status 404 returned error can't find the container with id d1e67c9ce7df95b7245df7c133e59e49bedd8e09bd3df57d44169a24ceeabd27
Apr 17 16:34:26.666357 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.666321 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" event={"ID":"01879656-f233-4f71-af88-26fb36ea40f9","Type":"ContainerStarted","Data":"d1e67c9ce7df95b7245df7c133e59e49bedd8e09bd3df57d44169a24ceeabd27"}
Apr 17 16:34:26.667752 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.667729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bbsgj" event={"ID":"37301673-ea1b-4db5-8fb2-107b9ee330de","Type":"ContainerStarted","Data":"3b91547a6191949d175a8c579884c73b5beac076751f5cfd90d6ed9b9797b4a4"}
Apr 17 16:34:26.667883 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:26.667757 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bbsgj" event={"ID":"37301673-ea1b-4db5-8fb2-107b9ee330de","Type":"ContainerStarted","Data":"06dfa29e3eff5426e4eb4a793772d1a7e9f01192bc73795bcd20c8e8c382ecb9"}
Apr 17 16:34:27.671441 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:27.671409 2573 generic.go:358] "Generic (PLEG): container finished" podID="37301673-ea1b-4db5-8fb2-107b9ee330de" containerID="3b91547a6191949d175a8c579884c73b5beac076751f5cfd90d6ed9b9797b4a4" exitCode=0
Apr 17 16:34:27.671853 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:27.671468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bbsgj" event={"ID":"37301673-ea1b-4db5-8fb2-107b9ee330de","Type":"ContainerDied","Data":"3b91547a6191949d175a8c579884c73b5beac076751f5cfd90d6ed9b9797b4a4"}
Apr 17 16:34:28.675398 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:28.675306 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" event={"ID":"01879656-f233-4f71-af88-26fb36ea40f9","Type":"ContainerStarted","Data":"7f5244bbb64215babbdb2028d8b38f3fd5998518f16703dd4a45595f25a6928d"}
Apr 17 16:34:28.675398 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:28.675345 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" event={"ID":"01879656-f233-4f71-af88-26fb36ea40f9","Type":"ContainerStarted","Data":"26d5b050ff2fcc28408dbd3bf063814e142f0cb47025bd9ae4728fc512bb3767"}
Apr 17 16:34:28.675398 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:28.675356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" event={"ID":"01879656-f233-4f71-af88-26fb36ea40f9","Type":"ContainerStarted","Data":"61bb848a0bcda503cfb32e7fe0b2157163197ba5c299febc6aaa917927bec86e"}
Apr 17 16:34:28.677027 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:28.677001 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bbsgj" event={"ID":"37301673-ea1b-4db5-8fb2-107b9ee330de","Type":"ContainerStarted","Data":"719b5275cd3b16d54ce5be4dd14458ae31b029f84ff3da4128f53ba816b0320c"}
Apr 17 16:34:28.677128 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:28.677036 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/node-exporter-bbsgj" event={"ID":"37301673-ea1b-4db5-8fb2-107b9ee330de","Type":"ContainerStarted","Data":"4603694f8a3b169e4b81aa6624f4e99db21ea5cc2167d970c5f49847c2cdb7c1"} Apr 17 16:34:28.705305 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:28.705253 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bbsgj" podStartSLOduration=4.931053347 podStartE2EDuration="5.705227746s" podCreationTimestamp="2026-04-17 16:34:23 +0000 UTC" firstStartedPulling="2026-04-17 16:34:25.763358748 +0000 UTC m=+183.094859629" lastFinishedPulling="2026-04-17 16:34:26.537533145 +0000 UTC m=+183.869034028" observedRunningTime="2026-04-17 16:34:28.704713735 +0000 UTC m=+186.036214636" watchObservedRunningTime="2026-04-17 16:34:28.705227746 +0000 UTC m=+186.036728648" Apr 17 16:34:29.682862 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:29.682757 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" event={"ID":"01879656-f233-4f71-af88-26fb36ea40f9","Type":"ContainerStarted","Data":"5173b0167f0eb6d1755e171c6806094e4c43e43297691ce8720bb73ef93d8b8f"} Apr 17 16:34:29.682862 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:29.682792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" event={"ID":"01879656-f233-4f71-af88-26fb36ea40f9","Type":"ContainerStarted","Data":"b05f2729a452b259e2b761943f98cc6a73cc9ab944872fa6199e067683bbea2f"} Apr 17 16:34:29.682862 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:29.682802 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" event={"ID":"01879656-f233-4f71-af88-26fb36ea40f9","Type":"ContainerStarted","Data":"7c39c3552dc8087dd8e408fa09e2507f63bcfda61fe19b7049c12a45d3fe18fb"} Apr 17 16:34:29.683342 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:29.682951 2573 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" Apr 17 16:34:29.710002 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:29.709950 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" podStartSLOduration=1.156656351 podStartE2EDuration="3.709936779s" podCreationTimestamp="2026-04-17 16:34:26 +0000 UTC" firstStartedPulling="2026-04-17 16:34:26.637773861 +0000 UTC m=+183.969274750" lastFinishedPulling="2026-04-17 16:34:29.191054295 +0000 UTC m=+186.522555178" observedRunningTime="2026-04-17 16:34:29.7090053 +0000 UTC m=+187.040506202" watchObservedRunningTime="2026-04-17 16:34:29.709936779 +0000 UTC m=+187.041437680" Apr 17 16:34:31.626694 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:31.626664 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6598b846c9-zgtcv" Apr 17 16:34:35.693672 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:35.693636 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5895cb458-4gd9v" Apr 17 16:34:47.255674 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:47.255610 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" podUID="26f8780b-28b6-4679-8d3f-03a3a62e0358" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:34:57.254853 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:34:57.254797 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" podUID="26f8780b-28b6-4679-8d3f-03a3a62e0358" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:35:07.255479 ip-10-0-128-217 kubenswrapper[2573]: 
I0417 16:35:07.255433 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" podUID="26f8780b-28b6-4679-8d3f-03a3a62e0358" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:35:07.255925 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:07.255514 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" Apr 17 16:35:07.255999 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:07.255968 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"936df703ac6a14987576cbba4b11353ef39612012e8df3a231cabed3caf6b711"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 16:35:07.256043 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:07.256028 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" podUID="26f8780b-28b6-4679-8d3f-03a3a62e0358" containerName="service-proxy" containerID="cri-o://936df703ac6a14987576cbba4b11353ef39612012e8df3a231cabed3caf6b711" gracePeriod=30 Apr 17 16:35:07.789234 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:07.789200 2573 generic.go:358] "Generic (PLEG): container finished" podID="26f8780b-28b6-4679-8d3f-03a3a62e0358" containerID="936df703ac6a14987576cbba4b11353ef39612012e8df3a231cabed3caf6b711" exitCode=2 Apr 17 16:35:07.789454 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:07.789277 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" 
event={"ID":"26f8780b-28b6-4679-8d3f-03a3a62e0358","Type":"ContainerDied","Data":"936df703ac6a14987576cbba4b11353ef39612012e8df3a231cabed3caf6b711"} Apr 17 16:35:07.789454 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:07.789336 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bbcddcf55-khcj9" event={"ID":"26f8780b-28b6-4679-8d3f-03a3a62e0358","Type":"ContainerStarted","Data":"1fa6e417f932a0178608ba340065411b48384e285bd7482360d0460d2b9b2309"} Apr 17 16:35:34.914535 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:34.914476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:35:34.916743 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:34.916720 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d9d52ff-d172-4b74-90ce-5ef0ac75662c-metrics-certs\") pod \"network-metrics-daemon-zkmq8\" (UID: \"0d9d52ff-d172-4b74-90ce-5ef0ac75662c\") " pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:35:35.174912 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:35.174819 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bq7nv\"" Apr 17 16:35:35.182512 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:35.182484 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zkmq8" Apr 17 16:35:35.300680 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:35.300656 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zkmq8"] Apr 17 16:35:35.303252 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:35:35.303210 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d9d52ff_d172_4b74_90ce_5ef0ac75662c.slice/crio-cf68b4481a5fdd7f4bc18d9b3ac627f8836c4a552d595c2a5eda40dba33eaf67 WatchSource:0}: Error finding container cf68b4481a5fdd7f4bc18d9b3ac627f8836c4a552d595c2a5eda40dba33eaf67: Status 404 returned error can't find the container with id cf68b4481a5fdd7f4bc18d9b3ac627f8836c4a552d595c2a5eda40dba33eaf67 Apr 17 16:35:35.865628 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:35.865595 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zkmq8" event={"ID":"0d9d52ff-d172-4b74-90ce-5ef0ac75662c","Type":"ContainerStarted","Data":"cf68b4481a5fdd7f4bc18d9b3ac627f8836c4a552d595c2a5eda40dba33eaf67"} Apr 17 16:35:36.869807 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:36.869764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zkmq8" event={"ID":"0d9d52ff-d172-4b74-90ce-5ef0ac75662c","Type":"ContainerStarted","Data":"29efa31662e74a218e9b3937fccfe9580bdb3918ae00d50a807a1d14675eee6a"} Apr 17 16:35:36.869807 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:36.869810 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zkmq8" event={"ID":"0d9d52ff-d172-4b74-90ce-5ef0ac75662c","Type":"ContainerStarted","Data":"bacd4e59b1fd9c6243ed3f027775664df3554d1f54569b1d32381f81f3adca00"} Apr 17 16:35:36.886739 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:35:36.886653 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-zkmq8" podStartSLOduration=252.916653736 podStartE2EDuration="4m13.886637736s" podCreationTimestamp="2026-04-17 16:31:23 +0000 UTC" firstStartedPulling="2026-04-17 16:35:35.304882087 +0000 UTC m=+252.636382969" lastFinishedPulling="2026-04-17 16:35:36.274866085 +0000 UTC m=+253.606366969" observedRunningTime="2026-04-17 16:35:36.886105474 +0000 UTC m=+254.217606377" watchObservedRunningTime="2026-04-17 16:35:36.886637736 +0000 UTC m=+254.218138670" Apr 17 16:36:23.055602 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:36:23.055571 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 16:36:23.056249 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:36:23.056224 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 16:36:23.059070 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:36:23.059051 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:37:33.181870 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.181815 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bv57b"] Apr 17 16:37:33.185105 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.185084 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.188023 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.188003 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 16:37:33.193483 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.193464 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bv57b"] Apr 17 16:37:33.260074 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.260042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4729b8b3-8325-43e0-b518-b015db422a04-original-pull-secret\") pod \"global-pull-secret-syncer-bv57b\" (UID: \"4729b8b3-8325-43e0-b518-b015db422a04\") " pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.260247 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.260111 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4729b8b3-8325-43e0-b518-b015db422a04-kubelet-config\") pod \"global-pull-secret-syncer-bv57b\" (UID: \"4729b8b3-8325-43e0-b518-b015db422a04\") " pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.260247 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.260154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4729b8b3-8325-43e0-b518-b015db422a04-dbus\") pod \"global-pull-secret-syncer-bv57b\" (UID: \"4729b8b3-8325-43e0-b518-b015db422a04\") " pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.361333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.361287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/4729b8b3-8325-43e0-b518-b015db422a04-kubelet-config\") pod \"global-pull-secret-syncer-bv57b\" (UID: \"4729b8b3-8325-43e0-b518-b015db422a04\") " pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.361333 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.361340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4729b8b3-8325-43e0-b518-b015db422a04-dbus\") pod \"global-pull-secret-syncer-bv57b\" (UID: \"4729b8b3-8325-43e0-b518-b015db422a04\") " pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.361622 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.361387 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4729b8b3-8325-43e0-b518-b015db422a04-original-pull-secret\") pod \"global-pull-secret-syncer-bv57b\" (UID: \"4729b8b3-8325-43e0-b518-b015db422a04\") " pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.361622 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.361441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4729b8b3-8325-43e0-b518-b015db422a04-kubelet-config\") pod \"global-pull-secret-syncer-bv57b\" (UID: \"4729b8b3-8325-43e0-b518-b015db422a04\") " pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.361622 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.361560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4729b8b3-8325-43e0-b518-b015db422a04-dbus\") pod \"global-pull-secret-syncer-bv57b\" (UID: \"4729b8b3-8325-43e0-b518-b015db422a04\") " pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.363707 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.363678 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4729b8b3-8325-43e0-b518-b015db422a04-original-pull-secret\") pod \"global-pull-secret-syncer-bv57b\" (UID: \"4729b8b3-8325-43e0-b518-b015db422a04\") " pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.494150 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.494113 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bv57b" Apr 17 16:37:33.614851 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.614810 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bv57b"] Apr 17 16:37:33.616601 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:37:33.616581 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4729b8b3_8325_43e0_b518_b015db422a04.slice/crio-f4c91b81437f7c4359a420f372ecdcc70f6effb88e9cb3b1e58b6841a3ef62f0 WatchSource:0}: Error finding container f4c91b81437f7c4359a420f372ecdcc70f6effb88e9cb3b1e58b6841a3ef62f0: Status 404 returned error can't find the container with id f4c91b81437f7c4359a420f372ecdcc70f6effb88e9cb3b1e58b6841a3ef62f0 Apr 17 16:37:33.618247 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:33.618233 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:37:34.178409 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:34.178363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bv57b" event={"ID":"4729b8b3-8325-43e0-b518-b015db422a04","Type":"ContainerStarted","Data":"f4c91b81437f7c4359a420f372ecdcc70f6effb88e9cb3b1e58b6841a3ef62f0"} Apr 17 16:37:38.189620 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:38.189584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bv57b" 
event={"ID":"4729b8b3-8325-43e0-b518-b015db422a04","Type":"ContainerStarted","Data":"7ce1f32d8e741b958093e99af0b104f12c2672d2cea202df82922f4ce13d7b1f"} Apr 17 16:37:38.204156 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:37:38.204077 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bv57b" podStartSLOduration=1.4991072509999999 podStartE2EDuration="5.204058457s" podCreationTimestamp="2026-04-17 16:37:33 +0000 UTC" firstStartedPulling="2026-04-17 16:37:33.618353974 +0000 UTC m=+370.949854854" lastFinishedPulling="2026-04-17 16:37:37.323305169 +0000 UTC m=+374.654806060" observedRunningTime="2026-04-17 16:37:38.203563881 +0000 UTC m=+375.535064796" watchObservedRunningTime="2026-04-17 16:37:38.204058457 +0000 UTC m=+375.535559361" Apr 17 16:38:49.394567 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.394495 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gcwst"] Apr 17 16:38:49.397417 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.397401 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:49.401533 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.401514 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 16:38:49.401837 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.401812 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 16:38:49.402062 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.402042 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 16:38:49.402144 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.402045 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 16:38:49.402144 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.402070 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-tjkqd\"" Apr 17 16:38:49.402596 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.402579 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 16:38:49.418714 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.418682 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gcwst"] Apr 17 16:38:49.513528 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.513489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:49.513705 ip-10-0-128-217 kubenswrapper[2573]: I0417 
16:38:49.513548 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/8e9bb700-c2e4-49d0-bc92-842737c8266c-cabundle0\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:49.513705 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.513582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xr45\" (UniqueName: \"kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-kube-api-access-4xr45\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:49.614032 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.614003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/8e9bb700-c2e4-49d0-bc92-842737c8266c-cabundle0\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:49.614207 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.614046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xr45\" (UniqueName: \"kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-kube-api-access-4xr45\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:49.614207 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.614104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " 
pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:49.614317 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:49.614295 2573 secret.go:281] references non-existent secret key: ca.crt Apr 17 16:38:49.614317 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:49.614314 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 16:38:49.614408 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:49.614325 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gcwst: references non-existent secret key: ca.crt Apr 17 16:38:49.614408 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:49.614396 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates podName:8e9bb700-c2e4-49d0-bc92-842737c8266c nodeName:}" failed. No retries permitted until 2026-04-17 16:38:50.114373766 +0000 UTC m=+447.445874662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates") pod "keda-operator-ffbb595cb-gcwst" (UID: "8e9bb700-c2e4-49d0-bc92-842737c8266c") : references non-existent secret key: ca.crt Apr 17 16:38:49.614746 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.614728 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/8e9bb700-c2e4-49d0-bc92-842737c8266c-cabundle0\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:49.657727 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:49.657657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xr45\" (UniqueName: \"kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-kube-api-access-4xr45\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:50.118220 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:50.118174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst" Apr 17 16:38:50.118400 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:50.118338 2573 secret.go:281] references non-existent secret key: ca.crt Apr 17 16:38:50.118400 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:50.118358 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 16:38:50.118400 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:50.118367 2573 projected.go:194] Error preparing data for projected 
volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gcwst: references non-existent secret key: ca.crt
Apr 17 16:38:50.118500 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:50.118426 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates podName:8e9bb700-c2e4-49d0-bc92-842737c8266c nodeName:}" failed. No retries permitted until 2026-04-17 16:38:51.118411296 +0000 UTC m=+448.449912175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates") pod "keda-operator-ffbb595cb-gcwst" (UID: "8e9bb700-c2e4-49d0-bc92-842737c8266c") : references non-existent secret key: ca.crt
Apr 17 16:38:51.126382 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:51.126341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst"
Apr 17 16:38:51.126764 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:51.126463 2573 secret.go:281] references non-existent secret key: ca.crt
Apr 17 16:38:51.126764 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:51.126475 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 16:38:51.126764 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:51.126483 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gcwst: references non-existent secret key: ca.crt
Apr 17 16:38:51.126764 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:38:51.126531 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates podName:8e9bb700-c2e4-49d0-bc92-842737c8266c nodeName:}" failed. No retries permitted until 2026-04-17 16:38:53.126517723 +0000 UTC m=+450.458018603 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates") pod "keda-operator-ffbb595cb-gcwst" (UID: "8e9bb700-c2e4-49d0-bc92-842737c8266c") : references non-existent secret key: ca.crt
Apr 17 16:38:53.143036 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:53.142996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst"
Apr 17 16:38:53.145287 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:53.145267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8e9bb700-c2e4-49d0-bc92-842737c8266c-certificates\") pod \"keda-operator-ffbb595cb-gcwst\" (UID: \"8e9bb700-c2e4-49d0-bc92-842737c8266c\") " pod="openshift-keda/keda-operator-ffbb595cb-gcwst"
Apr 17 16:38:53.306952 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:53.306913 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-gcwst"
Apr 17 16:38:53.426227 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:53.426156 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gcwst"]
Apr 17 16:38:53.429269 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:38:53.429241 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e9bb700_c2e4_49d0_bc92_842737c8266c.slice/crio-db381cad53438c138d72ed4426069494c4df209b8fb34abf761e4724c9b82daf WatchSource:0}: Error finding container db381cad53438c138d72ed4426069494c4df209b8fb34abf761e4724c9b82daf: Status 404 returned error can't find the container with id db381cad53438c138d72ed4426069494c4df209b8fb34abf761e4724c9b82daf
Apr 17 16:38:54.388106 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:54.388072 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-gcwst" event={"ID":"8e9bb700-c2e4-49d0-bc92-842737c8266c","Type":"ContainerStarted","Data":"db381cad53438c138d72ed4426069494c4df209b8fb34abf761e4724c9b82daf"}
Apr 17 16:38:57.397722 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:57.397682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-gcwst" event={"ID":"8e9bb700-c2e4-49d0-bc92-842737c8266c","Type":"ContainerStarted","Data":"cbd842e682723d0c49c563125cacbd93667d4d120bba62f8ac88b062d46fbf5e"}
Apr 17 16:38:57.398153 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:57.397852 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-gcwst"
Apr 17 16:38:57.416320 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:38:57.416265 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-gcwst" podStartSLOduration=4.675634935 podStartE2EDuration="8.416251907s" podCreationTimestamp="2026-04-17 16:38:49 +0000 UTC" firstStartedPulling="2026-04-17 16:38:53.430847652 +0000 UTC m=+450.762348538" lastFinishedPulling="2026-04-17 16:38:57.171464592 +0000 UTC m=+454.502965510" observedRunningTime="2026-04-17 16:38:57.414317265 +0000 UTC m=+454.745818166" watchObservedRunningTime="2026-04-17 16:38:57.416251907 +0000 UTC m=+454.747752808"
Apr 17 16:39:18.403548 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:39:18.403520 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-gcwst"
Apr 17 16:40:00.515948 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.515915 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"]
Apr 17 16:40:00.519029 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.519013 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:40:00.522658 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.522633 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-srrzf\""
Apr 17 16:40:00.522796 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.522700 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 16:40:00.522796 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.522717 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 16:40:00.522796 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.522717 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 17 16:40:00.531621 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.531596 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"]
Apr 17 16:40:00.544342 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.544310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c07f2a87-1031-4c35-890f-056f15117ced-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-6x29f\" (UID: \"c07f2a87-1031-4c35-890f-056f15117ced\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:40:00.544651 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.544626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f29vp\" (UniqueName: \"kubernetes.io/projected/c07f2a87-1031-4c35-890f-056f15117ced-kube-api-access-f29vp\") pod \"llmisvc-controller-manager-68cc5db7c4-6x29f\" (UID: \"c07f2a87-1031-4c35-890f-056f15117ced\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:40:00.554187 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.554162 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-5wr9s"]
Apr 17 16:40:00.557181 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.557163 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:00.560994 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.560759 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mdnc6\""
Apr 17 16:40:00.563979 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.563955 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 17 16:40:00.586304 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.586280 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-5wr9s"]
Apr 17 16:40:00.645144 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.645109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4xv5\" (UniqueName: \"kubernetes.io/projected/71949cb1-8366-4768-b562-c0473ec01812-kube-api-access-j4xv5\") pod \"seaweedfs-86cc847c5c-5wr9s\" (UID: \"71949cb1-8366-4768-b562-c0473ec01812\") " pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:00.645144 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.645151 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/71949cb1-8366-4768-b562-c0473ec01812-data\") pod \"seaweedfs-86cc847c5c-5wr9s\" (UID: \"71949cb1-8366-4768-b562-c0473ec01812\") " pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:00.645362 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.645183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f29vp\" (UniqueName: \"kubernetes.io/projected/c07f2a87-1031-4c35-890f-056f15117ced-kube-api-access-f29vp\") pod \"llmisvc-controller-manager-68cc5db7c4-6x29f\" (UID: \"c07f2a87-1031-4c35-890f-056f15117ced\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:40:00.645362 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.645236 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c07f2a87-1031-4c35-890f-056f15117ced-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-6x29f\" (UID: \"c07f2a87-1031-4c35-890f-056f15117ced\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:40:00.647768 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.647744 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c07f2a87-1031-4c35-890f-056f15117ced-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-6x29f\" (UID: \"c07f2a87-1031-4c35-890f-056f15117ced\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:40:00.664903 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.664875 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f29vp\" (UniqueName: \"kubernetes.io/projected/c07f2a87-1031-4c35-890f-056f15117ced-kube-api-access-f29vp\") pod \"llmisvc-controller-manager-68cc5db7c4-6x29f\" (UID: \"c07f2a87-1031-4c35-890f-056f15117ced\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:40:00.746502 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.746457 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4xv5\" (UniqueName: \"kubernetes.io/projected/71949cb1-8366-4768-b562-c0473ec01812-kube-api-access-j4xv5\") pod \"seaweedfs-86cc847c5c-5wr9s\" (UID: \"71949cb1-8366-4768-b562-c0473ec01812\") " pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:00.746502 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.746501 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/71949cb1-8366-4768-b562-c0473ec01812-data\") pod \"seaweedfs-86cc847c5c-5wr9s\" (UID: \"71949cb1-8366-4768-b562-c0473ec01812\") " pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:00.746986 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.746970 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/71949cb1-8366-4768-b562-c0473ec01812-data\") pod \"seaweedfs-86cc847c5c-5wr9s\" (UID: \"71949cb1-8366-4768-b562-c0473ec01812\") " pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:00.757562 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.757537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4xv5\" (UniqueName: \"kubernetes.io/projected/71949cb1-8366-4768-b562-c0473ec01812-kube-api-access-j4xv5\") pod \"seaweedfs-86cc847c5c-5wr9s\" (UID: \"71949cb1-8366-4768-b562-c0473ec01812\") " pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:00.829638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.829556 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:40:00.867581 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.867546 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:00.956461 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.956428 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"]
Apr 17 16:40:00.959698 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:40:00.959662 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc07f2a87_1031_4c35_890f_056f15117ced.slice/crio-be1b8dc5389b014d0bbd3a99f4a3f4d6c4b672e89817d3738c60f31ff6a1a8f1 WatchSource:0}: Error finding container be1b8dc5389b014d0bbd3a99f4a3f4d6c4b672e89817d3738c60f31ff6a1a8f1: Status 404 returned error can't find the container with id be1b8dc5389b014d0bbd3a99f4a3f4d6c4b672e89817d3738c60f31ff6a1a8f1
Apr 17 16:40:00.998367 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:00.998343 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-5wr9s"]
Apr 17 16:40:01.000756 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:40:01.000714 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71949cb1_8366_4768_b562_c0473ec01812.slice/crio-0cab4ef8571a405d8e954bd88ed3a11a287e6d239125b8dcd906016cf4e1891c WatchSource:0}: Error finding container 0cab4ef8571a405d8e954bd88ed3a11a287e6d239125b8dcd906016cf4e1891c: Status 404 returned error can't find the container with id 0cab4ef8571a405d8e954bd88ed3a11a287e6d239125b8dcd906016cf4e1891c
Apr 17 16:40:01.569490 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:01.569450 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-5wr9s" event={"ID":"71949cb1-8366-4768-b562-c0473ec01812","Type":"ContainerStarted","Data":"0cab4ef8571a405d8e954bd88ed3a11a287e6d239125b8dcd906016cf4e1891c"}
Apr 17 16:40:01.570685 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:01.570657 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f" event={"ID":"c07f2a87-1031-4c35-890f-056f15117ced","Type":"ContainerStarted","Data":"be1b8dc5389b014d0bbd3a99f4a3f4d6c4b672e89817d3738c60f31ff6a1a8f1"}
Apr 17 16:40:03.577779 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:03.577745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f" event={"ID":"c07f2a87-1031-4c35-890f-056f15117ced","Type":"ContainerStarted","Data":"d45de33153f841f8104f2b0767bbdd929f6f4b53a293936d79d1c8ef9842f0d3"}
Apr 17 16:40:03.578213 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:03.577861 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:40:03.594655 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:03.594603 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f" podStartSLOduration=1.371225861 podStartE2EDuration="3.594588364s" podCreationTimestamp="2026-04-17 16:40:00 +0000 UTC" firstStartedPulling="2026-04-17 16:40:00.961187698 +0000 UTC m=+518.292688578" lastFinishedPulling="2026-04-17 16:40:03.184550189 +0000 UTC m=+520.516051081" observedRunningTime="2026-04-17 16:40:03.59373759 +0000 UTC m=+520.925238496" watchObservedRunningTime="2026-04-17 16:40:03.594588364 +0000 UTC m=+520.926089301"
Apr 17 16:40:05.587336 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:05.587290 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-5wr9s" event={"ID":"71949cb1-8366-4768-b562-c0473ec01812","Type":"ContainerStarted","Data":"83cc85abd2f99186aa634a7b24d830a781c9ae94b4c0d919620f2e61925ba9bc"}
Apr 17 16:40:05.587788 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:05.587446 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:05.605583 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:05.605498 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-5wr9s" podStartSLOduration=2.056954586 podStartE2EDuration="5.60548254s" podCreationTimestamp="2026-04-17 16:40:00 +0000 UTC" firstStartedPulling="2026-04-17 16:40:01.002080982 +0000 UTC m=+518.333581863" lastFinishedPulling="2026-04-17 16:40:04.550608923 +0000 UTC m=+521.882109817" observedRunningTime="2026-04-17 16:40:05.604258677 +0000 UTC m=+522.935759579" watchObservedRunningTime="2026-04-17 16:40:05.60548254 +0000 UTC m=+522.936983442"
Apr 17 16:40:11.592426 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:11.592351 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-5wr9s"
Apr 17 16:40:34.585245 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:40:34.585212 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6x29f"
Apr 17 16:41:23.077370 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:23.077336 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:41:23.077938 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:23.077897 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:41:44.507516 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.507437 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"]
Apr 17 16:41:44.510306 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.510291 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.512921 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.512903 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 17 16:41:44.513033 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.512970 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 17 16:41:44.518517 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.518495 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"]
Apr 17 16:41:44.647608 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.647575 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6e391735-dd05-49a1-9116-d67840d417a9-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-kq5nt\" (UID: \"6e391735-dd05-49a1-9116-d67840d417a9\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.647788 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.647626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/6e391735-dd05-49a1-9116-d67840d417a9-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-kq5nt\" (UID: \"6e391735-dd05-49a1-9116-d67840d417a9\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.647788 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.647704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmhl\" (UniqueName: \"kubernetes.io/projected/6e391735-dd05-49a1-9116-d67840d417a9-kube-api-access-snmhl\") pod \"seaweedfs-tls-custom-5c88b85bb7-kq5nt\" (UID: \"6e391735-dd05-49a1-9116-d67840d417a9\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.748936 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.748903 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snmhl\" (UniqueName: \"kubernetes.io/projected/6e391735-dd05-49a1-9116-d67840d417a9-kube-api-access-snmhl\") pod \"seaweedfs-tls-custom-5c88b85bb7-kq5nt\" (UID: \"6e391735-dd05-49a1-9116-d67840d417a9\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.749118 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.748950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6e391735-dd05-49a1-9116-d67840d417a9-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-kq5nt\" (UID: \"6e391735-dd05-49a1-9116-d67840d417a9\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.749118 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.748987 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/6e391735-dd05-49a1-9116-d67840d417a9-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-kq5nt\" (UID: \"6e391735-dd05-49a1-9116-d67840d417a9\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.749383 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.749352 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6e391735-dd05-49a1-9116-d67840d417a9-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-kq5nt\" (UID: \"6e391735-dd05-49a1-9116-d67840d417a9\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.751439 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.751422 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/6e391735-dd05-49a1-9116-d67840d417a9-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-kq5nt\" (UID: \"6e391735-dd05-49a1-9116-d67840d417a9\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.759003 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.758948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmhl\" (UniqueName: \"kubernetes.io/projected/6e391735-dd05-49a1-9116-d67840d417a9-kube-api-access-snmhl\") pod \"seaweedfs-tls-custom-5c88b85bb7-kq5nt\" (UID: \"6e391735-dd05-49a1-9116-d67840d417a9\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.820111 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.820075 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"
Apr 17 16:41:44.937018 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:44.936992 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt"]
Apr 17 16:41:44.939289 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:41:44.939263 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e391735_dd05_49a1_9116_d67840d417a9.slice/crio-91830c22e173385eae607e9180044fc4cf7f60e0b032c1d6e0c9a42e71145cb6 WatchSource:0}: Error finding container 91830c22e173385eae607e9180044fc4cf7f60e0b032c1d6e0c9a42e71145cb6: Status 404 returned error can't find the container with id 91830c22e173385eae607e9180044fc4cf7f60e0b032c1d6e0c9a42e71145cb6
Apr 17 16:41:45.848638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:45.848601 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt" event={"ID":"6e391735-dd05-49a1-9116-d67840d417a9","Type":"ContainerStarted","Data":"f46feec2bf83e43e3dc1fda13835b18a79d4f152fe27f64c7f2e47f0689b5326"}
Apr 17 16:41:45.848638 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:45.848642 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt" event={"ID":"6e391735-dd05-49a1-9116-d67840d417a9","Type":"ContainerStarted","Data":"91830c22e173385eae607e9180044fc4cf7f60e0b032c1d6e0c9a42e71145cb6"}
Apr 17 16:41:45.864412 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:41:45.864369 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-kq5nt" podStartSLOduration=1.5855922690000002 podStartE2EDuration="1.864354968s" podCreationTimestamp="2026-04-17 16:41:44 +0000 UTC" firstStartedPulling="2026-04-17 16:41:44.940898193 +0000 UTC m=+622.272399073" lastFinishedPulling="2026-04-17 16:41:45.219660876 +0000 UTC m=+622.551161772" observedRunningTime="2026-04-17 16:41:45.863605316 +0000 UTC m=+623.195106219" watchObservedRunningTime="2026-04-17 16:41:45.864354968 +0000 UTC m=+623.195855869"
Apr 17 16:46:23.098924 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:46:23.098890 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:46:23.099612 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:46:23.099595 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:51:23.118519 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:51:23.118491 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:51:23.119103 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:51:23.118693 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:55:51.269378 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.269344 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"]
Apr 17 16:55:51.272684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.272668 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.274964 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.274942 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\""
Apr 17 16:55:51.275077 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.274973 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzvpv\""
Apr 17 16:55:51.275772 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.275752 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\""
Apr 17 16:55:51.275772 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.275762 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 16:55:51.275925 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.275820 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 16:55:51.284530 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.284508 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"]
Apr 17 16:55:51.322968 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.322938 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f057762-049b-4b81-94d7-6b41acb51d00-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.323121 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.322979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f057762-049b-4b81-94d7-6b41acb51d00-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.323121 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.323019 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f057762-049b-4b81-94d7-6b41acb51d00-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.323121 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.323079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szjxw\" (UniqueName: \"kubernetes.io/projected/8f057762-049b-4b81-94d7-6b41acb51d00-kube-api-access-szjxw\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.423797 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.423765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f057762-049b-4b81-94d7-6b41acb51d00-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.423797 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.423801 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szjxw\" (UniqueName: \"kubernetes.io/projected/8f057762-049b-4b81-94d7-6b41acb51d00-kube-api-access-szjxw\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.424039 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.423874 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f057762-049b-4b81-94d7-6b41acb51d00-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.424039 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.423901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f057762-049b-4b81-94d7-6b41acb51d00-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.424039 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:55:51.423942 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-predictor-serving-cert: secret "isvc-paddle-predictor-serving-cert" not found
Apr 17 16:55:51.424039 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:55:51.424023 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f057762-049b-4b81-94d7-6b41acb51d00-proxy-tls podName:8f057762-049b-4b81-94d7-6b41acb51d00 nodeName:}" failed. No retries permitted until 2026-04-17 16:55:51.924000951 +0000 UTC m=+1469.255501854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8f057762-049b-4b81-94d7-6b41acb51d00-proxy-tls") pod "isvc-paddle-predictor-6b8b7cfb4b-2bnlb" (UID: "8f057762-049b-4b81-94d7-6b41acb51d00") : secret "isvc-paddle-predictor-serving-cert" not found
Apr 17 16:55:51.424332 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.424309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f057762-049b-4b81-94d7-6b41acb51d00-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.424563 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.424546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f057762-049b-4b81-94d7-6b41acb51d00-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.434663 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.434637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szjxw\" (UniqueName: \"kubernetes.io/projected/8f057762-049b-4b81-94d7-6b41acb51d00-kube-api-access-szjxw\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.929149 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.929108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f057762-049b-4b81-94d7-6b41acb51d00-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:51.931629 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:51.931598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f057762-049b-4b81-94d7-6b41acb51d00-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-2bnlb\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:52.183527 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:52.183442 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:55:52.301782 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:52.301757 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"]
Apr 17 16:55:52.304288 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:55:52.304257 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f057762_049b_4b81_94d7_6b41acb51d00.slice/crio-924e2cd6ed8e6779a35474c1701e496781bd000465996b80ec17cf03973340f2 WatchSource:0}: Error finding container 924e2cd6ed8e6779a35474c1701e496781bd000465996b80ec17cf03973340f2: Status 404 returned error can't find the container with id 924e2cd6ed8e6779a35474c1701e496781bd000465996b80ec17cf03973340f2
Apr 17 16:55:52.306029 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:52.306015 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:55:53.143720 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:53.143682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" event={"ID":"8f057762-049b-4b81-94d7-6b41acb51d00","Type":"ContainerStarted","Data":"924e2cd6ed8e6779a35474c1701e496781bd000465996b80ec17cf03973340f2"}
Apr 17 16:55:56.153975 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:55:56.153885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" event={"ID":"8f057762-049b-4b81-94d7-6b41acb51d00","Type":"ContainerStarted","Data":"d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05"}
Apr 17 16:56:01.169207 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:01.169173 2573 generic.go:358] "Generic (PLEG): container finished" podID="8f057762-049b-4b81-94d7-6b41acb51d00" containerID="d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05" exitCode=0
Apr 17 16:56:01.169607 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:01.169212 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" event={"ID":"8f057762-049b-4b81-94d7-6b41acb51d00","Type":"ContainerDied","Data":"d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05"}
Apr 17 16:56:13.213562 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:13.213518 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" event={"ID":"8f057762-049b-4b81-94d7-6b41acb51d00","Type":"ContainerStarted","Data":"90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9"}
Apr 17 16:56:16.224259 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:16.224224 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" event={"ID":"8f057762-049b-4b81-94d7-6b41acb51d00","Type":"ContainerStarted","Data":"cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb"}
Apr 17 16:56:16.224704 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:16.224491 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:56:16.246690 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:16.246625 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podStartSLOduration=2.099106313 podStartE2EDuration="25.246606175s" podCreationTimestamp="2026-04-17 16:55:51 +0000 UTC" firstStartedPulling="2026-04-17 16:55:52.306140398 +0000 UTC m=+1469.637641278" lastFinishedPulling="2026-04-17 16:56:15.453640256 +0000 UTC m=+1492.785141140" observedRunningTime="2026-04-17 16:56:16.243601528 +0000 UTC m=+1493.575102436" watchObservedRunningTime="2026-04-17 16:56:16.246606175 +0000 UTC m=+1493.578107079"
Apr 17 16:56:17.226880 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:17.226839 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"
Apr 17 16:56:17.228143 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:17.228112 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused"
Apr 17 16:56:18.230214 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:18.230168 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused"
Apr 17 16:56:23.140238 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:23.140208 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 16:56:23.140818 ip-10-0-128-217 kubenswrapper[2573]:
I0417 16:56:23.140290 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 16:56:23.236264 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:23.236237 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" Apr 17 16:56:23.236915 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:23.236885 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 17 16:56:33.237703 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:33.237663 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 17 16:56:43.237414 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:43.237323 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 17 16:56:53.237881 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:56:53.237818 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 17 16:57:03.237013 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:03.236974 2573 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" Apr 17 16:57:12.793398 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:12.793364 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"] Apr 17 16:57:12.793817 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:12.793659 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" containerID="cri-o://90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9" gracePeriod=30 Apr 17 16:57:12.793817 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:12.793729 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kube-rbac-proxy" containerID="cri-o://cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb" gracePeriod=30 Apr 17 16:57:13.231079 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:13.231037 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.18:8643/healthz\": dial tcp 10.134.0.18:8643: connect: connection refused" Apr 17 16:57:13.237505 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:13.237479 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 17 16:57:13.387510 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:13.387477 2573 generic.go:358] "Generic 
(PLEG): container finished" podID="8f057762-049b-4b81-94d7-6b41acb51d00" containerID="cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb" exitCode=2 Apr 17 16:57:13.387684 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:13.387518 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" event={"ID":"8f057762-049b-4b81-94d7-6b41acb51d00","Type":"ContainerDied","Data":"cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb"} Apr 17 16:57:15.533997 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.533969 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" Apr 17 16:57:15.618741 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.618663 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f057762-049b-4b81-94d7-6b41acb51d00-kserve-provision-location\") pod \"8f057762-049b-4b81-94d7-6b41acb51d00\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " Apr 17 16:57:15.618741 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.618720 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f057762-049b-4b81-94d7-6b41acb51d00-proxy-tls\") pod \"8f057762-049b-4b81-94d7-6b41acb51d00\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " Apr 17 16:57:15.618741 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.618741 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szjxw\" (UniqueName: \"kubernetes.io/projected/8f057762-049b-4b81-94d7-6b41acb51d00-kube-api-access-szjxw\") pod \"8f057762-049b-4b81-94d7-6b41acb51d00\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " Apr 17 16:57:15.618986 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.618778 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f057762-049b-4b81-94d7-6b41acb51d00-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"8f057762-049b-4b81-94d7-6b41acb51d00\" (UID: \"8f057762-049b-4b81-94d7-6b41acb51d00\") " Apr 17 16:57:15.619207 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.619180 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f057762-049b-4b81-94d7-6b41acb51d00-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "8f057762-049b-4b81-94d7-6b41acb51d00" (UID: "8f057762-049b-4b81-94d7-6b41acb51d00"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:57:15.621048 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.621019 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f057762-049b-4b81-94d7-6b41acb51d00-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8f057762-049b-4b81-94d7-6b41acb51d00" (UID: "8f057762-049b-4b81-94d7-6b41acb51d00"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:57:15.621161 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.621073 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f057762-049b-4b81-94d7-6b41acb51d00-kube-api-access-szjxw" (OuterVolumeSpecName: "kube-api-access-szjxw") pod "8f057762-049b-4b81-94d7-6b41acb51d00" (UID: "8f057762-049b-4b81-94d7-6b41acb51d00"). InnerVolumeSpecName "kube-api-access-szjxw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:57:15.626519 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.626495 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f057762-049b-4b81-94d7-6b41acb51d00-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f057762-049b-4b81-94d7-6b41acb51d00" (UID: "8f057762-049b-4b81-94d7-6b41acb51d00"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:57:15.719649 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.719613 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f057762-049b-4b81-94d7-6b41acb51d00-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 16:57:15.719649 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.719643 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f057762-049b-4b81-94d7-6b41acb51d00-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 16:57:15.719649 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.719653 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-szjxw\" (UniqueName: \"kubernetes.io/projected/8f057762-049b-4b81-94d7-6b41acb51d00-kube-api-access-szjxw\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 16:57:15.719949 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:15.719664 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f057762-049b-4b81-94d7-6b41acb51d00-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 16:57:16.398410 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.398376 2573 generic.go:358] "Generic (PLEG): container 
finished" podID="8f057762-049b-4b81-94d7-6b41acb51d00" containerID="90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9" exitCode=0 Apr 17 16:57:16.398587 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.398435 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" event={"ID":"8f057762-049b-4b81-94d7-6b41acb51d00","Type":"ContainerDied","Data":"90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9"} Apr 17 16:57:16.398587 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.398449 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" Apr 17 16:57:16.398587 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.398470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb" event={"ID":"8f057762-049b-4b81-94d7-6b41acb51d00","Type":"ContainerDied","Data":"924e2cd6ed8e6779a35474c1701e496781bd000465996b80ec17cf03973340f2"} Apr 17 16:57:16.398587 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.398487 2573 scope.go:117] "RemoveContainer" containerID="cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb" Apr 17 16:57:16.406710 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.406444 2573 scope.go:117] "RemoveContainer" containerID="90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9" Apr 17 16:57:16.413661 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.413644 2573 scope.go:117] "RemoveContainer" containerID="d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05" Apr 17 16:57:16.420487 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.420463 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"] Apr 17 16:57:16.421237 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.421223 2573 scope.go:117] "RemoveContainer" 
containerID="cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb" Apr 17 16:57:16.421498 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:57:16.421478 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb\": container with ID starting with cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb not found: ID does not exist" containerID="cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb" Apr 17 16:57:16.421542 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.421508 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb"} err="failed to get container status \"cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb\": rpc error: code = NotFound desc = could not find container \"cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb\": container with ID starting with cf768e7c423268d7c10b56211ed4a540d3aab14e8a25e69cf21637e920ee5cfb not found: ID does not exist" Apr 17 16:57:16.421542 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.421525 2573 scope.go:117] "RemoveContainer" containerID="90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9" Apr 17 16:57:16.421776 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:57:16.421758 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9\": container with ID starting with 90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9 not found: ID does not exist" containerID="90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9" Apr 17 16:57:16.421875 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.421784 2573 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9"} err="failed to get container status \"90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9\": rpc error: code = NotFound desc = could not find container \"90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9\": container with ID starting with 90016fd79fdcdcd87b7dfd8b91b93d6277fc258d6d5e50fa79e1194f49a46aa9 not found: ID does not exist" Apr 17 16:57:16.421875 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.421806 2573 scope.go:117] "RemoveContainer" containerID="d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05" Apr 17 16:57:16.422063 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:57:16.422048 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05\": container with ID starting with d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05 not found: ID does not exist" containerID="d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05" Apr 17 16:57:16.422104 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.422067 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05"} err="failed to get container status \"d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05\": rpc error: code = NotFound desc = could not find container \"d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05\": container with ID starting with d7d256570b6f511216bcd0a20b5ee395b9127a927aa5a18c0ccfddac576cfe05 not found: ID does not exist" Apr 17 16:57:16.425667 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:16.425648 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-2bnlb"] Apr 17 16:57:17.175111 
ip-10-0-128-217 kubenswrapper[2573]: I0417 16:57:17.175036 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" path="/var/lib/kubelet/pods/8f057762-049b-4b81-94d7-6b41acb51d00/volumes" Apr 17 16:58:24.480278 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.480240 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn"] Apr 17 16:58:24.480765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.480560 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kube-rbac-proxy" Apr 17 16:58:24.480765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.480571 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kube-rbac-proxy" Apr 17 16:58:24.480765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.480579 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" Apr 17 16:58:24.480765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.480585 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" Apr 17 16:58:24.480765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.480594 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="storage-initializer" Apr 17 16:58:24.480765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.480604 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="storage-initializer" Apr 17 16:58:24.480765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.480660 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kube-rbac-proxy" Apr 17 16:58:24.480765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.480668 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f057762-049b-4b81-94d7-6b41acb51d00" containerName="kserve-container" Apr 17 16:58:24.483694 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.483677 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.486851 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.486817 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 17 16:58:24.487326 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.487312 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:58:24.487417 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.487338 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:58:24.487417 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.487382 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 17 16:58:24.487417 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.487408 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzvpv\"" Apr 17 16:58:24.494553 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.494529 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn"] Apr 17 16:58:24.538294 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.538265 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87075b3b-926c-4402-a018-40390735a384-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.538294 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.538298 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29qlm\" (UniqueName: \"kubernetes.io/projected/87075b3b-926c-4402-a018-40390735a384-kube-api-access-29qlm\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.538506 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.538324 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87075b3b-926c-4402-a018-40390735a384-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.538506 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.538394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87075b3b-926c-4402-a018-40390735a384-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.638983 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.638949 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87075b3b-926c-4402-a018-40390735a384-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.639162 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.639002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87075b3b-926c-4402-a018-40390735a384-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.639162 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.639021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29qlm\" (UniqueName: \"kubernetes.io/projected/87075b3b-926c-4402-a018-40390735a384-kube-api-access-29qlm\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.639162 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.639049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87075b3b-926c-4402-a018-40390735a384-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.639315 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:58:24.639176 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-serving-cert: secret 
"isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 17 16:58:24.639315 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:58:24.639252 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87075b3b-926c-4402-a018-40390735a384-proxy-tls podName:87075b3b-926c-4402-a018-40390735a384 nodeName:}" failed. No retries permitted until 2026-04-17 16:58:25.139231032 +0000 UTC m=+1622.470731916 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/87075b3b-926c-4402-a018-40390735a384-proxy-tls") pod "isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" (UID: "87075b3b-926c-4402-a018-40390735a384") : secret "isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 17 16:58:24.639613 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.639591 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87075b3b-926c-4402-a018-40390735a384-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.639661 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.639644 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87075b3b-926c-4402-a018-40390735a384-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:24.647490 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:24.647469 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29qlm\" (UniqueName: 
\"kubernetes.io/projected/87075b3b-926c-4402-a018-40390735a384-kube-api-access-29qlm\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:25.142466 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:25.142413 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87075b3b-926c-4402-a018-40390735a384-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:25.144913 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:25.144891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87075b3b-926c-4402-a018-40390735a384-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:25.394851 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:25.394732 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:25.517976 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:25.517954 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn"] Apr 17 16:58:25.518607 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:58:25.518574 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87075b3b_926c_4402_a018_40390735a384.slice/crio-f4d01fdaaaf8e3d882bf2ba7a877615200f8269148a34a0a56fab973633f2caf WatchSource:0}: Error finding container f4d01fdaaaf8e3d882bf2ba7a877615200f8269148a34a0a56fab973633f2caf: Status 404 returned error can't find the container with id f4d01fdaaaf8e3d882bf2ba7a877615200f8269148a34a0a56fab973633f2caf Apr 17 16:58:25.592876 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:25.592843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" event={"ID":"87075b3b-926c-4402-a018-40390735a384","Type":"ContainerStarted","Data":"66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b"} Apr 17 16:58:25.592876 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:25.592880 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" event={"ID":"87075b3b-926c-4402-a018-40390735a384","Type":"ContainerStarted","Data":"f4d01fdaaaf8e3d882bf2ba7a877615200f8269148a34a0a56fab973633f2caf"} Apr 17 16:58:30.609343 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:30.609307 2573 generic.go:358] "Generic (PLEG): container finished" podID="87075b3b-926c-4402-a018-40390735a384" containerID="66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b" exitCode=0 Apr 17 16:58:30.609765 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:30.609382 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" event={"ID":"87075b3b-926c-4402-a018-40390735a384","Type":"ContainerDied","Data":"66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b"} Apr 17 16:58:31.614138 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:31.614101 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" event={"ID":"87075b3b-926c-4402-a018-40390735a384","Type":"ContainerStarted","Data":"705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c"} Apr 17 16:58:31.614138 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:31.614141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" event={"ID":"87075b3b-926c-4402-a018-40390735a384","Type":"ContainerStarted","Data":"c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca"} Apr 17 16:58:31.614615 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:31.614330 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:31.635761 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:31.635709 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podStartSLOduration=7.635693227 podStartE2EDuration="7.635693227s" podCreationTimestamp="2026-04-17 16:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:58:31.634488144 +0000 UTC m=+1628.965989058" watchObservedRunningTime="2026-04-17 16:58:31.635693227 +0000 UTC m=+1628.967194128" Apr 17 16:58:32.617069 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:32.617039 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:32.618172 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:32.618148 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 17 16:58:33.620043 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:33.619998 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 17 16:58:38.624899 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:38.624867 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:58:38.625451 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:38.625424 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 17 16:58:48.625664 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:48.625617 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 17 16:58:58.628594 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:58:58.628551 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 17 16:59:08.625353 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:08.625310 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 17 16:59:18.626558 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:18.626526 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:59:26.367737 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.367701 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz"] Apr 17 16:59:26.371677 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.371653 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.375252 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.375226 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 17 16:59:26.375387 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.375299 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 17 16:59:26.383581 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.383558 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz"] Apr 17 16:59:26.408324 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.408292 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn"] Apr 17 16:59:26.408614 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.408588 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" containerID="cri-o://c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca" gracePeriod=30 Apr 17 16:59:26.408676 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.408628 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kube-rbac-proxy" containerID="cri-o://705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c" gracePeriod=30 Apr 17 16:59:26.426000 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.425970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fw8m\" (UniqueName: 
\"kubernetes.io/projected/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kube-api-access-9fw8m\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.426122 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.426020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.426122 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.426041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf14eeb7-1c92-49a5-afde-dab65331e5d2-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.426122 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.426060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf14eeb7-1c92-49a5-afde-dab65331e5d2-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.526719 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.526693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fw8m\" (UniqueName: \"kubernetes.io/projected/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kube-api-access-9fw8m\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: 
\"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.526810 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.526736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.526810 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.526757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf14eeb7-1c92-49a5-afde-dab65331e5d2-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.526810 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.526781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf14eeb7-1c92-49a5-afde-dab65331e5d2-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.526998 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:59:26.526926 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-predictor-serving-cert: secret "isvc-pmml-predictor-serving-cert" not found Apr 17 16:59:26.527042 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:59:26.527016 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf14eeb7-1c92-49a5-afde-dab65331e5d2-proxy-tls podName:cf14eeb7-1c92-49a5-afde-dab65331e5d2 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:59:27.026995336 +0000 UTC m=+1684.358496228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cf14eeb7-1c92-49a5-afde-dab65331e5d2-proxy-tls") pod "isvc-pmml-predictor-8bb578669-rztqz" (UID: "cf14eeb7-1c92-49a5-afde-dab65331e5d2") : secret "isvc-pmml-predictor-serving-cert" not found Apr 17 16:59:26.527110 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.527086 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.527412 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.527396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf14eeb7-1c92-49a5-afde-dab65331e5d2-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.536061 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.536033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fw8m\" (UniqueName: \"kubernetes.io/projected/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kube-api-access-9fw8m\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:26.772880 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.772847 2573 generic.go:358] "Generic (PLEG): container finished" podID="87075b3b-926c-4402-a018-40390735a384" 
containerID="705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c" exitCode=2 Apr 17 16:59:26.773059 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:26.772914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" event={"ID":"87075b3b-926c-4402-a018-40390735a384","Type":"ContainerDied","Data":"705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c"} Apr 17 16:59:27.032284 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:27.032185 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf14eeb7-1c92-49a5-afde-dab65331e5d2-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:27.034612 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:27.034590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf14eeb7-1c92-49a5-afde-dab65331e5d2-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-rztqz\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:27.282249 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:27.282212 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:27.402655 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:27.402502 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz"] Apr 17 16:59:27.405112 ip-10-0-128-217 kubenswrapper[2573]: W0417 16:59:27.405085 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf14eeb7_1c92_49a5_afde_dab65331e5d2.slice/crio-e6005816fc1a8c5c3c45b224213cff4153599dc4c2e078d32b3eb1b10aefad4d WatchSource:0}: Error finding container e6005816fc1a8c5c3c45b224213cff4153599dc4c2e078d32b3eb1b10aefad4d: Status 404 returned error can't find the container with id e6005816fc1a8c5c3c45b224213cff4153599dc4c2e078d32b3eb1b10aefad4d Apr 17 16:59:27.777202 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:27.777165 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" event={"ID":"cf14eeb7-1c92-49a5-afde-dab65331e5d2","Type":"ContainerStarted","Data":"99f4fe173edb8107c36d963148f61092e39eda466adc525d1f4732ae154d8dd9"} Apr 17 16:59:27.777202 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:27.777208 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" event={"ID":"cf14eeb7-1c92-49a5-afde-dab65331e5d2","Type":"ContainerStarted","Data":"e6005816fc1a8c5c3c45b224213cff4153599dc4c2e078d32b3eb1b10aefad4d"} Apr 17 16:59:28.620706 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:28.620670 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.19:8643/healthz\": dial tcp 10.134.0.19:8643: connect: connection refused" Apr 17 16:59:28.625625 ip-10-0-128-217 
kubenswrapper[2573]: I0417 16:59:28.625599 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 17 16:59:29.251612 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.251591 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:59:29.351607 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.351510 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87075b3b-926c-4402-a018-40390735a384-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"87075b3b-926c-4402-a018-40390735a384\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " Apr 17 16:59:29.351607 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.351566 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87075b3b-926c-4402-a018-40390735a384-proxy-tls\") pod \"87075b3b-926c-4402-a018-40390735a384\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " Apr 17 16:59:29.351607 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.351599 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87075b3b-926c-4402-a018-40390735a384-kserve-provision-location\") pod \"87075b3b-926c-4402-a018-40390735a384\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " Apr 17 16:59:29.351922 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.351701 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29qlm\" (UniqueName: 
\"kubernetes.io/projected/87075b3b-926c-4402-a018-40390735a384-kube-api-access-29qlm\") pod \"87075b3b-926c-4402-a018-40390735a384\" (UID: \"87075b3b-926c-4402-a018-40390735a384\") " Apr 17 16:59:29.351985 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.351946 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87075b3b-926c-4402-a018-40390735a384-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "87075b3b-926c-4402-a018-40390735a384" (UID: "87075b3b-926c-4402-a018-40390735a384"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:59:29.353717 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.353682 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87075b3b-926c-4402-a018-40390735a384-kube-api-access-29qlm" (OuterVolumeSpecName: "kube-api-access-29qlm") pod "87075b3b-926c-4402-a018-40390735a384" (UID: "87075b3b-926c-4402-a018-40390735a384"). InnerVolumeSpecName "kube-api-access-29qlm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:59:29.353717 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.353705 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87075b3b-926c-4402-a018-40390735a384-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "87075b3b-926c-4402-a018-40390735a384" (UID: "87075b3b-926c-4402-a018-40390735a384"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:59:29.360832 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.360796 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87075b3b-926c-4402-a018-40390735a384-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87075b3b-926c-4402-a018-40390735a384" (UID: "87075b3b-926c-4402-a018-40390735a384"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:59:29.453162 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.453126 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87075b3b-926c-4402-a018-40390735a384-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 16:59:29.453162 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.453156 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29qlm\" (UniqueName: \"kubernetes.io/projected/87075b3b-926c-4402-a018-40390735a384-kube-api-access-29qlm\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 16:59:29.453162 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.453168 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87075b3b-926c-4402-a018-40390735a384-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 16:59:29.453401 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.453180 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87075b3b-926c-4402-a018-40390735a384-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 16:59:29.783873 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.783816 2573 generic.go:358] "Generic 
(PLEG): container finished" podID="87075b3b-926c-4402-a018-40390735a384" containerID="c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca" exitCode=0 Apr 17 16:59:29.783873 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.783862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" event={"ID":"87075b3b-926c-4402-a018-40390735a384","Type":"ContainerDied","Data":"c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca"} Apr 17 16:59:29.784364 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.783916 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" event={"ID":"87075b3b-926c-4402-a018-40390735a384","Type":"ContainerDied","Data":"f4d01fdaaaf8e3d882bf2ba7a877615200f8269148a34a0a56fab973633f2caf"} Apr 17 16:59:29.784364 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.783921 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn" Apr 17 16:59:29.784364 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.783937 2573 scope.go:117] "RemoveContainer" containerID="705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c" Apr 17 16:59:29.792623 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.792603 2573 scope.go:117] "RemoveContainer" containerID="c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca" Apr 17 16:59:29.799632 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.799612 2573 scope.go:117] "RemoveContainer" containerID="66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b" Apr 17 16:59:29.806005 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.805981 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn"] Apr 17 16:59:29.806857 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.806808 2573 scope.go:117] "RemoveContainer" containerID="705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c" Apr 17 16:59:29.807123 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:59:29.807105 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c\": container with ID starting with 705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c not found: ID does not exist" containerID="705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c" Apr 17 16:59:29.807189 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.807131 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c"} err="failed to get container status \"705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c\": rpc error: code = NotFound desc = could not find container 
\"705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c\": container with ID starting with 705b01d57426c4e5b2cc0a5d7718e2d780cd7fde4f07191b16977de56fb6a49c not found: ID does not exist" Apr 17 16:59:29.807189 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.807154 2573 scope.go:117] "RemoveContainer" containerID="c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca" Apr 17 16:59:29.807403 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:59:29.807380 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca\": container with ID starting with c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca not found: ID does not exist" containerID="c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca" Apr 17 16:59:29.807494 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.807411 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca"} err="failed to get container status \"c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca\": rpc error: code = NotFound desc = could not find container \"c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca\": container with ID starting with c800ec04ffb374b5495ac7bcbad6c60fdd4061a9bf45045d3b95ab7154515cca not found: ID does not exist" Apr 17 16:59:29.807494 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.807427 2573 scope.go:117] "RemoveContainer" containerID="66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b" Apr 17 16:59:29.807652 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:59:29.807634 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b\": container with ID starting with 
66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b not found: ID does not exist" containerID="66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b" Apr 17 16:59:29.807693 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.807655 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b"} err="failed to get container status \"66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b\": rpc error: code = NotFound desc = could not find container \"66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b\": container with ID starting with 66382733977d227e6824d31a3d19413efd3d9b882392520febe06f353871fb7b not found: ID does not exist" Apr 17 16:59:29.810913 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:29.810891 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-lvxwn"] Apr 17 16:59:31.175202 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:31.175169 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87075b3b-926c-4402-a018-40390735a384" path="/var/lib/kubelet/pods/87075b3b-926c-4402-a018-40390735a384/volumes" Apr 17 16:59:31.386841 ip-10-0-128-217 kubenswrapper[2573]: E0417 16:59:31.386792 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf14eeb7_1c92_49a5_afde_dab65331e5d2.slice/crio-conmon-99f4fe173edb8107c36d963148f61092e39eda466adc525d1f4732ae154d8dd9.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:59:31.791519 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:31.791480 2573 generic.go:358] "Generic (PLEG): container finished" podID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerID="99f4fe173edb8107c36d963148f61092e39eda466adc525d1f4732ae154d8dd9" exitCode=0 Apr 17 
16:59:31.791671 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:31.791555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" event={"ID":"cf14eeb7-1c92-49a5-afde-dab65331e5d2","Type":"ContainerDied","Data":"99f4fe173edb8107c36d963148f61092e39eda466adc525d1f4732ae154d8dd9"} Apr 17 16:59:38.817430 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:38.817389 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" event={"ID":"cf14eeb7-1c92-49a5-afde-dab65331e5d2","Type":"ContainerStarted","Data":"8163c9a807326e205e8d6168d3df11550d112b99a1c2515268881c86026ba5a7"} Apr 17 16:59:38.817430 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:38.817436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" event={"ID":"cf14eeb7-1c92-49a5-afde-dab65331e5d2","Type":"ContainerStarted","Data":"bf239e83a457d8d909d100a91fc2b8c8db4169f3c39d454009733c36ce4a78fc"} Apr 17 16:59:38.817991 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:38.817669 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:38.837512 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:38.837404 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podStartSLOduration=6.047141168 podStartE2EDuration="12.837386883s" podCreationTimestamp="2026-04-17 16:59:26 +0000 UTC" firstStartedPulling="2026-04-17 16:59:31.792710912 +0000 UTC m=+1689.124211795" lastFinishedPulling="2026-04-17 16:59:38.58295663 +0000 UTC m=+1695.914457510" observedRunningTime="2026-04-17 16:59:38.835543593 +0000 UTC m=+1696.167044495" watchObservedRunningTime="2026-04-17 16:59:38.837386883 +0000 UTC m=+1696.168887785" Apr 17 16:59:39.820218 ip-10-0-128-217 kubenswrapper[2573]: I0417 
16:59:39.820188 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:39.821344 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:39.821320 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:59:40.822903 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:40.822859 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:59:45.826916 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:45.826883 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 16:59:45.827478 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:45.827453 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:59:55.827890 ip-10-0-128-217 kubenswrapper[2573]: I0417 16:59:55.827852 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 17:00:05.827543 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:00:05.827506 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 17:00:15.827448 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:00:15.827407 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 17:00:25.827609 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:00:25.827562 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 17:00:35.827698 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:00:35.827652 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 17:00:45.827681 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:00:45.827640 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 17:00:54.171446 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:00:54.171400 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.134.0.20:8080: connect: connection refused" Apr 17 17:01:04.171925 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:04.171889 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 17:01:07.510922 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.510840 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz"] Apr 17 17:01:07.511309 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.511118 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container" containerID="cri-o://bf239e83a457d8d909d100a91fc2b8c8db4169f3c39d454009733c36ce4a78fc" gracePeriod=30 Apr 17 17:01:07.511309 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.511158 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kube-rbac-proxy" containerID="cri-o://8163c9a807326e205e8d6168d3df11550d112b99a1c2515268881c86026ba5a7" gracePeriod=30 Apr 17 17:01:07.636819 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.636787 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr"] Apr 17 17:01:07.637177 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.637160 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="storage-initializer" Apr 17 17:01:07.637259 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.637180 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="storage-initializer" Apr 17 17:01:07.637259 ip-10-0-128-217 kubenswrapper[2573]: I0417 
17:01:07.637212 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" Apr 17 17:01:07.637259 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.637221 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" Apr 17 17:01:07.637259 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.637230 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kube-rbac-proxy" Apr 17 17:01:07.637259 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.637239 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kube-rbac-proxy" Apr 17 17:01:07.637509 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.637317 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kserve-container" Apr 17 17:01:07.637509 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.637330 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="87075b3b-926c-4402-a018-40390735a384" containerName="kube-rbac-proxy" Apr 17 17:01:07.640382 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.640360 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.642707 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.642684 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 17 17:01:07.643984 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.643967 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 17 17:01:07.655558 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.655468 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr"] Apr 17 17:01:07.760167 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.760135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.760336 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.760180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.760336 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.760280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.760336 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.760323 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tqfv\" (UniqueName: \"kubernetes.io/projected/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kube-api-access-7tqfv\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.861660 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.861572 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.861660 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.861626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.861660 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.861651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tqfv\" (UniqueName: 
\"kubernetes.io/projected/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kube-api-access-7tqfv\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.861951 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.861682 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.862157 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.862136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.862274 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.862254 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.864161 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.864145 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: 
\"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.870545 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.870511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tqfv\" (UniqueName: \"kubernetes.io/projected/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kube-api-access-7tqfv\") pod \"isvc-pmml-runtime-predictor-67bc544947-8mtsr\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:07.954910 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:07.954865 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:08.073178 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:08.073143 2573 generic.go:358] "Generic (PLEG): container finished" podID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerID="8163c9a807326e205e8d6168d3df11550d112b99a1c2515268881c86026ba5a7" exitCode=2 Apr 17 17:01:08.073342 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:08.073211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" event={"ID":"cf14eeb7-1c92-49a5-afde-dab65331e5d2","Type":"ContainerDied","Data":"8163c9a807326e205e8d6168d3df11550d112b99a1c2515268881c86026ba5a7"} Apr 17 17:01:08.075848 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:08.075805 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr"] Apr 17 17:01:08.078855 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:01:08.078812 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce4d0fe5_a6f1_481a_b9c9_9e3d21ea9887.slice/crio-858ade4cff155fcea21cf9c8d8d51e0f8422c84fad9b4fd048396b32f6a0e8fc WatchSource:0}: Error finding container 
858ade4cff155fcea21cf9c8d8d51e0f8422c84fad9b4fd048396b32f6a0e8fc: Status 404 returned error can't find the container with id 858ade4cff155fcea21cf9c8d8d51e0f8422c84fad9b4fd048396b32f6a0e8fc Apr 17 17:01:08.080387 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:08.080374 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:01:09.076713 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:09.076676 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" event={"ID":"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887","Type":"ContainerStarted","Data":"75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd"} Apr 17 17:01:09.076713 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:09.076719 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" event={"ID":"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887","Type":"ContainerStarted","Data":"858ade4cff155fcea21cf9c8d8d51e0f8422c84fad9b4fd048396b32f6a0e8fc"} Apr 17 17:01:10.824076 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:10.824027 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused" Apr 17 17:01:11.084849 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.084765 2573 generic.go:358] "Generic (PLEG): container finished" podID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerID="bf239e83a457d8d909d100a91fc2b8c8db4169f3c39d454009733c36ce4a78fc" exitCode=0 Apr 17 17:01:11.084976 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.084854 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" 
event={"ID":"cf14eeb7-1c92-49a5-afde-dab65331e5d2","Type":"ContainerDied","Data":"bf239e83a457d8d909d100a91fc2b8c8db4169f3c39d454009733c36ce4a78fc"} Apr 17 17:01:11.144389 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.144368 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 17:01:11.186421 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.186394 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fw8m\" (UniqueName: \"kubernetes.io/projected/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kube-api-access-9fw8m\") pod \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " Apr 17 17:01:11.186596 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.186439 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kserve-provision-location\") pod \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " Apr 17 17:01:11.186596 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.186493 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf14eeb7-1c92-49a5-afde-dab65331e5d2-proxy-tls\") pod \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " Apr 17 17:01:11.186596 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.186521 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf14eeb7-1c92-49a5-afde-dab65331e5d2-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\" (UID: \"cf14eeb7-1c92-49a5-afde-dab65331e5d2\") " Apr 17 17:01:11.186870 ip-10-0-128-217 kubenswrapper[2573]: I0417 
17:01:11.186846 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cf14eeb7-1c92-49a5-afde-dab65331e5d2" (UID: "cf14eeb7-1c92-49a5-afde-dab65331e5d2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:01:11.186943 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.186924 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf14eeb7-1c92-49a5-afde-dab65331e5d2-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "cf14eeb7-1c92-49a5-afde-dab65331e5d2" (UID: "cf14eeb7-1c92-49a5-afde-dab65331e5d2"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:01:11.188600 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.188579 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kube-api-access-9fw8m" (OuterVolumeSpecName: "kube-api-access-9fw8m") pod "cf14eeb7-1c92-49a5-afde-dab65331e5d2" (UID: "cf14eeb7-1c92-49a5-afde-dab65331e5d2"). InnerVolumeSpecName "kube-api-access-9fw8m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:01:11.188661 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.188599 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf14eeb7-1c92-49a5-afde-dab65331e5d2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cf14eeb7-1c92-49a5-afde-dab65331e5d2" (UID: "cf14eeb7-1c92-49a5-afde-dab65331e5d2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:01:11.287791 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.287757 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:01:11.287791 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.287787 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf14eeb7-1c92-49a5-afde-dab65331e5d2-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:01:11.287791 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.287798 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cf14eeb7-1c92-49a5-afde-dab65331e5d2-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:01:11.288041 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:11.287808 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9fw8m\" (UniqueName: \"kubernetes.io/projected/cf14eeb7-1c92-49a5-afde-dab65331e5d2-kube-api-access-9fw8m\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:01:12.088726 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:12.088637 2573 generic.go:358] "Generic (PLEG): container finished" podID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerID="75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd" exitCode=0 Apr 17 17:01:12.088726 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:12.088707 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" event={"ID":"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887","Type":"ContainerDied","Data":"75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd"} 
Apr 17 17:01:12.090479 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:12.090454 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" event={"ID":"cf14eeb7-1c92-49a5-afde-dab65331e5d2","Type":"ContainerDied","Data":"e6005816fc1a8c5c3c45b224213cff4153599dc4c2e078d32b3eb1b10aefad4d"} Apr 17 17:01:12.090479 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:12.090480 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz" Apr 17 17:01:12.090633 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:12.090497 2573 scope.go:117] "RemoveContainer" containerID="8163c9a807326e205e8d6168d3df11550d112b99a1c2515268881c86026ba5a7" Apr 17 17:01:12.099345 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:12.099257 2573 scope.go:117] "RemoveContainer" containerID="bf239e83a457d8d909d100a91fc2b8c8db4169f3c39d454009733c36ce4a78fc" Apr 17 17:01:12.106399 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:12.106380 2573 scope.go:117] "RemoveContainer" containerID="99f4fe173edb8107c36d963148f61092e39eda466adc525d1f4732ae154d8dd9" Apr 17 17:01:12.158589 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:12.158565 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz"] Apr 17 17:01:12.171893 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:12.171870 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rztqz"] Apr 17 17:01:13.095280 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:13.095247 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" event={"ID":"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887","Type":"ContainerStarted","Data":"5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d"} Apr 17 17:01:13.095280 ip-10-0-128-217 kubenswrapper[2573]: I0417 
17:01:13.095286 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" event={"ID":"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887","Type":"ContainerStarted","Data":"4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b"} Apr 17 17:01:13.095808 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:13.095509 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:13.118584 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:13.118535 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podStartSLOduration=6.118521174 podStartE2EDuration="6.118521174s" podCreationTimestamp="2026-04-17 17:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:01:13.116299753 +0000 UTC m=+1790.447800654" watchObservedRunningTime="2026-04-17 17:01:13.118521174 +0000 UTC m=+1790.450022076" Apr 17 17:01:13.174402 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:13.174369 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" path="/var/lib/kubelet/pods/cf14eeb7-1c92-49a5-afde-dab65331e5d2/volumes" Apr 17 17:01:14.099131 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:14.099102 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:14.100331 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:14.100302 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: 
connection refused" Apr 17 17:01:15.101488 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:15.101438 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 17:01:20.106334 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:20.106303 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" Apr 17 17:01:20.106931 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:20.106901 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 17:01:23.168933 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:23.168911 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:01:23.169607 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:23.169588 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:01:30.107096 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:30.107053 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 17:01:40.107655 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:40.107616 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 17:01:50.107272 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:01:50.107229 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 17:02:00.107457 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:00.107420 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 17:02:10.107064 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:10.107014 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 17:02:20.107094 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:20.107055 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 17:02:30.107377 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:30.107338 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" 
podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 17 17:02:40.107672 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:40.107591 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr"
Apr 17 17:02:48.735227 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.735178 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr"]
Apr 17 17:02:48.735713 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.735494 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" containerID="cri-o://4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b" gracePeriod=30
Apr 17 17:02:48.735713 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.735550 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kube-rbac-proxy" containerID="cri-o://5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d" gracePeriod=30
Apr 17 17:02:48.836664 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.836631 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"]
Apr 17 17:02:48.836969 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.836957 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kube-rbac-proxy"
Apr 17 17:02:48.837016 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.836971 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kube-rbac-proxy"
Apr 17 17:02:48.837016 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.836986 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="storage-initializer"
Apr 17 17:02:48.837016 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.836991 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="storage-initializer"
Apr 17 17:02:48.837016 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.837007 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container"
Apr 17 17:02:48.837016 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.837013 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container"
Apr 17 17:02:48.837157 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.837059 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kserve-container"
Apr 17 17:02:48.837157 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.837070 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf14eeb7-1c92-49a5-afde-dab65331e5d2" containerName="kube-rbac-proxy"
Apr 17 17:02:48.840104 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.840083 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:48.842900 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.842873 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\""
Apr 17 17:02:48.843635 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.843615 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 17 17:02:48.851129 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.851104 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"]
Apr 17 17:02:48.911279 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.911241 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:48.911279 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.911286 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:48.911494 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.911354 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:48.911494 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:48.911419 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzpt7\" (UniqueName: \"kubernetes.io/projected/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kube-api-access-qzpt7\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.011844 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.011737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.011844 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.011800 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.012041 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.011886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzpt7\" (UniqueName: \"kubernetes.io/projected/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kube-api-access-qzpt7\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.012041 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.011925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.012248 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.012225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.012483 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.012461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.014291 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.014274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.020159 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.020136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzpt7\" (UniqueName: \"kubernetes.io/projected/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kube-api-access-qzpt7\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.151389 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.151352 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:49.276227 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.276063 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"]
Apr 17 17:02:49.278450 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:02:49.278421 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b2fb44_a01f_4605_8e5d_5b706a28f0b9.slice/crio-4458da7317d0d97d73b0d8f271813a9a7005cad91673364785e5a99b831eeacc WatchSource:0}: Error finding container 4458da7317d0d97d73b0d8f271813a9a7005cad91673364785e5a99b831eeacc: Status 404 returned error can't find the container with id 4458da7317d0d97d73b0d8f271813a9a7005cad91673364785e5a99b831eeacc
Apr 17 17:02:49.376185 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.376147 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" event={"ID":"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9","Type":"ContainerStarted","Data":"c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0"}
Apr 17 17:02:49.376185 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.376191 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" event={"ID":"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9","Type":"ContainerStarted","Data":"4458da7317d0d97d73b0d8f271813a9a7005cad91673364785e5a99b831eeacc"}
Apr 17 17:02:49.378034 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.378006 2573 generic.go:358] "Generic (PLEG): container finished" podID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerID="5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d" exitCode=2
Apr 17 17:02:49.378153 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:49.378066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" event={"ID":"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887","Type":"ContainerDied","Data":"5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d"}
Apr 17 17:02:50.102347 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:50.102300 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused"
Apr 17 17:02:50.107708 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:50.107677 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 17 17:02:52.382735 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.382713 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr"
Apr 17 17:02:52.388447 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.388416 2573 generic.go:358] "Generic (PLEG): container finished" podID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerID="4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b" exitCode=0
Apr 17 17:02:52.388586 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.388502 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr"
Apr 17 17:02:52.388586 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.388495 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" event={"ID":"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887","Type":"ContainerDied","Data":"4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b"}
Apr 17 17:02:52.388663 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.388606 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr" event={"ID":"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887","Type":"ContainerDied","Data":"858ade4cff155fcea21cf9c8d8d51e0f8422c84fad9b4fd048396b32f6a0e8fc"}
Apr 17 17:02:52.388663 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.388629 2573 scope.go:117] "RemoveContainer" containerID="5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d"
Apr 17 17:02:52.395954 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.395933 2573 scope.go:117] "RemoveContainer" containerID="4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b"
Apr 17 17:02:52.402866 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.402845 2573 scope.go:117] "RemoveContainer" containerID="75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd"
Apr 17 17:02:52.410975 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.410945 2573 scope.go:117] "RemoveContainer" containerID="5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d"
Apr 17 17:02:52.411240 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:02:52.411221 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d\": container with ID starting with 5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d not found: ID does not exist" containerID="5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d"
Apr 17 17:02:52.411292 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.411250 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d"} err="failed to get container status \"5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d\": rpc error: code = NotFound desc = could not find container \"5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d\": container with ID starting with 5f631f058678b6a983101278b20f00de0ac7bbabb6a3f8f47aba7201342e023d not found: ID does not exist"
Apr 17 17:02:52.411292 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.411270 2573 scope.go:117] "RemoveContainer" containerID="4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b"
Apr 17 17:02:52.411523 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:02:52.411497 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b\": container with ID starting with 4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b not found: ID does not exist" containerID="4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b"
Apr 17 17:02:52.411569 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.411535 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b"} err="failed to get container status \"4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b\": rpc error: code = NotFound desc = could not find container \"4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b\": container with ID starting with 4159f4de2d261e4be493728c5202fc6d6f7fe8d6dd892d99d0232ee7c29b429b not found: ID does not exist"
Apr 17 17:02:52.411569 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.411560 2573 scope.go:117] "RemoveContainer" containerID="75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd"
Apr 17 17:02:52.411808 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:02:52.411789 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd\": container with ID starting with 75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd not found: ID does not exist" containerID="75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd"
Apr 17 17:02:52.411885 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.411816 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd"} err="failed to get container status \"75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd\": rpc error: code = NotFound desc = could not find container \"75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd\": container with ID starting with 75de99432254c766fa3fbc9358912350c85de238a5a6dcc1cb3e5398265d1bfd not found: ID does not exist"
Apr 17 17:02:52.439206 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.439180 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tqfv\" (UniqueName: \"kubernetes.io/projected/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kube-api-access-7tqfv\") pod \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") "
Apr 17 17:02:52.439314 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.439238 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") "
Apr 17 17:02:52.439314 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.439281 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kserve-provision-location\") pod \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") "
Apr 17 17:02:52.439419 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.439342 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-proxy-tls\") pod \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\" (UID: \"ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887\") "
Apr 17 17:02:52.439610 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.439587 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" (UID: "ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:02:52.439665 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.439616 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" (UID: "ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:02:52.441192 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.441164 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kube-api-access-7tqfv" (OuterVolumeSpecName: "kube-api-access-7tqfv") pod "ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" (UID: "ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887"). InnerVolumeSpecName "kube-api-access-7tqfv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:02:52.441287 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.441236 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" (UID: "ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:02:52.540909 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.540870 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:02:52.540909 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.540902 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:02:52.540909 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.540912 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:02:52.540909 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.540921 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7tqfv\" (UniqueName: \"kubernetes.io/projected/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887-kube-api-access-7tqfv\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:02:52.710369 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.710340 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr"]
Apr 17 17:02:52.714519 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:52.714495 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-8mtsr"]
Apr 17 17:02:53.174967 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:53.174934 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" path="/var/lib/kubelet/pods/ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887/volumes"
Apr 17 17:02:53.392855 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:53.392750 2573 generic.go:358] "Generic (PLEG): container finished" podID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerID="c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0" exitCode=0
Apr 17 17:02:53.392855 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:53.392845 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" event={"ID":"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9","Type":"ContainerDied","Data":"c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0"}
Apr 17 17:02:54.398603 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:54.398572 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" event={"ID":"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9","Type":"ContainerStarted","Data":"7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785"}
Apr 17 17:02:54.398603 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:54.398609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" event={"ID":"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9","Type":"ContainerStarted","Data":"3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f"}
Apr 17 17:02:54.399054 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:54.398895 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:54.399054 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:54.399032 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:02:54.400311 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:54.400284 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:02:54.433622 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:54.433576 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podStartSLOduration=6.433562863 podStartE2EDuration="6.433562863s" podCreationTimestamp="2026-04-17 17:02:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:02:54.430999844 +0000 UTC m=+1891.762500768" watchObservedRunningTime="2026-04-17 17:02:54.433562863 +0000 UTC m=+1891.765063769"
Apr 17 17:02:55.401062 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:02:55.401017 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:03:00.405911 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:03:00.405881 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:03:00.406456 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:03:00.406431 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:03:10.407262 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:03:10.407210 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:03:20.407368 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:03:20.407331 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:03:30.406498 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:03:30.406455 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:03:40.406505 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:03:40.406463 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:03:50.407418 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:03:50.407320 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:04:00.406608 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:00.406566 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:04:10.406544 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:10.406458 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 17 17:04:15.175509 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:15.175476 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:04:19.812370 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:19.812340 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"]
Apr 17 17:04:19.815010 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:19.812676 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" containerID="cri-o://3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f" gracePeriod=30
Apr 17 17:04:19.815010 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:19.812718 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kube-rbac-proxy" containerID="cri-o://7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785" gracePeriod=30
Apr 17 17:04:20.402131 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:20.402088 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 17 17:04:20.636759 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:20.636728 2573 generic.go:358] "Generic (PLEG): container finished" podID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerID="7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785" exitCode=2
Apr 17 17:04:20.636967 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:20.636792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" event={"ID":"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9","Type":"ContainerDied","Data":"7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785"}
Apr 17 17:04:23.456944 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.456917 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:04:23.564484 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.564396 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kserve-provision-location\") pod \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") "
Apr 17 17:04:23.564484 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.564443 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") "
Apr 17 17:04:23.564484 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.564487 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-proxy-tls\") pod \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") "
Apr 17 17:04:23.564756 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.564528 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzpt7\" (UniqueName: \"kubernetes.io/projected/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kube-api-access-qzpt7\") pod \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\" (UID: \"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9\") "
Apr 17 17:04:23.564864 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.564807 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" (UID: "a8b2fb44-a01f-4605-8e5d-5b706a28f0b9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:04:23.564934 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.564889 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" (UID: "a8b2fb44-a01f-4605-8e5d-5b706a28f0b9"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:04:23.566693 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.566671 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" (UID: "a8b2fb44-a01f-4605-8e5d-5b706a28f0b9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:04:23.566769 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.566717 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kube-api-access-qzpt7" (OuterVolumeSpecName: "kube-api-access-qzpt7") pod "a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" (UID: "a8b2fb44-a01f-4605-8e5d-5b706a28f0b9"). InnerVolumeSpecName "kube-api-access-qzpt7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:04:23.650770 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.650737 2573 generic.go:358] "Generic (PLEG): container finished" podID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerID="3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f" exitCode=0
Apr 17 17:04:23.650989 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.650809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" event={"ID":"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9","Type":"ContainerDied","Data":"3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f"}
Apr 17 17:04:23.650989 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.650848 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"
Apr 17 17:04:23.650989 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.650868 2573 scope.go:117] "RemoveContainer" containerID="7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785"
Apr 17 17:04:23.650989 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.650857 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn" event={"ID":"a8b2fb44-a01f-4605-8e5d-5b706a28f0b9","Type":"ContainerDied","Data":"4458da7317d0d97d73b0d8f271813a9a7005cad91673364785e5a99b831eeacc"}
Apr 17 17:04:23.659278 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.659257 2573 scope.go:117] "RemoveContainer" containerID="3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f"
Apr 17 17:04:23.665285 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.665265 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\"
DevicePath \"\"" Apr 17 17:04:23.665285 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.665287 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:04:23.665424 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.665307 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:04:23.665424 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.665320 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzpt7\" (UniqueName: \"kubernetes.io/projected/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9-kube-api-access-qzpt7\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:04:23.666288 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.666275 2573 scope.go:117] "RemoveContainer" containerID="c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0" Apr 17 17:04:23.673217 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.673195 2573 scope.go:117] "RemoveContainer" containerID="7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785" Apr 17 17:04:23.673492 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:04:23.673471 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785\": container with ID starting with 7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785 not found: ID does not exist" containerID="7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785" Apr 17 17:04:23.673557 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.673502 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785"} err="failed to get container status \"7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785\": rpc error: code = NotFound desc = could not find container \"7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785\": container with ID starting with 7661fffdf39f12c70dbac92358ccf64b198f9549c7d8ed77506bfabdb2b2e785 not found: ID does not exist" Apr 17 17:04:23.673557 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.673521 2573 scope.go:117] "RemoveContainer" containerID="3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f" Apr 17 17:04:23.673763 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:04:23.673748 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f\": container with ID starting with 3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f not found: ID does not exist" containerID="3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f" Apr 17 17:04:23.673801 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.673767 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f"} err="failed to get container status \"3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f\": rpc error: code = NotFound desc = could not find container \"3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f\": container with ID starting with 3e1e4d106dcc48c316b95a423fd73a07ce52f66647c315c212373443fd0a918f not found: ID does not exist" Apr 17 17:04:23.673801 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.673780 2573 scope.go:117] "RemoveContainer" 
containerID="c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0" Apr 17 17:04:23.673890 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.673847 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"] Apr 17 17:04:23.674053 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:04:23.674035 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0\": container with ID starting with c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0 not found: ID does not exist" containerID="c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0" Apr 17 17:04:23.674094 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.674058 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0"} err="failed to get container status \"c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0\": rpc error: code = NotFound desc = could not find container \"c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0\": container with ID starting with c61c40dfb92dec4a64e7aa00ee664c07d19ad355e1001d83a386e5a6fe753ab0 not found: ID does not exist" Apr 17 17:04:23.677747 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:23.677724 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-6k6cn"] Apr 17 17:04:25.174583 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:04:25.174548 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" path="/var/lib/kubelet/pods/a8b2fb44-a01f-4605-8e5d-5b706a28f0b9/volumes" Apr 17 17:06:10.390051 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390013 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n"] Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390302 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390313 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390322 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="storage-initializer" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390327 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="storage-initializer" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390336 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kube-rbac-proxy" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390343 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kube-rbac-proxy" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390351 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390357 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390363 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="storage-initializer" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390368 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="storage-initializer" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390375 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kube-rbac-proxy" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390381 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kube-rbac-proxy" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390423 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kube-rbac-proxy" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390432 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kserve-container" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390441 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce4d0fe5-a6f1-481a-b9c9-9e3d21ea9887" containerName="kube-rbac-proxy" Apr 17 17:06:10.390566 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.390450 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8b2fb44-a01f-4605-8e5d-5b706a28f0b9" containerName="kserve-container" Apr 17 17:06:10.393555 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.393537 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.396125 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.396098 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:06:10.396271 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.396182 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 17 17:06:10.396271 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.396221 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzvpv\"" Apr 17 17:06:10.396411 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.396397 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 17 17:06:10.397084 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.397065 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:06:10.403354 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.403330 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n"] Apr 17 17:06:10.485254 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.485215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.485254 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.485255 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.485522 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.485342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.485522 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.485386 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cph9j\" (UniqueName: \"kubernetes.io/projected/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kube-api-access-cph9j\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.586205 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.586169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.586308 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.586210 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.586308 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.586243 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.586308 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.586260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cph9j\" (UniqueName: \"kubernetes.io/projected/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kube-api-access-cph9j\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.586446 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:06:10.586323 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-serving-cert: secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 17 17:06:10.586446 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:06:10.586405 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-proxy-tls podName:fc27d58f-3321-4de8-a11f-3eabc0e0b08e nodeName:}" failed. 
No retries permitted until 2026-04-17 17:06:11.086385835 +0000 UTC m=+2088.417886722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-proxy-tls") pod "isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" (UID: "fc27d58f-3321-4de8-a11f-3eabc0e0b08e") : secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 17 17:06:10.586620 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.586600 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.586981 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.586956 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:10.595308 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:10.595283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cph9j\" (UniqueName: \"kubernetes.io/projected/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kube-api-access-cph9j\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:11.089044 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:11.089008 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:11.091403 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:11.091379 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:11.305482 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:11.305441 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:11.433262 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:11.433231 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n"] Apr 17 17:06:11.436469 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:06:11.436436 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc27d58f_3321_4de8_a11f_3eabc0e0b08e.slice/crio-d428922cfe937ace3073604b5989eedfd2127d7a62953191a4fe358204803601 WatchSource:0}: Error finding container d428922cfe937ace3073604b5989eedfd2127d7a62953191a4fe358204803601: Status 404 returned error can't find the container with id d428922cfe937ace3073604b5989eedfd2127d7a62953191a4fe358204803601 Apr 17 17:06:11.438190 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:11.438173 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:06:11.959091 ip-10-0-128-217 
kubenswrapper[2573]: I0417 17:06:11.959056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" event={"ID":"fc27d58f-3321-4de8-a11f-3eabc0e0b08e","Type":"ContainerStarted","Data":"90bdc75e78f6de35583355c79a5cf5468d2e7ac82f537c3f5535b68a40d366bd"} Apr 17 17:06:11.959091 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:11.959093 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" event={"ID":"fc27d58f-3321-4de8-a11f-3eabc0e0b08e","Type":"ContainerStarted","Data":"d428922cfe937ace3073604b5989eedfd2127d7a62953191a4fe358204803601"} Apr 17 17:06:15.972438 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:15.972400 2573 generic.go:358] "Generic (PLEG): container finished" podID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerID="90bdc75e78f6de35583355c79a5cf5468d2e7ac82f537c3f5535b68a40d366bd" exitCode=0 Apr 17 17:06:15.972936 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:15.972485 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" event={"ID":"fc27d58f-3321-4de8-a11f-3eabc0e0b08e","Type":"ContainerDied","Data":"90bdc75e78f6de35583355c79a5cf5468d2e7ac82f537c3f5535b68a40d366bd"} Apr 17 17:06:26.150869 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:26.150837 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:06:26.151304 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:26.150935 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:06:39.055643 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:39.055608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" event={"ID":"fc27d58f-3321-4de8-a11f-3eabc0e0b08e","Type":"ContainerStarted","Data":"abddac06cfb2664780e30a9abd80c37436478a3fceb3d00c0118d40390797847"} Apr 17 17:06:39.056025 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:39.055652 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" event={"ID":"fc27d58f-3321-4de8-a11f-3eabc0e0b08e","Type":"ContainerStarted","Data":"c5e4638d86a7d7acdd2dfadf510f651ec74fddaac11a2731b1e71ca58c059e65"} Apr 17 17:06:39.056025 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:39.055949 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:39.056107 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:39.056075 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:39.057372 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:39.057344 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 17:06:39.073607 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:39.073559 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podStartSLOduration=6.952006354 podStartE2EDuration="29.073542726s" podCreationTimestamp="2026-04-17 17:06:10 +0000 UTC" firstStartedPulling="2026-04-17 17:06:15.973771178 +0000 UTC m=+2093.305272059" lastFinishedPulling="2026-04-17 17:06:38.095307548 +0000 UTC m=+2115.426808431" observedRunningTime="2026-04-17 
17:06:39.071859195 +0000 UTC m=+2116.403360093" watchObservedRunningTime="2026-04-17 17:06:39.073542726 +0000 UTC m=+2116.405043627" Apr 17 17:06:40.059093 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:40.059054 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 17:06:45.064807 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:45.064778 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" Apr 17 17:06:45.065434 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:45.065407 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 17:06:55.066293 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:06:55.066252 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 17:07:05.066042 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:07:05.065999 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 17:07:15.065602 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:07:15.065516 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 17 17:07:25.065520 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:07:25.065481 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 17 17:07:35.065943 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:07:35.065894 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 17 17:07:45.066233 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:07:45.066188 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 17 17:07:48.171017 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:07:48.170962 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 17 17:07:58.172384 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:07:58.172349 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n"
Apr 17 17:08:00.552665 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.552631 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n"]
Apr 17 17:08:00.553173 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.552950 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container" containerID="cri-o://c5e4638d86a7d7acdd2dfadf510f651ec74fddaac11a2731b1e71ca58c059e65" gracePeriod=30
Apr 17 17:08:00.553173 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.553008 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kube-rbac-proxy" containerID="cri-o://abddac06cfb2664780e30a9abd80c37436478a3fceb3d00c0118d40390797847" gracePeriod=30
Apr 17 17:08:00.663510 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.663477 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"]
Apr 17 17:08:00.667374 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.667351 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.672238 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.672218 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\""
Apr 17 17:08:00.672338 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.672218 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\""
Apr 17 17:08:00.678508 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.678485 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"]
Apr 17 17:08:00.746531 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.746489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87b1a6ba-da8b-4e0e-a68e-cf1832168331-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.746531 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.746531 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.746743 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.746626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87b1a6ba-da8b-4e0e-a68e-cf1832168331-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.746743 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.746664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l26ds\" (UniqueName: \"kubernetes.io/projected/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kube-api-access-l26ds\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.847921 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.847815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87b1a6ba-da8b-4e0e-a68e-cf1832168331-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.847921 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.847867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.847921 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.847914 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87b1a6ba-da8b-4e0e-a68e-cf1832168331-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.848195 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.847934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l26ds\" (UniqueName: \"kubernetes.io/projected/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kube-api-access-l26ds\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.848357 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.848333 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.848550 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.848530 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87b1a6ba-da8b-4e0e-a68e-cf1832168331-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.850254 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.850231 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87b1a6ba-da8b-4e0e-a68e-cf1832168331-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.855742 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.855721 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l26ds\" (UniqueName: \"kubernetes.io/projected/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kube-api-access-l26ds\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:00.977107 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:00.977070 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:01.095470 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:01.095439 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"]
Apr 17 17:08:01.098374 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:08:01.098320 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b1a6ba_da8b_4e0e_a68e_cf1832168331.slice/crio-451017621f3be010e0bc7d81e39ca9bc4fc718ec648d51d56a013e0b0d342241 WatchSource:0}: Error finding container 451017621f3be010e0bc7d81e39ca9bc4fc718ec648d51d56a013e0b0d342241: Status 404 returned error can't find the container with id 451017621f3be010e0bc7d81e39ca9bc4fc718ec648d51d56a013e0b0d342241
Apr 17 17:08:01.289846 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:01.289786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" event={"ID":"87b1a6ba-da8b-4e0e-a68e-cf1832168331","Type":"ContainerStarted","Data":"a9bdd073441d10b5929458d7ebef68a607c20200e139c2a7649e7c1869b989f3"}
Apr 17 17:08:01.289846 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:01.289849 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" event={"ID":"87b1a6ba-da8b-4e0e-a68e-cf1832168331","Type":"ContainerStarted","Data":"451017621f3be010e0bc7d81e39ca9bc4fc718ec648d51d56a013e0b0d342241"}
Apr 17 17:08:01.291641 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:01.291616 2573 generic.go:358] "Generic (PLEG): container finished" podID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerID="abddac06cfb2664780e30a9abd80c37436478a3fceb3d00c0118d40390797847" exitCode=2
Apr 17 17:08:01.291753 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:01.291691 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" event={"ID":"fc27d58f-3321-4de8-a11f-3eabc0e0b08e","Type":"ContainerDied","Data":"abddac06cfb2664780e30a9abd80c37436478a3fceb3d00c0118d40390797847"}
Apr 17 17:08:05.059982 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.059895 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused"
Apr 17 17:08:05.304817 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.304787 2573 generic.go:358] "Generic (PLEG): container finished" podID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerID="a9bdd073441d10b5929458d7ebef68a607c20200e139c2a7649e7c1869b989f3" exitCode=0
Apr 17 17:08:05.305014 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.304861 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" event={"ID":"87b1a6ba-da8b-4e0e-a68e-cf1832168331","Type":"ContainerDied","Data":"a9bdd073441d10b5929458d7ebef68a607c20200e139c2a7649e7c1869b989f3"}
Apr 17 17:08:05.306953 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.306930 2573 generic.go:358] "Generic (PLEG): container finished" podID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerID="c5e4638d86a7d7acdd2dfadf510f651ec74fddaac11a2731b1e71ca58c059e65" exitCode=0
Apr 17 17:08:05.307070 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.307009 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" event={"ID":"fc27d58f-3321-4de8-a11f-3eabc0e0b08e","Type":"ContainerDied","Data":"c5e4638d86a7d7acdd2dfadf510f651ec74fddaac11a2731b1e71ca58c059e65"}
Apr 17 17:08:05.411217 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.411190 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n"
Apr 17 17:08:05.489482 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.489448 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") "
Apr 17 17:08:05.489733 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.489528 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cph9j\" (UniqueName: \"kubernetes.io/projected/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kube-api-access-cph9j\") pod \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") "
Apr 17 17:08:05.489733 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.489576 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-proxy-tls\") pod \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") "
Apr 17 17:08:05.489733 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.489597 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kserve-provision-location\") pod \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\" (UID: \"fc27d58f-3321-4de8-a11f-3eabc0e0b08e\") "
Apr 17 17:08:05.489949 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.489917 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fc27d58f-3321-4de8-a11f-3eabc0e0b08e" (UID: "fc27d58f-3321-4de8-a11f-3eabc0e0b08e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:08:05.490001 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.489948 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "fc27d58f-3321-4de8-a11f-3eabc0e0b08e" (UID: "fc27d58f-3321-4de8-a11f-3eabc0e0b08e"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:08:05.491660 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.491638 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kube-api-access-cph9j" (OuterVolumeSpecName: "kube-api-access-cph9j") pod "fc27d58f-3321-4de8-a11f-3eabc0e0b08e" (UID: "fc27d58f-3321-4de8-a11f-3eabc0e0b08e"). InnerVolumeSpecName "kube-api-access-cph9j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:08:05.491741 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.491660 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fc27d58f-3321-4de8-a11f-3eabc0e0b08e" (UID: "fc27d58f-3321-4de8-a11f-3eabc0e0b08e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:08:05.591120 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.591031 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:08:05.591120 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.591063 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:08:05.591120 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.591075 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:08:05.591120 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:05.591085 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cph9j\" (UniqueName: \"kubernetes.io/projected/fc27d58f-3321-4de8-a11f-3eabc0e0b08e-kube-api-access-cph9j\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:08:06.311295 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.311254 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n" event={"ID":"fc27d58f-3321-4de8-a11f-3eabc0e0b08e","Type":"ContainerDied","Data":"d428922cfe937ace3073604b5989eedfd2127d7a62953191a4fe358204803601"}
Apr 17 17:08:06.311767 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.311315 2573 scope.go:117] "RemoveContainer" containerID="abddac06cfb2664780e30a9abd80c37436478a3fceb3d00c0118d40390797847"
Apr 17 17:08:06.311767 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.311323 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n"
Apr 17 17:08:06.313308 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.313272 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" event={"ID":"87b1a6ba-da8b-4e0e-a68e-cf1832168331","Type":"ContainerStarted","Data":"cfe7621dd4e57381ebdf62610ee06167807681dc9e0e21603f46e3049175133b"}
Apr 17 17:08:06.313411 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.313320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" event={"ID":"87b1a6ba-da8b-4e0e-a68e-cf1832168331","Type":"ContainerStarted","Data":"c147756cd98473a946ae4bd85c37023c4283b5f59ccc8c71d4b183d727f87fef"}
Apr 17 17:08:06.313656 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.313630 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:06.313716 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.313669 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:06.315084 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.315058 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 17 17:08:06.319889 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.319807 2573 scope.go:117] "RemoveContainer" containerID="c5e4638d86a7d7acdd2dfadf510f651ec74fddaac11a2731b1e71ca58c059e65"
Apr 17 17:08:06.326556 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.326540 2573 scope.go:117] "RemoveContainer" containerID="90bdc75e78f6de35583355c79a5cf5468d2e7ac82f537c3f5535b68a40d366bd"
Apr 17 17:08:06.334220 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.334176 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podStartSLOduration=6.334166187 podStartE2EDuration="6.334166187s" podCreationTimestamp="2026-04-17 17:08:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:08:06.332911028 +0000 UTC m=+2203.664411933" watchObservedRunningTime="2026-04-17 17:08:06.334166187 +0000 UTC m=+2203.665667089"
Apr 17 17:08:06.345099 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.345078 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n"]
Apr 17 17:08:06.348698 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:06.348676 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-pqm9n"]
Apr 17 17:08:07.179125 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:07.179091 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" path="/var/lib/kubelet/pods/fc27d58f-3321-4de8-a11f-3eabc0e0b08e/volumes"
Apr 17 17:08:07.317049 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:07.317015 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 17 17:08:12.321740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:12.321705 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:08:12.322243 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:12.322179 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 17 17:08:22.322170 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:22.322130 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 17 17:08:32.322353 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:32.322315 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 17 17:08:42.323077 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:42.322988 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 17 17:08:52.322285 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:08:52.322249 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 17 17:09:02.322299 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:02.322262 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 17 17:09:12.323134 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:12.323092 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 17 17:09:22.323711 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:22.323685 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"
Apr 17 17:09:30.787268 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.787234 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"]
Apr 17 17:09:30.787764 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.787710 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" containerID="cri-o://c147756cd98473a946ae4bd85c37023c4283b5f59ccc8c71d4b183d727f87fef" gracePeriod=30
Apr 17 17:09:30.787860 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.787757 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kube-rbac-proxy" containerID="cri-o://cfe7621dd4e57381ebdf62610ee06167807681dc9e0e21603f46e3049175133b" gracePeriod=30
Apr 17 17:09:30.884077 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.884045 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"]
Apr 17 17:09:30.884370 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.884358 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container"
Apr 17 17:09:30.884416 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.884372 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container"
Apr 17 17:09:30.884416 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.884399 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="storage-initializer"
Apr 17 17:09:30.884416 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.884405 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="storage-initializer"
Apr 17 17:09:30.884416 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.884413 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kube-rbac-proxy"
Apr 17 17:09:30.884560 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.884418 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kube-rbac-proxy"
Apr 17 17:09:30.884560 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.884463 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kube-rbac-proxy"
Apr 17 17:09:30.884560 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.884471 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc27d58f-3321-4de8-a11f-3eabc0e0b08e" containerName="kserve-container"
Apr 17 17:09:30.887482 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.887467 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:30.889777 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.889759 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\""
Apr 17 17:09:30.889879 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.889766 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\""
Apr 17 17:09:30.896294 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.896265 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"]
Apr 17 17:09:30.939503 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.939465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/979422e9-0546-4ccf-919f-ba9134153bfb-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:30.939503 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.939506 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/979422e9-0546-4ccf-919f-ba9134153bfb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:30.939712 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.939540 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxct5\" (UniqueName: \"kubernetes.io/projected/979422e9-0546-4ccf-919f-ba9134153bfb-kube-api-access-cxct5\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:30.939712 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:30.939595 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/979422e9-0546-4ccf-919f-ba9134153bfb-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.040011 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.039928 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/979422e9-0546-4ccf-919f-ba9134153bfb-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.040011 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.039967 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/979422e9-0546-4ccf-919f-ba9134153bfb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.040217 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.040012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxct5\" (UniqueName: \"kubernetes.io/projected/979422e9-0546-4ccf-919f-ba9134153bfb-kube-api-access-cxct5\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.040217 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.040074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/979422e9-0546-4ccf-919f-ba9134153bfb-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.040217 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:09:31.040098 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-serving-cert: secret "isvc-predictive-lightgbm-predictor-serving-cert" not found
Apr 17 17:09:31.040217 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:09:31.040177 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/979422e9-0546-4ccf-919f-ba9134153bfb-proxy-tls podName:979422e9-0546-4ccf-919f-ba9134153bfb nodeName:}" failed. No retries permitted until 2026-04-17 17:09:31.540155071 +0000 UTC m=+2288.871655955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/979422e9-0546-4ccf-919f-ba9134153bfb-proxy-tls") pod "isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" (UID: "979422e9-0546-4ccf-919f-ba9134153bfb") : secret "isvc-predictive-lightgbm-predictor-serving-cert" not found
Apr 17 17:09:31.040446 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.040429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/979422e9-0546-4ccf-919f-ba9134153bfb-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.040893 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.040868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/979422e9-0546-4ccf-919f-ba9134153bfb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.050439 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.050409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxct5\" (UniqueName: \"kubernetes.io/projected/979422e9-0546-4ccf-919f-ba9134153bfb-kube-api-access-cxct5\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.545307 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.545257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/979422e9-0546-4ccf-919f-ba9134153bfb-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.547741 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.547713 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/979422e9-0546-4ccf-919f-ba9134153bfb-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"
Apr 17 17:09:31.562701 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.562671 2573 generic.go:358] "Generic (PLEG): container finished" podID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerID="cfe7621dd4e57381ebdf62610ee06167807681dc9e0e21603f46e3049175133b" exitCode=2
Apr 17 17:09:31.562844 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.562731 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" event={"ID":"87b1a6ba-da8b-4e0e-a68e-cf1832168331","Type":"ContainerDied","Data":"cfe7621dd4e57381ebdf62610ee06167807681dc9e0e21603f46e3049175133b"}
Apr 17 17:09:31.798482 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.798386 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" Apr 17 17:09:31.922047 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:31.922008 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"] Apr 17 17:09:31.925940 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:09:31.925903 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod979422e9_0546_4ccf_919f_ba9134153bfb.slice/crio-4290dfd401ca7a09087139daa9c579a4e9c11e1fc3e3cfd40bb780cc8548f487 WatchSource:0}: Error finding container 4290dfd401ca7a09087139daa9c579a4e9c11e1fc3e3cfd40bb780cc8548f487: Status 404 returned error can't find the container with id 4290dfd401ca7a09087139daa9c579a4e9c11e1fc3e3cfd40bb780cc8548f487 Apr 17 17:09:32.317977 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:32.317931 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 17 17:09:32.322216 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:32.322192 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 17:09:32.567026 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:32.566986 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" 
event={"ID":"979422e9-0546-4ccf-919f-ba9134153bfb","Type":"ContainerStarted","Data":"ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9"} Apr 17 17:09:32.567026 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:32.567027 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" event={"ID":"979422e9-0546-4ccf-919f-ba9134153bfb","Type":"ContainerStarted","Data":"4290dfd401ca7a09087139daa9c579a4e9c11e1fc3e3cfd40bb780cc8548f487"} Apr 17 17:09:35.577731 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.577696 2573 generic.go:358] "Generic (PLEG): container finished" podID="979422e9-0546-4ccf-919f-ba9134153bfb" containerID="ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9" exitCode=0 Apr 17 17:09:35.578197 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.577768 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" event={"ID":"979422e9-0546-4ccf-919f-ba9134153bfb","Type":"ContainerDied","Data":"ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9"} Apr 17 17:09:35.579878 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.579854 2573 generic.go:358] "Generic (PLEG): container finished" podID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerID="c147756cd98473a946ae4bd85c37023c4283b5f59ccc8c71d4b183d727f87fef" exitCode=0 Apr 17 17:09:35.580000 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.579922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" event={"ID":"87b1a6ba-da8b-4e0e-a68e-cf1832168331","Type":"ContainerDied","Data":"c147756cd98473a946ae4bd85c37023c4283b5f59ccc8c71d4b183d727f87fef"} Apr 17 17:09:35.640540 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.640516 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" Apr 17 17:09:35.788926 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.788882 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87b1a6ba-da8b-4e0e-a68e-cf1832168331-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " Apr 17 17:09:35.788926 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.788931 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87b1a6ba-da8b-4e0e-a68e-cf1832168331-proxy-tls\") pod \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " Apr 17 17:09:35.789169 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.788967 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kserve-provision-location\") pod \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " Apr 17 17:09:35.789169 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.789080 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l26ds\" (UniqueName: \"kubernetes.io/projected/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kube-api-access-l26ds\") pod \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\" (UID: \"87b1a6ba-da8b-4e0e-a68e-cf1832168331\") " Apr 17 17:09:35.789248 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.789231 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"87b1a6ba-da8b-4e0e-a68e-cf1832168331" (UID: "87b1a6ba-da8b-4e0e-a68e-cf1832168331"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:09:35.789295 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.789245 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b1a6ba-da8b-4e0e-a68e-cf1832168331-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "87b1a6ba-da8b-4e0e-a68e-cf1832168331" (UID: "87b1a6ba-da8b-4e0e-a68e-cf1832168331"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:09:35.789415 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.789391 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:09:35.789415 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.789415 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87b1a6ba-da8b-4e0e-a68e-cf1832168331-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:09:35.791248 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.791222 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b1a6ba-da8b-4e0e-a68e-cf1832168331-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "87b1a6ba-da8b-4e0e-a68e-cf1832168331" (UID: "87b1a6ba-da8b-4e0e-a68e-cf1832168331"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:09:35.791361 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.791230 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kube-api-access-l26ds" (OuterVolumeSpecName: "kube-api-access-l26ds") pod "87b1a6ba-da8b-4e0e-a68e-cf1832168331" (UID: "87b1a6ba-da8b-4e0e-a68e-cf1832168331"). InnerVolumeSpecName "kube-api-access-l26ds". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:09:35.889935 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.889900 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l26ds\" (UniqueName: \"kubernetes.io/projected/87b1a6ba-da8b-4e0e-a68e-cf1832168331-kube-api-access-l26ds\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:09:35.889935 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:35.889928 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87b1a6ba-da8b-4e0e-a68e-cf1832168331-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:09:36.584793 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.584763 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" Apr 17 17:09:36.585280 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.584756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj" event={"ID":"87b1a6ba-da8b-4e0e-a68e-cf1832168331","Type":"ContainerDied","Data":"451017621f3be010e0bc7d81e39ca9bc4fc718ec648d51d56a013e0b0d342241"} Apr 17 17:09:36.585280 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.584895 2573 scope.go:117] "RemoveContainer" containerID="cfe7621dd4e57381ebdf62610ee06167807681dc9e0e21603f46e3049175133b" Apr 17 17:09:36.586861 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.586813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" event={"ID":"979422e9-0546-4ccf-919f-ba9134153bfb","Type":"ContainerStarted","Data":"8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff"} Apr 17 17:09:36.586966 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.586872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" event={"ID":"979422e9-0546-4ccf-919f-ba9134153bfb","Type":"ContainerStarted","Data":"b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d"} Apr 17 17:09:36.587154 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.587135 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" Apr 17 17:09:36.587202 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.587168 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" Apr 17 17:09:36.588966 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.588939 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 17 17:09:36.593687 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.593670 2573 scope.go:117] "RemoveContainer" containerID="c147756cd98473a946ae4bd85c37023c4283b5f59ccc8c71d4b183d727f87fef" Apr 17 17:09:36.600769 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.600752 2573 scope.go:117] "RemoveContainer" containerID="a9bdd073441d10b5929458d7ebef68a607c20200e139c2a7649e7c1869b989f3" Apr 17 17:09:36.606764 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.606725 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podStartSLOduration=6.606712158 podStartE2EDuration="6.606712158s" podCreationTimestamp="2026-04-17 17:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:09:36.605876351 +0000 UTC m=+2293.937377256" watchObservedRunningTime="2026-04-17 17:09:36.606712158 +0000 UTC m=+2293.938213061" Apr 17 17:09:36.619394 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.619366 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"] Apr 17 17:09:36.623625 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:36.623599 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-x6dvj"] Apr 17 17:09:37.174521 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:37.174486 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" path="/var/lib/kubelet/pods/87b1a6ba-da8b-4e0e-a68e-cf1832168331/volumes" Apr 17 17:09:37.590418 
ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:37.590382 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 17 17:09:42.596301 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:42.596271 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" Apr 17 17:09:42.596871 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:42.596840 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 17 17:09:52.596931 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:09:52.596888 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 17 17:10:02.596972 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:02.596929 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 17 17:10:12.597233 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:12.597136 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 17 17:10:22.597586 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:22.597541 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 17 17:10:32.597515 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:32.597474 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 17 17:10:42.597712 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:42.597674 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 17 17:10:50.171713 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:50.171684 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" Apr 17 17:10:51.089196 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.089165 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"] Apr 17 17:10:51.168446 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.168412 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj"] Apr 17 17:10:51.168726 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.168715 
2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" Apr 17 17:10:51.168782 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.168728 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" Apr 17 17:10:51.168782 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.168742 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="storage-initializer" Apr 17 17:10:51.168782 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.168748 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="storage-initializer" Apr 17 17:10:51.168782 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.168759 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kube-rbac-proxy" Apr 17 17:10:51.168782 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.168765 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kube-rbac-proxy" Apr 17 17:10:51.168971 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.168816 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kserve-container" Apr 17 17:10:51.168971 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.168837 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="87b1a6ba-da8b-4e0e-a68e-cf1832168331" containerName="kube-rbac-proxy" Apr 17 17:10:51.171934 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.171913 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.174079 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.174058 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 17 17:10:51.174377 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.174361 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 17 17:10:51.181629 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.181605 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj"] Apr 17 17:10:51.293688 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.293649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b86c632-de97-4b52-a959-6436c0781d9c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.293688 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.293691 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b86c632-de97-4b52-a959-6436c0781d9c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.293957 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.293727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b86c632-de97-4b52-a959-6436c0781d9c-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.293957 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.293785 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jmw4\" (UniqueName: \"kubernetes.io/projected/1b86c632-de97-4b52-a959-6436c0781d9c-kube-api-access-9jmw4\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.394504 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.394405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jmw4\" (UniqueName: \"kubernetes.io/projected/1b86c632-de97-4b52-a959-6436c0781d9c-kube-api-access-9jmw4\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.394504 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.394507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b86c632-de97-4b52-a959-6436c0781d9c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.394696 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.394545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/1b86c632-de97-4b52-a959-6436c0781d9c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.394696 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.394602 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b86c632-de97-4b52-a959-6436c0781d9c-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.394696 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:10:51.394669 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-serving-cert: secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 17 17:10:51.394816 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:10:51.394761 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b86c632-de97-4b52-a959-6436c0781d9c-proxy-tls podName:1b86c632-de97-4b52-a959-6436c0781d9c nodeName:}" failed. No retries permitted until 2026-04-17 17:10:51.89473729 +0000 UTC m=+2369.226238176 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1b86c632-de97-4b52-a959-6436c0781d9c-proxy-tls") pod "isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" (UID: "1b86c632-de97-4b52-a959-6436c0781d9c") : secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 17 17:10:51.395037 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.395016 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b86c632-de97-4b52-a959-6436c0781d9c-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.395214 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.395194 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b86c632-de97-4b52-a959-6436c0781d9c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.402893 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.402873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jmw4\" (UniqueName: \"kubernetes.io/projected/1b86c632-de97-4b52-a959-6436c0781d9c-kube-api-access-9jmw4\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.797087 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.797043 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kube-rbac-proxy" containerID="cri-o://8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff" gracePeriod=30 Apr 17 17:10:51.797087 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.797060 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" containerID="cri-o://b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d" gracePeriod=30 Apr 17 17:10:51.898331 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.898294 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b86c632-de97-4b52-a959-6436c0781d9c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:51.900850 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:51.900799 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b86c632-de97-4b52-a959-6436c0781d9c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:52.083335 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:52.083236 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:52.202817 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:52.202793 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj"] Apr 17 17:10:52.205220 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:10:52.205194 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b86c632_de97_4b52_a959_6436c0781d9c.slice/crio-0310a6011ea18ccd354efed977aa9b00717576426dd98bd6296901c882c0b200 WatchSource:0}: Error finding container 0310a6011ea18ccd354efed977aa9b00717576426dd98bd6296901c882c0b200: Status 404 returned error can't find the container with id 0310a6011ea18ccd354efed977aa9b00717576426dd98bd6296901c882c0b200 Apr 17 17:10:52.591211 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:52.591169 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 17 17:10:52.801985 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:52.801947 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" event={"ID":"1b86c632-de97-4b52-a959-6436c0781d9c","Type":"ContainerStarted","Data":"0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2"} Apr 17 17:10:52.801985 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:52.801991 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" 
event={"ID":"1b86c632-de97-4b52-a959-6436c0781d9c","Type":"ContainerStarted","Data":"0310a6011ea18ccd354efed977aa9b00717576426dd98bd6296901c882c0b200"} Apr 17 17:10:52.803671 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:52.803648 2573 generic.go:358] "Generic (PLEG): container finished" podID="979422e9-0546-4ccf-919f-ba9134153bfb" containerID="8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff" exitCode=2 Apr 17 17:10:52.803774 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:52.803689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" event={"ID":"979422e9-0546-4ccf-919f-ba9134153bfb","Type":"ContainerDied","Data":"8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff"} Apr 17 17:10:56.639591 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.639567 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" Apr 17 17:10:56.735216 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.735128 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxct5\" (UniqueName: \"kubernetes.io/projected/979422e9-0546-4ccf-919f-ba9134153bfb-kube-api-access-cxct5\") pod \"979422e9-0546-4ccf-919f-ba9134153bfb\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " Apr 17 17:10:56.735216 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.735171 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/979422e9-0546-4ccf-919f-ba9134153bfb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"979422e9-0546-4ccf-919f-ba9134153bfb\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " Apr 17 17:10:56.735459 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.735235 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/979422e9-0546-4ccf-919f-ba9134153bfb-proxy-tls\") pod \"979422e9-0546-4ccf-919f-ba9134153bfb\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " Apr 17 17:10:56.735459 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.735267 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/979422e9-0546-4ccf-919f-ba9134153bfb-kserve-provision-location\") pod \"979422e9-0546-4ccf-919f-ba9134153bfb\" (UID: \"979422e9-0546-4ccf-919f-ba9134153bfb\") " Apr 17 17:10:56.735646 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.735618 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/979422e9-0546-4ccf-919f-ba9134153bfb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "979422e9-0546-4ccf-919f-ba9134153bfb" (UID: "979422e9-0546-4ccf-919f-ba9134153bfb"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:56.735703 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.735639 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/979422e9-0546-4ccf-919f-ba9134153bfb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "979422e9-0546-4ccf-919f-ba9134153bfb" (UID: "979422e9-0546-4ccf-919f-ba9134153bfb"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:10:56.737415 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.737390 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979422e9-0546-4ccf-919f-ba9134153bfb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "979422e9-0546-4ccf-919f-ba9134153bfb" (UID: "979422e9-0546-4ccf-919f-ba9134153bfb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:10:56.737523 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.737394 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/979422e9-0546-4ccf-919f-ba9134153bfb-kube-api-access-cxct5" (OuterVolumeSpecName: "kube-api-access-cxct5") pod "979422e9-0546-4ccf-919f-ba9134153bfb" (UID: "979422e9-0546-4ccf-919f-ba9134153bfb"). InnerVolumeSpecName "kube-api-access-cxct5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:10:56.816229 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.816196 2573 generic.go:358] "Generic (PLEG): container finished" podID="1b86c632-de97-4b52-a959-6436c0781d9c" containerID="0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2" exitCode=0 Apr 17 17:10:56.816423 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.816267 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" event={"ID":"1b86c632-de97-4b52-a959-6436c0781d9c","Type":"ContainerDied","Data":"0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2"} Apr 17 17:10:56.817891 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.817865 2573 generic.go:358] "Generic (PLEG): container finished" podID="979422e9-0546-4ccf-919f-ba9134153bfb" containerID="b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d" exitCode=0 Apr 17 17:10:56.817986 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.817932 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" event={"ID":"979422e9-0546-4ccf-919f-ba9134153bfb","Type":"ContainerDied","Data":"b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d"} Apr 17 17:10:56.817986 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.817964 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" event={"ID":"979422e9-0546-4ccf-919f-ba9134153bfb","Type":"ContainerDied","Data":"4290dfd401ca7a09087139daa9c579a4e9c11e1fc3e3cfd40bb780cc8548f487"} Apr 17 17:10:56.817986 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.817982 2573 scope.go:117] "RemoveContainer" containerID="8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff" Apr 17 17:10:56.818123 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.817936 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp" Apr 17 17:10:56.826516 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.826500 2573 scope.go:117] "RemoveContainer" containerID="b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d" Apr 17 17:10:56.833384 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.833282 2573 scope.go:117] "RemoveContainer" containerID="ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9" Apr 17 17:10:56.836235 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.836211 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/979422e9-0546-4ccf-919f-ba9134153bfb-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:10:56.836312 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.836243 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/979422e9-0546-4ccf-919f-ba9134153bfb-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:10:56.836312 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.836260 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxct5\" (UniqueName: \"kubernetes.io/projected/979422e9-0546-4ccf-919f-ba9134153bfb-kube-api-access-cxct5\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:10:56.836312 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.836278 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/979422e9-0546-4ccf-919f-ba9134153bfb-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:10:56.840602 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.840581 2573 scope.go:117] "RemoveContainer" containerID="8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff" Apr 17 17:10:56.840887 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:10:56.840868 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff\": container with ID starting with 8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff not found: ID does not exist" containerID="8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff" Apr 17 17:10:56.840950 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.840893 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff"} err="failed to get container status \"8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff\": rpc error: code = NotFound desc = could not find container 
\"8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff\": container with ID starting with 8cc3f962d066e9a4b288eb2e8c66b034a9dcd055dabaff9bf3859d282bf45cff not found: ID does not exist" Apr 17 17:10:56.840950 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.840910 2573 scope.go:117] "RemoveContainer" containerID="b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d" Apr 17 17:10:56.841175 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:10:56.841156 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d\": container with ID starting with b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d not found: ID does not exist" containerID="b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d" Apr 17 17:10:56.841277 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.841182 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d"} err="failed to get container status \"b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d\": rpc error: code = NotFound desc = could not find container \"b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d\": container with ID starting with b0bd3f5c6861883442e199baf90e3038fa51b956bc40988c070d4b3abace257d not found: ID does not exist" Apr 17 17:10:56.841277 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.841197 2573 scope.go:117] "RemoveContainer" containerID="ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9" Apr 17 17:10:56.841436 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:10:56.841420 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9\": container with ID starting with 
ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9 not found: ID does not exist" containerID="ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9" Apr 17 17:10:56.841473 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.841442 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9"} err="failed to get container status \"ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9\": rpc error: code = NotFound desc = could not find container \"ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9\": container with ID starting with ff79df3773e14a4157a73942a96fad624e5349a9f1ec039f569c7227378c8cf9 not found: ID does not exist" Apr 17 17:10:56.845242 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.845223 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"] Apr 17 17:10:56.848446 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:56.848424 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-hnwmp"] Apr 17 17:10:57.175162 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:57.175073 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" path="/var/lib/kubelet/pods/979422e9-0546-4ccf-919f-ba9134153bfb/volumes" Apr 17 17:10:57.823054 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:57.823016 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" event={"ID":"1b86c632-de97-4b52-a959-6436c0781d9c","Type":"ContainerStarted","Data":"dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3"} Apr 17 17:10:57.823516 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:57.823063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" event={"ID":"1b86c632-de97-4b52-a959-6436c0781d9c","Type":"ContainerStarted","Data":"a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8"} Apr 17 17:10:57.823516 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:57.823361 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:10:57.848241 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:57.844901 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" podStartSLOduration=6.844876567 podStartE2EDuration="6.844876567s" podCreationTimestamp="2026-04-17 17:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:10:57.841687825 +0000 UTC m=+2375.173188737" watchObservedRunningTime="2026-04-17 17:10:57.844876567 +0000 UTC m=+2375.176377469" Apr 17 17:10:58.827355 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:10:58.827322 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:11:04.836053 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:11:04.836025 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:11:26.172093 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:11:26.172060 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:11:26.172673 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:11:26.172654 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:11:34.836793 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:11:34.836754 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.26:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 17:11:44.837609 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:11:44.837514 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.26:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 17:11:54.837397 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:11:54.837351 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.26:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 17:12:04.837189 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:04.837137 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.26:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 17:12:14.174346 ip-10-0-128-217 
kubenswrapper[2573]: I0417 17:12:14.174307 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:12:21.287733 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.287691 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj"] Apr 17 17:12:21.288321 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.288045 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kserve-container" containerID="cri-o://a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8" gracePeriod=30 Apr 17 17:12:21.288321 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.288117 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kube-rbac-proxy" containerID="cri-o://dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3" gracePeriod=30 Apr 17 17:12:21.414575 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.414547 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh"] Apr 17 17:12:21.414906 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.414890 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="storage-initializer" Apr 17 17:12:21.414951 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.414910 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="storage-initializer" Apr 17 17:12:21.414951 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.414935 2573 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" Apr 17 17:12:21.414951 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.414940 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" Apr 17 17:12:21.415042 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.414954 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kube-rbac-proxy" Apr 17 17:12:21.415042 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.414960 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kube-rbac-proxy" Apr 17 17:12:21.415042 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.415012 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kserve-container" Apr 17 17:12:21.415042 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.415020 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="979422e9-0546-4ccf-919f-ba9134153bfb" containerName="kube-rbac-proxy" Apr 17 17:12:21.417907 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.417892 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.419949 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.419925 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 17 17:12:21.420065 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.419969 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 17 17:12:21.428015 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.427995 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh"] Apr 17 17:12:21.506437 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.506406 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.506604 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.506458 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.506604 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.506518 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.506604 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.506593 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvq8j\" (UniqueName: \"kubernetes.io/projected/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kube-api-access-gvq8j\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.607041 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.606953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.607041 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.607009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.607259 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.607098 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvq8j\" (UniqueName: \"kubernetes.io/projected/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kube-api-access-gvq8j\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.607259 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.607131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.607391 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.607365 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.607631 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.607602 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.609395 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.609376 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.614635 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.614602 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvq8j\" (UniqueName: \"kubernetes.io/projected/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kube-api-access-gvq8j\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.728489 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.728451 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:21.847904 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.847878 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh"] Apr 17 17:12:21.850502 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:12:21.850472 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42becd7b_47a9_4cc2_b15d_2b735f4acb4a.slice/crio-291dccc8bd9965716afd0d622a3ee0be46cd74ce8d4366c0ae3a865dc2c31b34 WatchSource:0}: Error finding container 291dccc8bd9965716afd0d622a3ee0be46cd74ce8d4366c0ae3a865dc2c31b34: Status 404 returned error can't find the container with id 291dccc8bd9965716afd0d622a3ee0be46cd74ce8d4366c0ae3a865dc2c31b34 Apr 17 17:12:21.852372 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:21.852358 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider 
Apr 17 17:12:22.065154 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:22.065114 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" event={"ID":"42becd7b-47a9-4cc2-b15d-2b735f4acb4a","Type":"ContainerStarted","Data":"38e243c25060c513f5817b0350a9c14d57154bba0a944a0306d8126749c0c142"} Apr 17 17:12:22.065154 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:22.065159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" event={"ID":"42becd7b-47a9-4cc2-b15d-2b735f4acb4a","Type":"ContainerStarted","Data":"291dccc8bd9965716afd0d622a3ee0be46cd74ce8d4366c0ae3a865dc2c31b34"} Apr 17 17:12:22.066884 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:22.066861 2573 generic.go:358] "Generic (PLEG): container finished" podID="1b86c632-de97-4b52-a959-6436c0781d9c" containerID="dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3" exitCode=2 Apr 17 17:12:22.066994 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:22.066894 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" event={"ID":"1b86c632-de97-4b52-a959-6436c0781d9c","Type":"ContainerDied","Data":"dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3"} Apr 17 17:12:24.171301 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:24.171252 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.26:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 17:12:24.830635 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:24.830592 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused" Apr 17 17:12:26.081862 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.081836 2573 generic.go:358] "Generic (PLEG): container finished" podID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerID="38e243c25060c513f5817b0350a9c14d57154bba0a944a0306d8126749c0c142" exitCode=0 Apr 17 17:12:26.082192 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.081897 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" event={"ID":"42becd7b-47a9-4cc2-b15d-2b735f4acb4a","Type":"ContainerDied","Data":"38e243c25060c513f5817b0350a9c14d57154bba0a944a0306d8126749c0c142"} Apr 17 17:12:26.229242 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.229221 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:12:26.339940 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.339848 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b86c632-de97-4b52-a959-6436c0781d9c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"1b86c632-de97-4b52-a959-6436c0781d9c\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " Apr 17 17:12:26.339940 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.339907 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b86c632-de97-4b52-a959-6436c0781d9c-kserve-provision-location\") pod \"1b86c632-de97-4b52-a959-6436c0781d9c\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " Apr 17 17:12:26.339940 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.339928 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jmw4\" (UniqueName: \"kubernetes.io/projected/1b86c632-de97-4b52-a959-6436c0781d9c-kube-api-access-9jmw4\") pod \"1b86c632-de97-4b52-a959-6436c0781d9c\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " Apr 17 17:12:26.340233 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.339949 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b86c632-de97-4b52-a959-6436c0781d9c-proxy-tls\") pod \"1b86c632-de97-4b52-a959-6436c0781d9c\" (UID: \"1b86c632-de97-4b52-a959-6436c0781d9c\") " Apr 17 17:12:26.340321 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.340285 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b86c632-de97-4b52-a959-6436c0781d9c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"1b86c632-de97-4b52-a959-6436c0781d9c" (UID: "1b86c632-de97-4b52-a959-6436c0781d9c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:12:26.340321 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.340303 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b86c632-de97-4b52-a959-6436c0781d9c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "1b86c632-de97-4b52-a959-6436c0781d9c" (UID: "1b86c632-de97-4b52-a959-6436c0781d9c"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:12:26.342311 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.342287 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b86c632-de97-4b52-a959-6436c0781d9c-kube-api-access-9jmw4" (OuterVolumeSpecName: "kube-api-access-9jmw4") pod "1b86c632-de97-4b52-a959-6436c0781d9c" (UID: "1b86c632-de97-4b52-a959-6436c0781d9c"). InnerVolumeSpecName "kube-api-access-9jmw4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:12:26.342311 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.342301 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b86c632-de97-4b52-a959-6436c0781d9c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1b86c632-de97-4b52-a959-6436c0781d9c" (UID: "1b86c632-de97-4b52-a959-6436c0781d9c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:12:26.440814 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.440782 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1b86c632-de97-4b52-a959-6436c0781d9c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:12:26.440814 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.440813 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b86c632-de97-4b52-a959-6436c0781d9c-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:12:26.441072 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.440849 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9jmw4\" (UniqueName: \"kubernetes.io/projected/1b86c632-de97-4b52-a959-6436c0781d9c-kube-api-access-9jmw4\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:12:26.441072 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:26.440859 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b86c632-de97-4b52-a959-6436c0781d9c-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:12:27.086632 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.086600 2573 generic.go:358] "Generic (PLEG): container finished" podID="1b86c632-de97-4b52-a959-6436c0781d9c" containerID="a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8" exitCode=0 Apr 17 17:12:27.087070 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.086687 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" Apr 17 17:12:27.087070 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.086686 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" event={"ID":"1b86c632-de97-4b52-a959-6436c0781d9c","Type":"ContainerDied","Data":"a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8"} Apr 17 17:12:27.087070 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.086729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj" event={"ID":"1b86c632-de97-4b52-a959-6436c0781d9c","Type":"ContainerDied","Data":"0310a6011ea18ccd354efed977aa9b00717576426dd98bd6296901c882c0b200"} Apr 17 17:12:27.087070 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.086752 2573 scope.go:117] "RemoveContainer" containerID="dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3" Apr 17 17:12:27.088665 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.088641 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" event={"ID":"42becd7b-47a9-4cc2-b15d-2b735f4acb4a","Type":"ContainerStarted","Data":"a0734ae0603ed00cb8c94a62f038b4c0b7cf30833f9bb1ce2689ebd55e95f741"} Apr 17 17:12:27.088760 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.088684 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" event={"ID":"42becd7b-47a9-4cc2-b15d-2b735f4acb4a","Type":"ContainerStarted","Data":"d46417d97c0d56ef39a10e33e0e2c1f0c5e85a54ce18ad864f51f43f8e7a9f0e"} Apr 17 17:12:27.099519 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.099498 2573 scope.go:117] "RemoveContainer" containerID="a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8" Apr 17 17:12:27.107141 
ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.107114 2573 scope.go:117] "RemoveContainer" containerID="0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2" Apr 17 17:12:27.115104 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.115076 2573 scope.go:117] "RemoveContainer" containerID="dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3" Apr 17 17:12:27.115381 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:12:27.115361 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3\": container with ID starting with dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3 not found: ID does not exist" containerID="dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3" Apr 17 17:12:27.115449 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.115391 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3"} err="failed to get container status \"dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3\": rpc error: code = NotFound desc = could not find container \"dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3\": container with ID starting with dc29a54f0cec9184256eacab35262316eae24a783e06501ba18a2aade23ef4c3 not found: ID does not exist" Apr 17 17:12:27.115449 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.115411 2573 scope.go:117] "RemoveContainer" containerID="a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8" Apr 17 17:12:27.115581 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.115534 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podStartSLOduration=6.11551751 podStartE2EDuration="6.11551751s" podCreationTimestamp="2026-04-17 
17:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:12:27.113853186 +0000 UTC m=+2464.445354083" watchObservedRunningTime="2026-04-17 17:12:27.11551751 +0000 UTC m=+2464.447018413" Apr 17 17:12:27.115686 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:12:27.115635 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8\": container with ID starting with a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8 not found: ID does not exist" containerID="a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8" Apr 17 17:12:27.115686 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.115661 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8"} err="failed to get container status \"a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8\": rpc error: code = NotFound desc = could not find container \"a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8\": container with ID starting with a68775c6cfbc9350f60f8428659d0a0980c6f3724eb88b5125d331c545892ca8 not found: ID does not exist" Apr 17 17:12:27.115686 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.115679 2573 scope.go:117] "RemoveContainer" containerID="0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2" Apr 17 17:12:27.115989 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:12:27.115966 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2\": container with ID starting with 0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2 not found: ID does not exist" 
containerID="0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2" Apr 17 17:12:27.116063 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.116002 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2"} err="failed to get container status \"0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2\": rpc error: code = NotFound desc = could not find container \"0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2\": container with ID starting with 0a1a5db1c350b22bcd0c6ce596be2855ee40dc52dc529041d40bfd4b714ad0a2 not found: ID does not exist" Apr 17 17:12:27.126013 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.125989 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj"] Apr 17 17:12:27.129967 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.129946 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-mggxj"] Apr 17 17:12:27.174511 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:27.174484 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" path="/var/lib/kubelet/pods/1b86c632-de97-4b52-a959-6436c0781d9c/volumes" Apr 17 17:12:32.089843 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:32.089791 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:32.090256 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:32.090182 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:12:32.094889 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:12:32.094865 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:13:03.106348 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:03.106304 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 17:13:13.106980 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:13.106884 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 17:13:23.106199 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:23.106160 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 17:13:33.106569 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:33.106519 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 17:13:36.171409 ip-10-0-128-217 
kubenswrapper[2573]: I0417 17:13:36.171369 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 17:13:46.175341 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:46.175311 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:13:51.526264 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.526229 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh"] Apr 17 17:13:51.526798 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.526650 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" containerID="cri-o://d46417d97c0d56ef39a10e33e0e2c1f0c5e85a54ce18ad864f51f43f8e7a9f0e" gracePeriod=30 Apr 17 17:13:51.526908 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.526805 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kube-rbac-proxy" containerID="cri-o://a0734ae0603ed00cb8c94a62f038b4c0b7cf30833f9bb1ce2689ebd55e95f741" gracePeriod=30 Apr 17 17:13:51.653904 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.653871 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz"] Apr 17 17:13:51.654180 ip-10-0-128-217 kubenswrapper[2573]: I0417 
17:13:51.654169 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kube-rbac-proxy" Apr 17 17:13:51.654222 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.654182 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kube-rbac-proxy" Apr 17 17:13:51.654222 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.654193 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kserve-container" Apr 17 17:13:51.654222 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.654198 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kserve-container" Apr 17 17:13:51.654311 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.654230 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="storage-initializer" Apr 17 17:13:51.654311 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.654237 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="storage-initializer" Apr 17 17:13:51.654311 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.654282 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kube-rbac-proxy" Apr 17 17:13:51.654311 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.654293 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b86c632-de97-4b52-a959-6436c0781d9c" containerName="kserve-container" Apr 17 17:13:51.657302 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.657285 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.659612 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.659580 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 17 17:13:51.659814 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.659801 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 17 17:13:51.665451 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.665423 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz"] Apr 17 17:13:51.709691 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.709661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2nt5\" (UniqueName: \"kubernetes.io/projected/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kube-api-access-j2nt5\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.709878 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.709700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.709878 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.709731 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.709878 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.709778 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.811030 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.810933 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.811030 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.810994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.811263 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.811074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j2nt5\" (UniqueName: \"kubernetes.io/projected/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kube-api-access-j2nt5\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.811263 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:13:51.811095 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-serving-cert: secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 17 17:13:51.811263 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.811105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.811263 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:13:51.811178 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-proxy-tls podName:c2a24d01-2de5-46b2-bda3-0d7e8247c1da nodeName:}" failed. No retries permitted until 2026-04-17 17:13:52.311154981 +0000 UTC m=+2549.642655862 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-proxy-tls") pod "isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" (UID: "c2a24d01-2de5-46b2-bda3-0d7e8247c1da") : secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 17 17:13:51.811492 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.811392 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.811738 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.811716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:51.819639 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:51.819612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2nt5\" (UniqueName: \"kubernetes.io/projected/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kube-api-access-j2nt5\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:52.095569 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:52.095481 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.27:8643/healthz\": dial tcp 10.134.0.27:8643: connect: connection refused" Apr 17 17:13:52.315648 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:52.315614 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:52.318023 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:52.317999 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:52.326964 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:52.326934 2573 generic.go:358] "Generic (PLEG): container finished" podID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerID="a0734ae0603ed00cb8c94a62f038b4c0b7cf30833f9bb1ce2689ebd55e95f741" exitCode=2 Apr 17 17:13:52.327096 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:52.326999 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" event={"ID":"42becd7b-47a9-4cc2-b15d-2b735f4acb4a","Type":"ContainerDied","Data":"a0734ae0603ed00cb8c94a62f038b4c0b7cf30833f9bb1ce2689ebd55e95f741"} Apr 17 17:13:52.567810 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:52.567761 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:52.692900 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:52.692731 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz"] Apr 17 17:13:52.695734 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:13:52.695701 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a24d01_2de5_46b2_bda3_0d7e8247c1da.slice/crio-2681e5965caff6e7fde88594c0831f6667aa4a8a20bc0fa4259ab832fa2d0e92 WatchSource:0}: Error finding container 2681e5965caff6e7fde88594c0831f6667aa4a8a20bc0fa4259ab832fa2d0e92: Status 404 returned error can't find the container with id 2681e5965caff6e7fde88594c0831f6667aa4a8a20bc0fa4259ab832fa2d0e92 Apr 17 17:13:53.331138 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:53.331104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" event={"ID":"c2a24d01-2de5-46b2-bda3-0d7e8247c1da","Type":"ContainerStarted","Data":"ee33c7b3f2a5c377108983bf9ba910e65d990f11eb08d71c037723f69916efbe"} Apr 17 17:13:53.331138 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:53.331142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" event={"ID":"c2a24d01-2de5-46b2-bda3-0d7e8247c1da","Type":"ContainerStarted","Data":"2681e5965caff6e7fde88594c0831f6667aa4a8a20bc0fa4259ab832fa2d0e92"} Apr 17 17:13:56.171450 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.171406 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.134.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 17:13:56.341707 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.341677 2573 generic.go:358] "Generic (PLEG): container finished" podID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerID="d46417d97c0d56ef39a10e33e0e2c1f0c5e85a54ce18ad864f51f43f8e7a9f0e" exitCode=0 Apr 17 17:13:56.341893 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.341717 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" event={"ID":"42becd7b-47a9-4cc2-b15d-2b735f4acb4a","Type":"ContainerDied","Data":"d46417d97c0d56ef39a10e33e0e2c1f0c5e85a54ce18ad864f51f43f8e7a9f0e"} Apr 17 17:13:56.381238 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.381210 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:13:56.450611 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.450562 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-proxy-tls\") pod \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " Apr 17 17:13:56.450611 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.450611 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " Apr 17 17:13:56.450887 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.450643 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gvq8j\" (UniqueName: \"kubernetes.io/projected/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kube-api-access-gvq8j\") pod \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " Apr 17 17:13:56.450887 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.450817 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kserve-provision-location\") pod \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\" (UID: \"42becd7b-47a9-4cc2-b15d-2b735f4acb4a\") " Apr 17 17:13:56.451080 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.451053 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "42becd7b-47a9-4cc2-b15d-2b735f4acb4a" (UID: "42becd7b-47a9-4cc2-b15d-2b735f4acb4a"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:13:56.451144 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.451083 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42becd7b-47a9-4cc2-b15d-2b735f4acb4a" (UID: "42becd7b-47a9-4cc2-b15d-2b735f4acb4a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:13:56.451144 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.451122 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:13:56.453336 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.453312 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kube-api-access-gvq8j" (OuterVolumeSpecName: "kube-api-access-gvq8j") pod "42becd7b-47a9-4cc2-b15d-2b735f4acb4a" (UID: "42becd7b-47a9-4cc2-b15d-2b735f4acb4a"). InnerVolumeSpecName "kube-api-access-gvq8j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:13:56.453336 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.453326 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "42becd7b-47a9-4cc2-b15d-2b735f4acb4a" (UID: "42becd7b-47a9-4cc2-b15d-2b735f4acb4a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:13:56.552107 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.552067 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:13:56.552107 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.552103 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gvq8j\" (UniqueName: \"kubernetes.io/projected/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kube-api-access-gvq8j\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:13:56.552107 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:56.552113 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42becd7b-47a9-4cc2-b15d-2b735f4acb4a-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:13:57.346417 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:57.346383 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" Apr 17 17:13:57.346936 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:57.346383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh" event={"ID":"42becd7b-47a9-4cc2-b15d-2b735f4acb4a","Type":"ContainerDied","Data":"291dccc8bd9965716afd0d622a3ee0be46cd74ce8d4366c0ae3a865dc2c31b34"} Apr 17 17:13:57.346936 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:57.346522 2573 scope.go:117] "RemoveContainer" containerID="a0734ae0603ed00cb8c94a62f038b4c0b7cf30833f9bb1ce2689ebd55e95f741" Apr 17 17:13:57.347797 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:57.347769 2573 generic.go:358] "Generic (PLEG): container finished" podID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerID="ee33c7b3f2a5c377108983bf9ba910e65d990f11eb08d71c037723f69916efbe" exitCode=0 Apr 17 17:13:57.347886 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:57.347817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" event={"ID":"c2a24d01-2de5-46b2-bda3-0d7e8247c1da","Type":"ContainerDied","Data":"ee33c7b3f2a5c377108983bf9ba910e65d990f11eb08d71c037723f69916efbe"} Apr 17 17:13:57.354493 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:57.354475 2573 scope.go:117] "RemoveContainer" containerID="d46417d97c0d56ef39a10e33e0e2c1f0c5e85a54ce18ad864f51f43f8e7a9f0e" Apr 17 17:13:57.364000 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:57.363924 2573 scope.go:117] "RemoveContainer" containerID="38e243c25060c513f5817b0350a9c14d57154bba0a944a0306d8126749c0c142" Apr 17 17:13:57.365814 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:57.365790 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh"] Apr 17 17:13:57.371162 ip-10-0-128-217 kubenswrapper[2573]: I0417 
17:13:57.371139 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-5zxhh"] Apr 17 17:13:58.353106 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:58.353073 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" event={"ID":"c2a24d01-2de5-46b2-bda3-0d7e8247c1da","Type":"ContainerStarted","Data":"962ffe4104b36e26976772538336b4bc55beb6ccc81513ab907fd08a905ca047"} Apr 17 17:13:58.353641 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:58.353119 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" event={"ID":"c2a24d01-2de5-46b2-bda3-0d7e8247c1da","Type":"ContainerStarted","Data":"9e083ad50a7347f39a3271b38aad194afb59bceb3d41769c80d4974a43b4c25c"} Apr 17 17:13:58.353641 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:58.353344 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:13:58.375568 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:58.375525 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" podStartSLOduration=7.375511867 podStartE2EDuration="7.375511867s" podCreationTimestamp="2026-04-17 17:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:13:58.374184002 +0000 UTC m=+2555.705684906" watchObservedRunningTime="2026-04-17 17:13:58.375511867 +0000 UTC m=+2555.707012768" Apr 17 17:13:59.174611 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:59.174576 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" 
path="/var/lib/kubelet/pods/42becd7b-47a9-4cc2-b15d-2b735f4acb4a/volumes" Apr 17 17:13:59.357052 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:13:59.357011 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:14:05.365086 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:14:05.365055 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:14:35.365677 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:14:35.365639 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 17:14:45.366162 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:14:45.366078 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 17:14:55.366134 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:14:55.366092 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 17:15:05.366579 ip-10-0-128-217 kubenswrapper[2573]: I0417 
17:15:05.366537 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 17:15:10.171652 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:10.171608 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 17:15:20.174806 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:20.174773 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:15:21.734412 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:21.734382 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz"] Apr 17 17:15:21.734785 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:21.734713 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kserve-container" containerID="cri-o://9e083ad50a7347f39a3271b38aad194afb59bceb3d41769c80d4974a43b4c25c" gracePeriod=30 Apr 17 17:15:21.734862 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:21.734765 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" 
podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kube-rbac-proxy" containerID="cri-o://962ffe4104b36e26976772538336b4bc55beb6ccc81513ab907fd08a905ca047" gracePeriod=30 Apr 17 17:15:22.587214 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:22.587178 2573 generic.go:358] "Generic (PLEG): container finished" podID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerID="962ffe4104b36e26976772538336b4bc55beb6ccc81513ab907fd08a905ca047" exitCode=2 Apr 17 17:15:22.587392 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:22.587252 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" event={"ID":"c2a24d01-2de5-46b2-bda3-0d7e8247c1da","Type":"ContainerDied","Data":"962ffe4104b36e26976772538336b4bc55beb6ccc81513ab907fd08a905ca047"} Apr 17 17:15:25.360215 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:25.360172 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 17 17:15:26.598689 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:26.598608 2573 generic.go:358] "Generic (PLEG): container finished" podID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerID="9e083ad50a7347f39a3271b38aad194afb59bceb3d41769c80d4974a43b4c25c" exitCode=0 Apr 17 17:15:26.598689 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:26.598647 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" event={"ID":"c2a24d01-2de5-46b2-bda3-0d7e8247c1da","Type":"ContainerDied","Data":"9e083ad50a7347f39a3271b38aad194afb59bceb3d41769c80d4974a43b4c25c"} Apr 17 17:15:26.970955 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:26.970929 2573 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:15:27.066811 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.066779 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2nt5\" (UniqueName: \"kubernetes.io/projected/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kube-api-access-j2nt5\") pod \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " Apr 17 17:15:27.066987 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.066851 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " Apr 17 17:15:27.066987 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.066881 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-proxy-tls\") pod \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " Apr 17 17:15:27.066987 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.066926 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kserve-provision-location\") pod \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\" (UID: \"c2a24d01-2de5-46b2-bda3-0d7e8247c1da\") " Apr 17 17:15:27.067309 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.067277 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "c2a24d01-2de5-46b2-bda3-0d7e8247c1da" (UID: "c2a24d01-2de5-46b2-bda3-0d7e8247c1da"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:15:27.067425 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.067300 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2a24d01-2de5-46b2-bda3-0d7e8247c1da" (UID: "c2a24d01-2de5-46b2-bda3-0d7e8247c1da"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:15:27.068865 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.068845 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c2a24d01-2de5-46b2-bda3-0d7e8247c1da" (UID: "c2a24d01-2de5-46b2-bda3-0d7e8247c1da"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:15:27.068966 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.068871 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kube-api-access-j2nt5" (OuterVolumeSpecName: "kube-api-access-j2nt5") pod "c2a24d01-2de5-46b2-bda3-0d7e8247c1da" (UID: "c2a24d01-2de5-46b2-bda3-0d7e8247c1da"). InnerVolumeSpecName "kube-api-access-j2nt5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:15:27.168476 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.168379 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:15:27.168476 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.168424 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j2nt5\" (UniqueName: \"kubernetes.io/projected/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-kube-api-access-j2nt5\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:15:27.168476 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.168436 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:15:27.168476 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.168447 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a24d01-2de5-46b2-bda3-0d7e8247c1da-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:15:27.603983 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.603942 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" event={"ID":"c2a24d01-2de5-46b2-bda3-0d7e8247c1da","Type":"ContainerDied","Data":"2681e5965caff6e7fde88594c0831f6667aa4a8a20bc0fa4259ab832fa2d0e92"} Apr 17 17:15:27.603983 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.603985 2573 scope.go:117] "RemoveContainer" containerID="962ffe4104b36e26976772538336b4bc55beb6ccc81513ab907fd08a905ca047" Apr 17 17:15:27.604513 
ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.603988 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz" Apr 17 17:15:27.611769 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.611751 2573 scope.go:117] "RemoveContainer" containerID="9e083ad50a7347f39a3271b38aad194afb59bceb3d41769c80d4974a43b4c25c" Apr 17 17:15:27.618589 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.618573 2573 scope.go:117] "RemoveContainer" containerID="ee33c7b3f2a5c377108983bf9ba910e65d990f11eb08d71c037723f69916efbe" Apr 17 17:15:27.621610 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.621590 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz"] Apr 17 17:15:27.626027 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:27.626009 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-f2xmz"] Apr 17 17:15:29.175027 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:15:29.174995 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" path="/var/lib/kubelet/pods/c2a24d01-2de5-46b2-bda3-0d7e8247c1da/volumes" Apr 17 17:16:26.191134 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:16:26.191100 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:16:26.191769 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:16:26.191748 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:21:26.212271 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:21:26.212240 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:21:26.213681 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:21:26.213660 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log" Apr 17 17:22:02.226270 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226234 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"] Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226513 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226524 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226538 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="storage-initializer" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226544 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="storage-initializer" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226556 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kube-rbac-proxy" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226561 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kube-rbac-proxy" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 
17:22:02.226567 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kserve-container" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226572 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kserve-container" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226581 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kube-rbac-proxy" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226585 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kube-rbac-proxy" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226596 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="storage-initializer" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226601 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="storage-initializer" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226642 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kube-rbac-proxy" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226651 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kserve-container" Apr 17 17:22:02.226740 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.226656 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="42becd7b-47a9-4cc2-b15d-2b735f4acb4a" containerName="kserve-container" Apr 17 17:22:02.226740 ip-10-0-128-217 
kubenswrapper[2573]: I0417 17:22:02.226662 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2a24d01-2de5-46b2-bda3-0d7e8247c1da" containerName="kube-rbac-proxy" Apr 17 17:22:02.229626 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.229609 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" Apr 17 17:22:02.236845 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:02.234556 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"isvc-tensorflow-predictor-serving-cert\" is forbidden: User \"system:node:ip-10-0-128-217.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-128-217.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" type="*v1.Secret" Apr 17 17:22:02.236845 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:02.234880 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-128-217.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-128-217.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 17 17:22:02.236845 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.234948 2573 status_manager.go:895] "Failed to get status for pod" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" err="pods \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" is forbidden: User \"system:node:ip-10-0-128-217.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no 
relationship found between node 'ip-10-0-128-217.ec2.internal' and this object" Apr 17 17:22:02.236845 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:02.235068 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"default-dockercfg-wzvpv\" is forbidden: User \"system:node:ip-10-0-128-217.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-128-217.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzvpv\"" type="*v1.Secret" Apr 17 17:22:02.236845 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:02.235069 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-128-217.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-128-217.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 17 17:22:02.236845 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:02.235129 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"isvc-tensorflow-kube-rbac-proxy-sar-config\" is forbidden: User \"system:node:ip-10-0-128-217.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-128-217.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" type="*v1.ConfigMap" Apr 17 17:22:02.255099 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.255050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" Apr 17 17:22:02.255099 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.255099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjncx\" (UniqueName: \"kubernetes.io/projected/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kube-api-access-hjncx\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" Apr 17 17:22:02.255281 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.255179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" Apr 17 17:22:02.255281 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.255220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" Apr 17 17:22:02.261280 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.261259 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"] Apr 17 17:22:02.356448 ip-10-0-128-217 kubenswrapper[2573]: I0417 
17:22:02.356406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:02.356659 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.356459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:02.356659 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.356486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjncx\" (UniqueName: \"kubernetes.io/projected/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kube-api-access-hjncx\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:02.356659 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.356532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:02.356921 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:02.356903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:03.098961 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.098927 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 17:22:03.177668 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.177636 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzvpv\""
Apr 17 17:22:03.333657 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.333622 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 17:22:03.337985 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.337966 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjncx\" (UniqueName: \"kubernetes.io/projected/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kube-api-access-hjncx\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:03.356996 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:03.356931 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-predictor-serving-cert: failed to sync secret cache: timed out waiting for the condition
Apr 17 17:22:03.357094 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:03.357028 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-proxy-tls podName:b0d93895-4be9-47a2-b3b5-2b7e0ab92931 nodeName:}" failed. No retries permitted until 2026-04-17 17:22:03.857008668 +0000 UTC m=+3041.188509552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-proxy-tls") pod "isvc-tensorflow-predictor-6756f669d7-tbfcg" (UID: "b0d93895-4be9-47a2-b3b5-2b7e0ab92931") : failed to sync secret cache: timed out waiting for the condition
Apr 17 17:22:03.357094 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:03.356934 2573 configmap.go:193] Couldn't get configMap kserve-ci-e2e-test/isvc-tensorflow-kube-rbac-proxy-sar-config: failed to sync configmap cache: timed out waiting for the condition
Apr 17 17:22:03.357094 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:03.357074 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-isvc-tensorflow-kube-rbac-proxy-sar-config podName:b0d93895-4be9-47a2-b3b5-2b7e0ab92931 nodeName:}" failed. No retries permitted until 2026-04-17 17:22:03.857064783 +0000 UTC m=+3041.188565667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "isvc-tensorflow-kube-rbac-proxy-sar-config" (UniqueName: "kubernetes.io/configmap/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-isvc-tensorflow-kube-rbac-proxy-sar-config") pod "isvc-tensorflow-predictor-6756f669d7-tbfcg" (UID: "b0d93895-4be9-47a2-b3b5-2b7e0ab92931") : failed to sync configmap cache: timed out waiting for the condition
Apr 17 17:22:03.406239 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.406211 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\""
Apr 17 17:22:03.582379 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.582349 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\""
Apr 17 17:22:03.868397 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.868365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:03.868397 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.868410 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:03.869068 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.869045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:03.870896 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:03.870870 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-tbfcg\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:04.042409 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:04.042368 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:04.160419 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:04.160342 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"]
Apr 17 17:22:04.163462 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:22:04.163430 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d93895_4be9_47a2_b3b5_2b7e0ab92931.slice/crio-303aaceef98903ca50fb3c7fc89a3546524a26adff70a3002c6d54fd456fdf52 WatchSource:0}: Error finding container 303aaceef98903ca50fb3c7fc89a3546524a26adff70a3002c6d54fd456fdf52: Status 404 returned error can't find the container with id 303aaceef98903ca50fb3c7fc89a3546524a26adff70a3002c6d54fd456fdf52
Apr 17 17:22:04.165386 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:04.165368 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:22:04.663752 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:04.663711 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" event={"ID":"b0d93895-4be9-47a2-b3b5-2b7e0ab92931","Type":"ContainerStarted","Data":"5f78987c536b9bd355532e51f879e6c9b099925e3b7cc8d359fe1d7c81249c55"}
Apr 17 17:22:04.663752 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:04.663753 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" event={"ID":"b0d93895-4be9-47a2-b3b5-2b7e0ab92931","Type":"ContainerStarted","Data":"303aaceef98903ca50fb3c7fc89a3546524a26adff70a3002c6d54fd456fdf52"}
Apr 17 17:22:08.675146 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:08.675116 2573 generic.go:358] "Generic (PLEG): container finished" podID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerID="5f78987c536b9bd355532e51f879e6c9b099925e3b7cc8d359fe1d7c81249c55" exitCode=0
Apr 17 17:22:08.675536 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:08.675157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" event={"ID":"b0d93895-4be9-47a2-b3b5-2b7e0ab92931","Type":"ContainerDied","Data":"5f78987c536b9bd355532e51f879e6c9b099925e3b7cc8d359fe1d7c81249c55"}
Apr 17 17:22:12.688909 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:12.688871 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" event={"ID":"b0d93895-4be9-47a2-b3b5-2b7e0ab92931","Type":"ContainerStarted","Data":"d73f33665e90e5a997adc5aee5c7a85c0c2ab5cc25c6683c9051e0b7ff07e679"}
Apr 17 17:22:12.689390 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:12.688919 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" event={"ID":"b0d93895-4be9-47a2-b3b5-2b7e0ab92931","Type":"ContainerStarted","Data":"8ff323c525a43a218d358a66d4ddd04055511666ebe800bfd25b4535d5d438db"}
Apr 17 17:22:12.689390 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:12.689147 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:12.708119 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:12.708070 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podStartSLOduration=6.957906046 podStartE2EDuration="10.70805719s" podCreationTimestamp="2026-04-17 17:22:02 +0000 UTC" firstStartedPulling="2026-04-17 17:22:08.676294146 +0000 UTC m=+3046.007795025" lastFinishedPulling="2026-04-17 17:22:12.426445285 +0000 UTC m=+3049.757946169" observedRunningTime="2026-04-17 17:22:12.706802881 +0000 UTC m=+3050.038303783" watchObservedRunningTime="2026-04-17 17:22:12.70805719 +0000 UTC m=+3050.039558091"
Apr 17 17:22:13.691539 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:13.691506 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:13.692637 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:13.692611 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 17 17:22:14.694367 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:14.694324 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 17 17:22:19.699206 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:19.699172 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:19.699868 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:19.699812 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 17 17:22:29.700413 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:29.700386 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:43.490164 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.490128 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"]
Apr 17 17:22:43.490803 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.490449 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kserve-container" containerID="cri-o://8ff323c525a43a218d358a66d4ddd04055511666ebe800bfd25b4535d5d438db" gracePeriod=30
Apr 17 17:22:43.490803 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.490522 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" containerID="cri-o://d73f33665e90e5a997adc5aee5c7a85c0c2ab5cc25c6683c9051e0b7ff07e679" gracePeriod=30
Apr 17 17:22:43.603700 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.603669 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"]
Apr 17 17:22:43.608796 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.608758 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.612481 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.612454 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\""
Apr 17 17:22:43.612596 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.612490 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\""
Apr 17 17:22:43.618568 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.618543 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"]
Apr 17 17:22:43.695816 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.695784 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r575\" (UniqueName: \"kubernetes.io/projected/94f21dee-2d5e-424f-b213-c017e1ff58c2-kube-api-access-5r575\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.695816 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.695815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94f21dee-2d5e-424f-b213-c017e1ff58c2-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.696038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.695885 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94f21dee-2d5e-424f-b213-c017e1ff58c2-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.696038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.695931 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94f21dee-2d5e-424f-b213-c017e1ff58c2-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.779629 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.779546 2573 generic.go:358] "Generic (PLEG): container finished" podID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerID="d73f33665e90e5a997adc5aee5c7a85c0c2ab5cc25c6683c9051e0b7ff07e679" exitCode=2
Apr 17 17:22:43.779629 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.779587 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" event={"ID":"b0d93895-4be9-47a2-b3b5-2b7e0ab92931","Type":"ContainerDied","Data":"d73f33665e90e5a997adc5aee5c7a85c0c2ab5cc25c6683c9051e0b7ff07e679"}
Apr 17 17:22:43.796963 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.796933 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94f21dee-2d5e-424f-b213-c017e1ff58c2-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.797110 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.797013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r575\" (UniqueName: \"kubernetes.io/projected/94f21dee-2d5e-424f-b213-c017e1ff58c2-kube-api-access-5r575\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.797110 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.797033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94f21dee-2d5e-424f-b213-c017e1ff58c2-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.797110 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.797053 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94f21dee-2d5e-424f-b213-c017e1ff58c2-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.797299 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:43.797173 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-serving-cert: secret "isvc-tensorflow-runtime-predictor-serving-cert" not found
Apr 17 17:22:43.797299 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:22:43.797244 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94f21dee-2d5e-424f-b213-c017e1ff58c2-proxy-tls podName:94f21dee-2d5e-424f-b213-c017e1ff58c2 nodeName:}" failed. No retries permitted until 2026-04-17 17:22:44.297223295 +0000 UTC m=+3081.628724178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/94f21dee-2d5e-424f-b213-c017e1ff58c2-proxy-tls") pod "isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" (UID: "94f21dee-2d5e-424f-b213-c017e1ff58c2") : secret "isvc-tensorflow-runtime-predictor-serving-cert" not found
Apr 17 17:22:43.797474 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.797456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94f21dee-2d5e-424f-b213-c017e1ff58c2-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.797597 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.797579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94f21dee-2d5e-424f-b213-c017e1ff58c2-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:43.806152 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:43.806128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r575\" (UniqueName: \"kubernetes.io/projected/94f21dee-2d5e-424f-b213-c017e1ff58c2-kube-api-access-5r575\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:44.302147 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:44.302106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94f21dee-2d5e-424f-b213-c017e1ff58c2-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:44.304613 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:44.304590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94f21dee-2d5e-424f-b213-c017e1ff58c2-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:44.520990 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:44.520944 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:44.637384 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:44.637351 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"]
Apr 17 17:22:44.640697 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:22:44.640666 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f21dee_2d5e_424f_b213_c017e1ff58c2.slice/crio-44c62017438b9c59af7f36dce0839c99e7d586398198b853983156e12ca11a19 WatchSource:0}: Error finding container 44c62017438b9c59af7f36dce0839c99e7d586398198b853983156e12ca11a19: Status 404 returned error can't find the container with id 44c62017438b9c59af7f36dce0839c99e7d586398198b853983156e12ca11a19
Apr 17 17:22:44.695233 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:44.695150 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused"
Apr 17 17:22:44.783870 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:44.783808 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" event={"ID":"94f21dee-2d5e-424f-b213-c017e1ff58c2","Type":"ContainerStarted","Data":"953ab90a483f7cc061170f5a10817405bb2e4d000fc606802c99ae60d69f5b4f"}
Apr 17 17:22:44.783870 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:44.783875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" event={"ID":"94f21dee-2d5e-424f-b213-c017e1ff58c2","Type":"ContainerStarted","Data":"44c62017438b9c59af7f36dce0839c99e7d586398198b853983156e12ca11a19"}
Apr 17 17:22:49.694660 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:49.694623 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused"
Apr 17 17:22:49.797843 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:49.797796 2573 generic.go:358] "Generic (PLEG): container finished" podID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerID="953ab90a483f7cc061170f5a10817405bb2e4d000fc606802c99ae60d69f5b4f" exitCode=0
Apr 17 17:22:49.798029 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:49.797854 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" event={"ID":"94f21dee-2d5e-424f-b213-c017e1ff58c2","Type":"ContainerDied","Data":"953ab90a483f7cc061170f5a10817405bb2e4d000fc606802c99ae60d69f5b4f"}
Apr 17 17:22:50.802520 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:50.802483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" event={"ID":"94f21dee-2d5e-424f-b213-c017e1ff58c2","Type":"ContainerStarted","Data":"3880230f97f0512a35e455e9c85a12bab364e4c2da11654a17a3770af74c0d14"}
Apr 17 17:22:50.802520 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:50.802525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" event={"ID":"94f21dee-2d5e-424f-b213-c017e1ff58c2","Type":"ContainerStarted","Data":"6c10c47b4f76a9c45c6e5128f409566f74c01135c9edfbc81af8af0db9f64f8f"}
Apr 17 17:22:50.802970 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:50.802737 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:50.827132 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:50.827078 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podStartSLOduration=7.827063065 podStartE2EDuration="7.827063065s" podCreationTimestamp="2026-04-17 17:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:22:50.824920142 +0000 UTC m=+3088.156421046" watchObservedRunningTime="2026-04-17 17:22:50.827063065 +0000 UTC m=+3088.158563968"
Apr 17 17:22:51.805815 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:51.805779 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:51.807056 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:51.807030 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 17 17:22:52.809123 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:52.809076 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 17 17:22:54.695492 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:54.695454 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused"
Apr 17 17:22:54.695908 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:54.695562 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:22:57.813156 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:57.813127 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:22:57.813642 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:57.813616 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 17 17:22:59.694866 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:22:59.694798 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused"
Apr 17 17:23:04.695108 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:04.695061 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused"
Apr 17 17:23:07.814612 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:07.814584 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"
Apr 17 17:23:09.694897 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:09.694854 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused"
Apr 17 17:23:13.871744 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:13.871659 2573 generic.go:358] "Generic (PLEG): container finished" podID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerID="8ff323c525a43a218d358a66d4ddd04055511666ebe800bfd25b4535d5d438db" exitCode=137
Apr 17 17:23:13.871744 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:13.871728 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" event={"ID":"b0d93895-4be9-47a2-b3b5-2b7e0ab92931","Type":"ContainerDied","Data":"8ff323c525a43a218d358a66d4ddd04055511666ebe800bfd25b4535d5d438db"}
Apr 17 17:23:14.131047 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.130981 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:23:14.232865 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.232798 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") "
Apr 17 17:23:14.233040 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.232890 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-proxy-tls\") pod \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") "
Apr 17 17:23:14.233040 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.232934 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjncx\" (UniqueName: \"kubernetes.io/projected/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kube-api-access-hjncx\") pod \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") "
Apr 17 17:23:14.233040 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.232971 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kserve-provision-location\") pod \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\" (UID: \"b0d93895-4be9-47a2-b3b5-2b7e0ab92931\") "
Apr 17 17:23:14.233339 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.233243 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "b0d93895-4be9-47a2-b3b5-2b7e0ab92931" (UID: "b0d93895-4be9-47a2-b3b5-2b7e0ab92931"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:23:14.235110 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.235083 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kube-api-access-hjncx" (OuterVolumeSpecName: "kube-api-access-hjncx") pod "b0d93895-4be9-47a2-b3b5-2b7e0ab92931" (UID: "b0d93895-4be9-47a2-b3b5-2b7e0ab92931"). InnerVolumeSpecName "kube-api-access-hjncx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:23:14.235190 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.235113 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b0d93895-4be9-47a2-b3b5-2b7e0ab92931" (UID: "b0d93895-4be9-47a2-b3b5-2b7e0ab92931"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:23:14.246679 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.246642 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0d93895-4be9-47a2-b3b5-2b7e0ab92931" (UID: "b0d93895-4be9-47a2-b3b5-2b7e0ab92931"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:23:14.333663 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.333629 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:23:14.333663 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.333661 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:23:14.333858 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.333673 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjncx\" (UniqueName: \"kubernetes.io/projected/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kube-api-access-hjncx\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:23:14.333858 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.333682 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d93895-4be9-47a2-b3b5-2b7e0ab92931-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:23:14.876290 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.876256 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg" event={"ID":"b0d93895-4be9-47a2-b3b5-2b7e0ab92931","Type":"ContainerDied","Data":"303aaceef98903ca50fb3c7fc89a3546524a26adff70a3002c6d54fd456fdf52"}
Apr 17 17:23:14.876805 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.876304 2573 scope.go:117] "RemoveContainer" containerID="d73f33665e90e5a997adc5aee5c7a85c0c2ab5cc25c6683c9051e0b7ff07e679"
Apr 17 17:23:14.876805 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.876326 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"
Apr 17 17:23:14.884351 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.884334 2573 scope.go:117] "RemoveContainer" containerID="8ff323c525a43a218d358a66d4ddd04055511666ebe800bfd25b4535d5d438db"
Apr 17 17:23:14.891160 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.891141 2573 scope.go:117] "RemoveContainer" containerID="5f78987c536b9bd355532e51f879e6c9b099925e3b7cc8d359fe1d7c81249c55"
Apr 17 17:23:14.897057 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.897030 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"]
Apr 17 17:23:14.900492 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:14.900471 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-tbfcg"]
Apr 17 17:23:15.174591 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:15.174514 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" path="/var/lib/kubelet/pods/b0d93895-4be9-47a2-b3b5-2b7e0ab92931/volumes"
Apr 17 17:23:24.676899 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:24.676863 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"]
Apr 17 17:23:24.677384 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:24.677302 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kserve-container" containerID="cri-o://6c10c47b4f76a9c45c6e5128f409566f74c01135c9edfbc81af8af0db9f64f8f" gracePeriod=30
Apr 17 17:23:24.677457 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:24.677345 2573 kuberuntime_container.go:864]
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" containerID="cri-o://3880230f97f0512a35e455e9c85a12bab364e4c2da11654a17a3770af74c0d14" gracePeriod=30 Apr 17 17:23:24.906801 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:24.906765 2573 generic.go:358] "Generic (PLEG): container finished" podID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerID="3880230f97f0512a35e455e9c85a12bab364e4c2da11654a17a3770af74c0d14" exitCode=2 Apr 17 17:23:24.906981 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:24.906849 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" event={"ID":"94f21dee-2d5e-424f-b213-c017e1ff58c2","Type":"ContainerDied","Data":"3880230f97f0512a35e455e9c85a12bab364e4c2da11654a17a3770af74c0d14"} Apr 17 17:23:27.809947 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:27.809905 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 17 17:23:32.809987 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:32.809938 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 17 17:23:37.809967 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:37.809870 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" 
podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 17 17:23:37.810408 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:37.810008 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" Apr 17 17:23:42.809883 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:42.809809 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 17 17:23:47.809625 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:47.809575 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 17 17:23:52.809477 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:52.809437 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 17 17:23:54.998303 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:54.998269 2573 generic.go:358] "Generic (PLEG): container finished" podID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerID="6c10c47b4f76a9c45c6e5128f409566f74c01135c9edfbc81af8af0db9f64f8f" exitCode=137 Apr 17 
17:23:54.998681 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:54.998338 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" event={"ID":"94f21dee-2d5e-424f-b213-c017e1ff58c2","Type":"ContainerDied","Data":"6c10c47b4f76a9c45c6e5128f409566f74c01135c9edfbc81af8af0db9f64f8f"} Apr 17 17:23:55.313676 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.313651 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" Apr 17 17:23:55.447624 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.447591 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r575\" (UniqueName: \"kubernetes.io/projected/94f21dee-2d5e-424f-b213-c017e1ff58c2-kube-api-access-5r575\") pod \"94f21dee-2d5e-424f-b213-c017e1ff58c2\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " Apr 17 17:23:55.447796 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.447649 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94f21dee-2d5e-424f-b213-c017e1ff58c2-proxy-tls\") pod \"94f21dee-2d5e-424f-b213-c017e1ff58c2\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " Apr 17 17:23:55.447796 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.447705 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94f21dee-2d5e-424f-b213-c017e1ff58c2-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"94f21dee-2d5e-424f-b213-c017e1ff58c2\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " Apr 17 17:23:55.447796 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.447742 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/94f21dee-2d5e-424f-b213-c017e1ff58c2-kserve-provision-location\") pod \"94f21dee-2d5e-424f-b213-c017e1ff58c2\" (UID: \"94f21dee-2d5e-424f-b213-c017e1ff58c2\") " Apr 17 17:23:55.448091 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.448065 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f21dee-2d5e-424f-b213-c017e1ff58c2-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "94f21dee-2d5e-424f-b213-c017e1ff58c2" (UID: "94f21dee-2d5e-424f-b213-c017e1ff58c2"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:23:55.449878 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.449850 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f21dee-2d5e-424f-b213-c017e1ff58c2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "94f21dee-2d5e-424f-b213-c017e1ff58c2" (UID: "94f21dee-2d5e-424f-b213-c017e1ff58c2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:23:55.449878 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.449861 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f21dee-2d5e-424f-b213-c017e1ff58c2-kube-api-access-5r575" (OuterVolumeSpecName: "kube-api-access-5r575") pod "94f21dee-2d5e-424f-b213-c017e1ff58c2" (UID: "94f21dee-2d5e-424f-b213-c017e1ff58c2"). InnerVolumeSpecName "kube-api-access-5r575". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:23:55.456639 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.456615 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f21dee-2d5e-424f-b213-c017e1ff58c2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "94f21dee-2d5e-424f-b213-c017e1ff58c2" (UID: "94f21dee-2d5e-424f-b213-c017e1ff58c2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:23:55.549344 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.549255 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5r575\" (UniqueName: \"kubernetes.io/projected/94f21dee-2d5e-424f-b213-c017e1ff58c2-kube-api-access-5r575\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:23:55.549344 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.549290 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94f21dee-2d5e-424f-b213-c017e1ff58c2-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:23:55.549344 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.549302 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94f21dee-2d5e-424f-b213-c017e1ff58c2-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:23:55.549344 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:55.549314 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94f21dee-2d5e-424f-b213-c017e1ff58c2-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:23:56.002923 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:56.002889 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" event={"ID":"94f21dee-2d5e-424f-b213-c017e1ff58c2","Type":"ContainerDied","Data":"44c62017438b9c59af7f36dce0839c99e7d586398198b853983156e12ca11a19"} Apr 17 17:23:56.002923 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:56.002926 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr" Apr 17 17:23:56.003366 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:56.002931 2573 scope.go:117] "RemoveContainer" containerID="3880230f97f0512a35e455e9c85a12bab364e4c2da11654a17a3770af74c0d14" Apr 17 17:23:56.010859 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:56.010819 2573 scope.go:117] "RemoveContainer" containerID="6c10c47b4f76a9c45c6e5128f409566f74c01135c9edfbc81af8af0db9f64f8f" Apr 17 17:23:56.017848 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:56.017811 2573 scope.go:117] "RemoveContainer" containerID="953ab90a483f7cc061170f5a10817405bb2e4d000fc606802c99ae60d69f5b4f" Apr 17 17:23:56.024478 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:56.024450 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"] Apr 17 17:23:56.028228 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:56.028203 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-gbfnr"] Apr 17 17:23:57.174532 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:23:57.174500 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" path="/var/lib/kubelet/pods/94f21dee-2d5e-424f-b213-c017e1ff58c2/volumes" Apr 17 17:25:36.477777 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.477742 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"] Apr 17 
17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478069 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478081 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478093 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="storage-initializer" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478103 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="storage-initializer" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478110 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kserve-container" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478116 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kserve-container" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478128 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478132 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478141 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="storage-initializer" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478146 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="storage-initializer" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478154 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kserve-container" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478159 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kserve-container" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478207 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kserve-container" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478216 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0d93895-4be9-47a2-b3b5-2b7e0ab92931" containerName="kube-rbac-proxy" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478225 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kserve-container" Apr 17 17:25:36.479176 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.478231 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="94f21dee-2d5e-424f-b213-c017e1ff58c2" containerName="kube-rbac-proxy" Apr 17 17:25:36.480080 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.480064 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.482543 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.482522 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 17 17:25:36.482661 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.482560 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:25:36.482661 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.482623 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:25:36.483639 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.483619 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzvpv\"" Apr 17 17:25:36.483639 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.483629 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 17 17:25:36.495242 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.495223 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"] Apr 17 17:25:36.604207 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.604166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27a33006-d91a-47d1-a17a-19c085df60d8-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.604414 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.604215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27a33006-d91a-47d1-a17a-19c085df60d8-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.604414 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.604284 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27a33006-d91a-47d1-a17a-19c085df60d8-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.604414 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.604313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwdg\" (UniqueName: \"kubernetes.io/projected/27a33006-d91a-47d1-a17a-19c085df60d8-kube-api-access-ncwdg\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.705481 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.705444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27a33006-d91a-47d1-a17a-19c085df60d8-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.705481 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.705486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/27a33006-d91a-47d1-a17a-19c085df60d8-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.705715 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.705515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27a33006-d91a-47d1-a17a-19c085df60d8-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.705715 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.705637 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncwdg\" (UniqueName: \"kubernetes.io/projected/27a33006-d91a-47d1-a17a-19c085df60d8-kube-api-access-ncwdg\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.705975 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.705953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27a33006-d91a-47d1-a17a-19c085df60d8-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.706219 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.706204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27a33006-d91a-47d1-a17a-19c085df60d8-isvc-xgboost-kube-rbac-proxy-sar-config\") pod 
\"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.707961 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.707944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27a33006-d91a-47d1-a17a-19c085df60d8-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.714757 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.714730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncwdg\" (UniqueName: \"kubernetes.io/projected/27a33006-d91a-47d1-a17a-19c085df60d8-kube-api-access-ncwdg\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8dqn\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.790348 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.790251 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:25:36.906739 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:36.906699 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"] Apr 17 17:25:36.909813 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:25:36.909782 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27a33006_d91a_47d1_a17a_19c085df60d8.slice/crio-d13a33933c6c336dc95888d5f3c7e530d914df39a53a45d51c35cd7eb4c90867 WatchSource:0}: Error finding container d13a33933c6c336dc95888d5f3c7e530d914df39a53a45d51c35cd7eb4c90867: Status 404 returned error can't find the container with id d13a33933c6c336dc95888d5f3c7e530d914df39a53a45d51c35cd7eb4c90867 Apr 17 17:25:37.268286 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:37.268255 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" event={"ID":"27a33006-d91a-47d1-a17a-19c085df60d8","Type":"ContainerStarted","Data":"8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571"} Apr 17 17:25:37.268491 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:37.268293 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" event={"ID":"27a33006-d91a-47d1-a17a-19c085df60d8","Type":"ContainerStarted","Data":"d13a33933c6c336dc95888d5f3c7e530d914df39a53a45d51c35cd7eb4c90867"} Apr 17 17:25:41.281024 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:41.280989 2573 generic.go:358] "Generic (PLEG): container finished" podID="27a33006-d91a-47d1-a17a-19c085df60d8" containerID="8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571" exitCode=0 Apr 17 17:25:41.281446 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:25:41.281063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" event={"ID":"27a33006-d91a-47d1-a17a-19c085df60d8","Type":"ContainerDied","Data":"8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571"} Apr 17 17:26:01.339812 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:01.339774 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" event={"ID":"27a33006-d91a-47d1-a17a-19c085df60d8","Type":"ContainerStarted","Data":"46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264"} Apr 17 17:26:01.339812 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:01.339817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" event={"ID":"27a33006-d91a-47d1-a17a-19c085df60d8","Type":"ContainerStarted","Data":"618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800"} Apr 17 17:26:01.340323 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:01.340140 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:26:01.340323 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:01.340173 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" Apr 17 17:26:01.341535 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:01.341509 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 17:26:01.361797 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:01.361743 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podStartSLOduration=5.756923582 
podStartE2EDuration="25.361726348s" podCreationTimestamp="2026-04-17 17:25:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:41.282225155 +0000 UTC m=+3258.613726034" lastFinishedPulling="2026-04-17 17:26:00.887027917 +0000 UTC m=+3278.218528800" observedRunningTime="2026-04-17 17:26:01.360245648 +0000 UTC m=+3278.691746549" watchObservedRunningTime="2026-04-17 17:26:01.361726348 +0000 UTC m=+3278.693227250"
Apr 17 17:26:02.343056 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:02.343015 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 17 17:26:07.346876 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:07.346847 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"
Apr 17 17:26:07.347427 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:07.347400 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 17 17:26:17.347690 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:17.347652 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 17 17:26:26.238905 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:26.238874 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 17:26:26.239910 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:26.239886 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 17:26:27.348050 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:27.348009 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 17 17:26:37.348084 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:37.347998 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 17 17:26:47.347298 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:47.347254 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 17 17:26:57.347929 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:26:57.347894 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 17 17:27:07.347987 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:07.347950 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"
Apr 17 17:27:16.621220 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:16.621184 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"]
Apr 17 17:27:16.621678 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:16.621618 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" containerID="cri-o://618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800" gracePeriod=30
Apr 17 17:27:16.621841 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:16.621692 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kube-rbac-proxy" containerID="cri-o://46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264" gracePeriod=30
Apr 17 17:27:17.343944 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:17.343889 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused"
Apr 17 17:27:17.348072 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:17.348044 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 17 17:27:17.553001 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:17.552960 2573 generic.go:358] "Generic (PLEG): container finished" podID="27a33006-d91a-47d1-a17a-19c085df60d8" containerID="46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264" exitCode=2
Apr 17 17:27:17.553171 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:17.553033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" event={"ID":"27a33006-d91a-47d1-a17a-19c085df60d8","Type":"ContainerDied","Data":"46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264"}
Apr 17 17:27:20.359034 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.359011 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"
Apr 17 17:27:20.423343 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.423241 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27a33006-d91a-47d1-a17a-19c085df60d8-kserve-provision-location\") pod \"27a33006-d91a-47d1-a17a-19c085df60d8\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") "
Apr 17 17:27:20.423343 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.423302 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncwdg\" (UniqueName: \"kubernetes.io/projected/27a33006-d91a-47d1-a17a-19c085df60d8-kube-api-access-ncwdg\") pod \"27a33006-d91a-47d1-a17a-19c085df60d8\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") "
Apr 17 17:27:20.423582 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.423357 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27a33006-d91a-47d1-a17a-19c085df60d8-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"27a33006-d91a-47d1-a17a-19c085df60d8\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") "
Apr 17 17:27:20.423582 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.423420 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27a33006-d91a-47d1-a17a-19c085df60d8-proxy-tls\") pod \"27a33006-d91a-47d1-a17a-19c085df60d8\" (UID: \"27a33006-d91a-47d1-a17a-19c085df60d8\") "
Apr 17 17:27:20.423672 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.423624 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a33006-d91a-47d1-a17a-19c085df60d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "27a33006-d91a-47d1-a17a-19c085df60d8" (UID: "27a33006-d91a-47d1-a17a-19c085df60d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:27:20.423761 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.423736 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a33006-d91a-47d1-a17a-19c085df60d8-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "27a33006-d91a-47d1-a17a-19c085df60d8" (UID: "27a33006-d91a-47d1-a17a-19c085df60d8"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:27:20.425359 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.425337 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a33006-d91a-47d1-a17a-19c085df60d8-kube-api-access-ncwdg" (OuterVolumeSpecName: "kube-api-access-ncwdg") pod "27a33006-d91a-47d1-a17a-19c085df60d8" (UID: "27a33006-d91a-47d1-a17a-19c085df60d8"). InnerVolumeSpecName "kube-api-access-ncwdg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:27:20.425479 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.425465 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a33006-d91a-47d1-a17a-19c085df60d8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "27a33006-d91a-47d1-a17a-19c085df60d8" (UID: "27a33006-d91a-47d1-a17a-19c085df60d8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:27:20.524364 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.524326 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27a33006-d91a-47d1-a17a-19c085df60d8-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:27:20.524364 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.524357 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27a33006-d91a-47d1-a17a-19c085df60d8-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:27:20.524543 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.524380 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ncwdg\" (UniqueName: \"kubernetes.io/projected/27a33006-d91a-47d1-a17a-19c085df60d8-kube-api-access-ncwdg\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:27:20.524543 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.524391 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/27a33006-d91a-47d1-a17a-19c085df60d8-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:27:20.562588 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.562560 2573 generic.go:358] "Generic (PLEG): container finished" podID="27a33006-d91a-47d1-a17a-19c085df60d8" containerID="618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800" exitCode=0
Apr 17 17:27:20.562761 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.562624 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" event={"ID":"27a33006-d91a-47d1-a17a-19c085df60d8","Type":"ContainerDied","Data":"618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800"}
Apr 17 17:27:20.562761 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.562648 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"
Apr 17 17:27:20.562761 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.562661 2573 scope.go:117] "RemoveContainer" containerID="46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264"
Apr 17 17:27:20.562942 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.562652 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn" event={"ID":"27a33006-d91a-47d1-a17a-19c085df60d8","Type":"ContainerDied","Data":"d13a33933c6c336dc95888d5f3c7e530d914df39a53a45d51c35cd7eb4c90867"}
Apr 17 17:27:20.570486 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.570468 2573 scope.go:117] "RemoveContainer" containerID="618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800"
Apr 17 17:27:20.577325 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.577309 2573 scope.go:117] "RemoveContainer" containerID="8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571"
Apr 17 17:27:20.583717 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.583694 2573 scope.go:117] "RemoveContainer" containerID="46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264"
Apr 17 17:27:20.584008 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:27:20.583982 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264\": container with ID starting with 46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264 not found: ID does not exist" containerID="46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264"
Apr 17 17:27:20.584112 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.584013 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264"} err="failed to get container status \"46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264\": rpc error: code = NotFound desc = could not find container \"46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264\": container with ID starting with 46fd8dc256cbfc1a9ae93ea1560ae3503461706cedbb9f38284bf54c87d1e264 not found: ID does not exist"
Apr 17 17:27:20.584112 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.584037 2573 scope.go:117] "RemoveContainer" containerID="618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800"
Apr 17 17:27:20.584293 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:27:20.584276 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800\": container with ID starting with 618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800 not found: ID does not exist" containerID="618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800"
Apr 17 17:27:20.584357 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.584305 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800"} err="failed to get container status \"618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800\": rpc error: code = NotFound desc = could not find container \"618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800\": container with ID starting with 618ede44d589016b614c7c32c3725e24b734b51578a813cd47d41afedae4d800 not found: ID does not exist"
Apr 17 17:27:20.584357 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.584325 2573 scope.go:117] "RemoveContainer" containerID="8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571"
Apr 17 17:27:20.584446 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.584350 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"]
Apr 17 17:27:20.584585 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:27:20.584567 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571\": container with ID starting with 8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571 not found: ID does not exist" containerID="8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571"
Apr 17 17:27:20.584623 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.584591 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571"} err="failed to get container status \"8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571\": rpc error: code = NotFound desc = could not find container \"8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571\": container with ID starting with 8d65a86adce213307bb97788aba9ecf106ae8a9045010597bb5010a4fabd2571 not found: ID does not exist"
Apr 17 17:27:20.590679 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:20.590658 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8dqn"]
Apr 17 17:27:21.179006 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:27:21.174858 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" path="/var/lib/kubelet/pods/27a33006-d91a-47d1-a17a-19c085df60d8/volumes"
Apr 17 17:28:57.077558 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.077523 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"]
Apr 17 17:28:57.078208 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.077952 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="storage-initializer"
Apr 17 17:28:57.078208 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.077971 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="storage-initializer"
Apr 17 17:28:57.078208 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.078000 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container"
Apr 17 17:28:57.078208 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.078008 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container"
Apr 17 17:28:57.078208 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.078017 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kube-rbac-proxy"
Apr 17 17:28:57.078208 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.078026 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kube-rbac-proxy"
Apr 17 17:28:57.078208 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.078090 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kserve-container"
Apr 17 17:28:57.078208 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.078103 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="27a33006-d91a-47d1-a17a-19c085df60d8" containerName="kube-rbac-proxy"
Apr 17 17:28:57.080194 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.080173 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.082624 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.082601 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\""
Apr 17 17:28:57.082736 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.082630 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 17:28:57.082736 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.082654 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 17:28:57.083683 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.083661 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\""
Apr 17 17:28:57.083810 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.083680 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzvpv\""
Apr 17 17:28:57.090550 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.090528 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"]
Apr 17 17:28:57.175501 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.175471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc9ll\" (UniqueName: \"kubernetes.io/projected/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kube-api-access-kc9ll\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.175673 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.175510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.175673 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.175540 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.175673 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.175587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.276478 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.276443 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.276651 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.276502 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc9ll\" (UniqueName: \"kubernetes.io/projected/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kube-api-access-kc9ll\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.276651 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.276543 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.276651 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.276577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.277001 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.276973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.277295 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.277274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.279173 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.279153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.284106 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.284078 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc9ll\" (UniqueName: \"kubernetes.io/projected/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kube-api-access-kc9ll\") pod \"isvc-xgboost-runtime-predictor-779db84d9-swtpp\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.391015 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.390932 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:28:57.510042 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.510017 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"]
Apr 17 17:28:57.512366 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:28:57.512340 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb859a8ab_a54a_4bc7_ab41_0f0dabe79205.slice/crio-3a2d7c43fc61da52a43f0f8f27474611ba7d33f684123d29d9a3ec1b9f93f36c WatchSource:0}: Error finding container 3a2d7c43fc61da52a43f0f8f27474611ba7d33f684123d29d9a3ec1b9f93f36c: Status 404 returned error can't find the container with id 3a2d7c43fc61da52a43f0f8f27474611ba7d33f684123d29d9a3ec1b9f93f36c
Apr 17 17:28:57.514248 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.514233 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:28:57.830082 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.830046 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" event={"ID":"b859a8ab-a54a-4bc7-ab41-0f0dabe79205","Type":"ContainerStarted","Data":"350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda"}
Apr 17 17:28:57.830082 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:28:57.830085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" event={"ID":"b859a8ab-a54a-4bc7-ab41-0f0dabe79205","Type":"ContainerStarted","Data":"3a2d7c43fc61da52a43f0f8f27474611ba7d33f684123d29d9a3ec1b9f93f36c"}
Apr 17 17:29:01.355667 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:29:01.355641 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb859a8ab_a54a_4bc7_ab41_0f0dabe79205.slice/crio-conmon-350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda.scope\": RecentStats: unable to find data in memory cache]"
Apr 17 17:29:01.843540 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:01.843510 2573 generic.go:358] "Generic (PLEG): container finished" podID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerID="350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda" exitCode=0
Apr 17 17:29:01.843710 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:01.843588 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" event={"ID":"b859a8ab-a54a-4bc7-ab41-0f0dabe79205","Type":"ContainerDied","Data":"350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda"}
Apr 17 17:29:02.848248 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:02.848213 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" event={"ID":"b859a8ab-a54a-4bc7-ab41-0f0dabe79205","Type":"ContainerStarted","Data":"25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631"}
Apr 17 17:29:02.848248 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:02.848255 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" event={"ID":"b859a8ab-a54a-4bc7-ab41-0f0dabe79205","Type":"ContainerStarted","Data":"b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72"}
Apr 17 17:29:02.848656 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:02.848570 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:29:02.848693 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:02.848679 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:29:02.850022 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:02.849996 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 17 17:29:02.869496 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:02.869448 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podStartSLOduration=5.86943168 podStartE2EDuration="5.86943168s" podCreationTimestamp="2026-04-17 17:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:29:02.868210218 +0000 UTC m=+3460.199711126" watchObservedRunningTime="2026-04-17 17:29:02.86943168 +0000 UTC m=+3460.200932582"
Apr 17 17:29:03.851241 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:03.851199 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 17 17:29:08.857110 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:08.857075 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:29:08.857622 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:08.857598 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 17 17:29:18.857559 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:18.857517 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 17 17:29:28.857644 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:28.857598 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 17 17:29:38.857991 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:38.857902 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 17 17:29:48.858038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:48.857997 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 17 17:29:58.858543 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:29:58.858505 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 17 17:30:08.858032 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:08.857993 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:30:17.178302 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:17.178269 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"]
Apr 17 17:30:17.178772 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:17.178667 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" containerID="cri-o://b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72" gracePeriod=30
Apr 17 17:30:17.178772 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:17.178699 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kube-rbac-proxy" containerID="cri-o://25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631" gracePeriod=30
Apr 17 17:30:18.059271 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:18.059239 2573 generic.go:358] "Generic (PLEG): container finished" podID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerID="25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631" exitCode=2
Apr 17 17:30:18.059271 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:18.059275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" event={"ID":"b859a8ab-a54a-4bc7-ab41-0f0dabe79205","Type":"ContainerDied","Data":"25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631"}
Apr 17 17:30:18.852146 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:18.852102 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused"
Apr 17 17:30:18.857891 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:18.857862 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 17 17:30:20.917708 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:20.917684 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"
Apr 17 17:30:21.050392 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.050301 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-proxy-tls\") pod \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") "
Apr 17 17:30:21.050392 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.050379 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kserve-provision-location\") pod \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") "
Apr 17 17:30:21.050591 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.050400 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc9ll\" (UniqueName: \"kubernetes.io/projected/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kube-api-access-kc9ll\") pod \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\" (UID: 
\"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " Apr 17 17:30:21.050591 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.050423 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\" (UID: \"b859a8ab-a54a-4bc7-ab41-0f0dabe79205\") " Apr 17 17:30:21.050707 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.050685 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b859a8ab-a54a-4bc7-ab41-0f0dabe79205" (UID: "b859a8ab-a54a-4bc7-ab41-0f0dabe79205"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:30:21.050951 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.050919 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "b859a8ab-a54a-4bc7-ab41-0f0dabe79205" (UID: "b859a8ab-a54a-4bc7-ab41-0f0dabe79205"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:30:21.052449 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.052426 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b859a8ab-a54a-4bc7-ab41-0f0dabe79205" (UID: "b859a8ab-a54a-4bc7-ab41-0f0dabe79205"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:21.052541 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.052447 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kube-api-access-kc9ll" (OuterVolumeSpecName: "kube-api-access-kc9ll") pod "b859a8ab-a54a-4bc7-ab41-0f0dabe79205" (UID: "b859a8ab-a54a-4bc7-ab41-0f0dabe79205"). InnerVolumeSpecName "kube-api-access-kc9ll". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:30:21.068430 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.068400 2573 generic.go:358] "Generic (PLEG): container finished" podID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerID="b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72" exitCode=0 Apr 17 17:30:21.068549 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.068443 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" event={"ID":"b859a8ab-a54a-4bc7-ab41-0f0dabe79205","Type":"ContainerDied","Data":"b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72"} Apr 17 17:30:21.068549 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.068476 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" event={"ID":"b859a8ab-a54a-4bc7-ab41-0f0dabe79205","Type":"ContainerDied","Data":"3a2d7c43fc61da52a43f0f8f27474611ba7d33f684123d29d9a3ec1b9f93f36c"} Apr 17 17:30:21.068549 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.068486 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp" Apr 17 17:30:21.068549 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.068495 2573 scope.go:117] "RemoveContainer" containerID="25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631" Apr 17 17:30:21.076527 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.076506 2573 scope.go:117] "RemoveContainer" containerID="b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72" Apr 17 17:30:21.083928 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.083907 2573 scope.go:117] "RemoveContainer" containerID="350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda" Apr 17 17:30:21.089409 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.089383 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"] Apr 17 17:30:21.091737 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.091715 2573 scope.go:117] "RemoveContainer" containerID="25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631" Apr 17 17:30:21.092277 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:30:21.092249 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631\": container with ID starting with 25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631 not found: ID does not exist" containerID="25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631" Apr 17 17:30:21.092419 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.092285 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631"} err="failed to get container status \"25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631\": rpc error: code = NotFound desc = could not find container 
\"25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631\": container with ID starting with 25d7de9e9a5d66424f1e30b4c37fdf5b2b4e50c13e6f9e3d9cd0ebb318a8a631 not found: ID does not exist" Apr 17 17:30:21.092419 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.092305 2573 scope.go:117] "RemoveContainer" containerID="b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72" Apr 17 17:30:21.092419 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.092406 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-swtpp"] Apr 17 17:30:21.092618 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:30:21.092592 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72\": container with ID starting with b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72 not found: ID does not exist" containerID="b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72" Apr 17 17:30:21.092870 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.092847 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72"} err="failed to get container status \"b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72\": rpc error: code = NotFound desc = could not find container \"b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72\": container with ID starting with b526aedc1c2816f5d61c5b71a8b0b2ae7867222ca1d61421245c16ff351a8b72 not found: ID does not exist" Apr 17 17:30:21.092929 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.092872 2573 scope.go:117] "RemoveContainer" containerID="350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda" Apr 17 17:30:21.093114 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:30:21.093098 2573 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda\": container with ID starting with 350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda not found: ID does not exist" containerID="350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda" Apr 17 17:30:21.093156 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.093118 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda"} err="failed to get container status \"350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda\": rpc error: code = NotFound desc = could not find container \"350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda\": container with ID starting with 350113543affdca94626c314ad28435b8112c094d8397074566cc6ce4f761cda not found: ID does not exist" Apr 17 17:30:21.151259 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.151221 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:30:21.151259 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.151252 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kc9ll\" (UniqueName: \"kubernetes.io/projected/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-kube-api-access-kc9ll\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:30:21.151259 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.151263 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node 
\"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:30:21.151475 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.151274 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b859a8ab-a54a-4bc7-ab41-0f0dabe79205-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\"" Apr 17 17:30:21.175467 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:30:21.175438 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" path="/var/lib/kubelet/pods/b859a8ab-a54a-4bc7-ab41-0f0dabe79205/volumes" Apr 17 17:31:17.414434 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.414398 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"] Apr 17 17:31:17.415038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.414700 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kube-rbac-proxy" Apr 17 17:31:17.415038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.414712 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kube-rbac-proxy" Apr 17 17:31:17.415038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.414727 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" Apr 17 17:31:17.415038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.414733 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" Apr 17 17:31:17.415038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.414746 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="storage-initializer" Apr 17 17:31:17.415038 
ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.414752 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="storage-initializer" Apr 17 17:31:17.415038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.414800 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kserve-container" Apr 17 17:31:17.415038 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.414811 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b859a8ab-a54a-4bc7-ab41-0f0dabe79205" containerName="kube-rbac-proxy" Apr 17 17:31:17.417785 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.417770 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.420093 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.420067 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 17 17:31:17.420210 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.420101 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:31:17.420210 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.420170 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:31:17.421308 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.421291 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzvpv\"" Apr 17 17:31:17.421400 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.421292 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 17 17:31:17.427500 ip-10-0-128-217 
kubenswrapper[2573]: I0417 17:31:17.427479 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"] Apr 17 17:31:17.483604 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.483556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f9206ea-f0a1-4d5c-8217-c43d749d9309-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.483813 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.483651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.483813 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.483680 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2f9206ea-f0a1-4d5c-8217-c43d749d9309-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.483813 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.483699 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbnd\" (UniqueName: \"kubernetes.io/projected/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kube-api-access-zxbnd\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: 
\"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.586307 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.585315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f9206ea-f0a1-4d5c-8217-c43d749d9309-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.586307 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.585439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.586307 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.585480 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2f9206ea-f0a1-4d5c-8217-c43d749d9309-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.586307 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.585513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbnd\" (UniqueName: \"kubernetes.io/projected/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kube-api-access-zxbnd\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.586307 
ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.586118 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.586745 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.586664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2f9206ea-f0a1-4d5c-8217-c43d749d9309-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:17.589243 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:31:17.586868 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-predictor-serving-cert: secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 17 17:31:17.589243 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:31:17.586940 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f9206ea-f0a1-4d5c-8217-c43d749d9309-proxy-tls podName:2f9206ea-f0a1-4d5c-8217-c43d749d9309 nodeName:}" failed. No retries permitted until 2026-04-17 17:31:18.086920114 +0000 UTC m=+3595.418420997 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2f9206ea-f0a1-4d5c-8217-c43d749d9309-proxy-tls") pod "isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" (UID: "2f9206ea-f0a1-4d5c-8217-c43d749d9309") : secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 17 17:31:17.594007 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:17.593986 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbnd\" (UniqueName: \"kubernetes.io/projected/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kube-api-access-zxbnd\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:18.089279 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:18.089233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f9206ea-f0a1-4d5c-8217-c43d749d9309-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:18.091756 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:18.091732 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f9206ea-f0a1-4d5c-8217-c43d749d9309-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:18.329056 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:18.329018 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:18.461910 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:18.461877 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"] Apr 17 17:31:18.464731 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:31:18.464696 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f9206ea_f0a1_4d5c_8217_c43d749d9309.slice/crio-ce6a13e60e431b2b4a5dd10eb31a84bccbafcd903a0ea41a46af877c5237e5b8 WatchSource:0}: Error finding container ce6a13e60e431b2b4a5dd10eb31a84bccbafcd903a0ea41a46af877c5237e5b8: Status 404 returned error can't find the container with id ce6a13e60e431b2b4a5dd10eb31a84bccbafcd903a0ea41a46af877c5237e5b8 Apr 17 17:31:19.220950 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:19.220908 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" event={"ID":"2f9206ea-f0a1-4d5c-8217-c43d749d9309","Type":"ContainerStarted","Data":"feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346"} Apr 17 17:31:19.220950 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:19.220947 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" event={"ID":"2f9206ea-f0a1-4d5c-8217-c43d749d9309","Type":"ContainerStarted","Data":"ce6a13e60e431b2b4a5dd10eb31a84bccbafcd903a0ea41a46af877c5237e5b8"} Apr 17 17:31:23.235095 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:23.235061 2573 generic.go:358] "Generic (PLEG): container finished" podID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerID="feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346" exitCode=0 Apr 17 17:31:23.235493 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:23.235114 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" event={"ID":"2f9206ea-f0a1-4d5c-8217-c43d749d9309","Type":"ContainerDied","Data":"feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346"} Apr 17 17:31:24.239583 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:24.239547 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" event={"ID":"2f9206ea-f0a1-4d5c-8217-c43d749d9309","Type":"ContainerStarted","Data":"1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09"} Apr 17 17:31:24.239583 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:24.239587 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" event={"ID":"2f9206ea-f0a1-4d5c-8217-c43d749d9309","Type":"ContainerStarted","Data":"310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa"} Apr 17 17:31:24.240090 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:24.239884 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:24.240090 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:24.240018 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" Apr 17 17:31:24.241504 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:24.241472 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 17 17:31:24.262808 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:24.262763 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" 
podStartSLOduration=7.262750522 podStartE2EDuration="7.262750522s" podCreationTimestamp="2026-04-17 17:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:31:24.261662367 +0000 UTC m=+3601.593163269" watchObservedRunningTime="2026-04-17 17:31:24.262750522 +0000 UTC m=+3601.594251424"
Apr 17 17:31:25.242987 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:25.242947 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:31:26.258880 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:26.258849 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 17:31:26.260709 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:26.260690 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 17:31:30.248326 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:30.248297 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"
Apr 17 17:31:30.248982 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:30.248951 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:31:40.249702 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:40.249659 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:31:50.249506 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:31:50.249464 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:32:00.249389 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:00.249347 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:32:10.249453 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:10.249415 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:32:20.248923 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:20.248883 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:32:30.249780 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:30.249742 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"
Apr 17 17:32:37.520372 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:37.520281 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"]
Apr 17 17:32:37.520956 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:37.520714 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" containerID="cri-o://310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa" gracePeriod=30
Apr 17 17:32:37.520956 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:37.520874 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kube-rbac-proxy" containerID="cri-o://1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09" gracePeriod=30
Apr 17 17:32:38.449542 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:38.449510 2573 generic.go:358] "Generic (PLEG): container finished" podID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerID="1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09" exitCode=2
Apr 17 17:32:38.449724 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:38.449563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" event={"ID":"2f9206ea-f0a1-4d5c-8217-c43d749d9309","Type":"ContainerDied","Data":"1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09"}
Apr 17 17:32:40.244127 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:40.244082 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 17 17:32:40.249195 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:40.249160 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 17 17:32:41.264443 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.264418 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"
Apr 17 17:32:41.342326 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.342231 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f9206ea-f0a1-4d5c-8217-c43d749d9309-proxy-tls\") pod \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") "
Apr 17 17:32:41.342326 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.342278 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxbnd\" (UniqueName: \"kubernetes.io/projected/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kube-api-access-zxbnd\") pod \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") "
Apr 17 17:32:41.342326 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.342307 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2f9206ea-f0a1-4d5c-8217-c43d749d9309-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") "
Apr 17 17:32:41.342551 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.342493 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kserve-provision-location\") pod \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\" (UID: \"2f9206ea-f0a1-4d5c-8217-c43d749d9309\") "
Apr 17 17:32:41.342681 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.342654 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f9206ea-f0a1-4d5c-8217-c43d749d9309-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "2f9206ea-f0a1-4d5c-8217-c43d749d9309" (UID: "2f9206ea-f0a1-4d5c-8217-c43d749d9309"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:32:41.342866 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.342815 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2f9206ea-f0a1-4d5c-8217-c43d749d9309" (UID: "2f9206ea-f0a1-4d5c-8217-c43d749d9309"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:32:41.344455 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.344422 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f9206ea-f0a1-4d5c-8217-c43d749d9309-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2f9206ea-f0a1-4d5c-8217-c43d749d9309" (UID: "2f9206ea-f0a1-4d5c-8217-c43d749d9309"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:32:41.344658 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.344458 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kube-api-access-zxbnd" (OuterVolumeSpecName: "kube-api-access-zxbnd") pod "2f9206ea-f0a1-4d5c-8217-c43d749d9309" (UID: "2f9206ea-f0a1-4d5c-8217-c43d749d9309"). InnerVolumeSpecName "kube-api-access-zxbnd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:32:41.444041 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.444006 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f9206ea-f0a1-4d5c-8217-c43d749d9309-proxy-tls\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:32:41.444041 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.444034 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zxbnd\" (UniqueName: \"kubernetes.io/projected/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kube-api-access-zxbnd\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:32:41.444041 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.444045 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2f9206ea-f0a1-4d5c-8217-c43d749d9309-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:32:41.444281 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.444056 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f9206ea-f0a1-4d5c-8217-c43d749d9309-kserve-provision-location\") on node \"ip-10-0-128-217.ec2.internal\" DevicePath \"\""
Apr 17 17:32:41.459638 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.459604 2573 generic.go:358] "Generic (PLEG): container finished" podID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerID="310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa" exitCode=0
Apr 17 17:32:41.459804 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.459695 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"
Apr 17 17:32:41.459804 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.459692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" event={"ID":"2f9206ea-f0a1-4d5c-8217-c43d749d9309","Type":"ContainerDied","Data":"310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa"}
Apr 17 17:32:41.459935 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.459809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6" event={"ID":"2f9206ea-f0a1-4d5c-8217-c43d749d9309","Type":"ContainerDied","Data":"ce6a13e60e431b2b4a5dd10eb31a84bccbafcd903a0ea41a46af877c5237e5b8"}
Apr 17 17:32:41.459935 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.459842 2573 scope.go:117] "RemoveContainer" containerID="1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09"
Apr 17 17:32:41.468527 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.468511 2573 scope.go:117] "RemoveContainer" containerID="310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa"
Apr 17 17:32:41.475671 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.475654 2573 scope.go:117] "RemoveContainer" containerID="feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346"
Apr 17 17:32:41.482370 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.482347 2573 scope.go:117] "RemoveContainer" containerID="1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09"
Apr 17 17:32:41.482478 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.482408 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"]
Apr 17 17:32:41.482630 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:32:41.482611 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09\": container with ID starting with 1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09 not found: ID does not exist" containerID="1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09"
Apr 17 17:32:41.482697 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.482641 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09"} err="failed to get container status \"1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09\": rpc error: code = NotFound desc = could not find container \"1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09\": container with ID starting with 1becef190ad0ef99c96c246744a3b6a8af8af96515b87f16ce392870a579ad09 not found: ID does not exist"
Apr 17 17:32:41.482697 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.482666 2573 scope.go:117] "RemoveContainer" containerID="310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa"
Apr 17 17:32:41.482911 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:32:41.482894 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa\": container with ID starting with 310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa not found: ID does not exist" containerID="310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa"
Apr 17 17:32:41.482963 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.482917 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa"} err="failed to get container status \"310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa\": rpc error: code = NotFound desc = could not find container \"310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa\": container with ID starting with 310efe70c2209755f057179d3ab58ab57862cdce933791a73405fded45a03daa not found: ID does not exist"
Apr 17 17:32:41.482963 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.482934 2573 scope.go:117] "RemoveContainer" containerID="feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346"
Apr 17 17:32:41.483147 ip-10-0-128-217 kubenswrapper[2573]: E0417 17:32:41.483130 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346\": container with ID starting with feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346 not found: ID does not exist" containerID="feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346"
Apr 17 17:32:41.483192 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.483153 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346"} err="failed to get container status \"feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346\": rpc error: code = NotFound desc = could not find container \"feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346\": container with ID starting with feeeab9a7921c8d69607bfe1183a7a42f6e4b9223213ec01059383deb2a59346 not found: ID does not exist"
Apr 17 17:32:41.489888 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:41.489868 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-gvgd6"]
Apr 17 17:32:43.174151 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:32:43.174120 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" path="/var/lib/kubelet/pods/2f9206ea-f0a1-4d5c-8217-c43d749d9309/volumes"
Apr 17 17:36:26.279992 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:36:26.279963 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 17:36:26.282450 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:36:26.282367 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 17:38:36.553187 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:36.553108 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bv57b_4729b8b3-8325-43e0-b518-b015db422a04/global-pull-secret-syncer/0.log"
Apr 17 17:38:36.729772 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:36.729742 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-s2xcr_04fe438d-c4b2-4123-8dce-24e40c4f8332/konnectivity-agent/0.log"
Apr 17 17:38:36.817257 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:36.817164 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-217.ec2.internal_57a323f22f4a50ec542cb175406e5b82/haproxy/0.log"
Apr 17 17:38:40.462409 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:40.462374 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bbsgj_37301673-ea1b-4db5-8fb2-107b9ee330de/node-exporter/0.log"
Apr 17 17:38:40.485996 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:40.485970 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bbsgj_37301673-ea1b-4db5-8fb2-107b9ee330de/kube-rbac-proxy/0.log"
Apr 17 17:38:40.510790 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:40.510766 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bbsgj_37301673-ea1b-4db5-8fb2-107b9ee330de/init-textfile/0.log"
Apr 17 17:38:41.024596 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:41.024568 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-sv26b_d717c3f3-1354-407d-bd66-ab20ed5781e7/prometheus-operator/0.log"
Apr 17 17:38:41.044358 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:41.044329 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-sv26b_d717c3f3-1354-407d-bd66-ab20ed5781e7/kube-rbac-proxy/0.log"
Apr 17 17:38:41.182252 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:41.182224 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5895cb458-4gd9v_01879656-f233-4f71-af88-26fb36ea40f9/thanos-query/0.log"
Apr 17 17:38:41.207033 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:41.207008 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5895cb458-4gd9v_01879656-f233-4f71-af88-26fb36ea40f9/kube-rbac-proxy-web/0.log"
Apr 17 17:38:41.232609 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:41.232585 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5895cb458-4gd9v_01879656-f233-4f71-af88-26fb36ea40f9/kube-rbac-proxy/0.log"
Apr 17 17:38:41.255384 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:41.255362 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5895cb458-4gd9v_01879656-f233-4f71-af88-26fb36ea40f9/prom-label-proxy/0.log"
Apr 17 17:38:41.282148 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:41.282065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5895cb458-4gd9v_01879656-f233-4f71-af88-26fb36ea40f9/kube-rbac-proxy-rules/0.log"
Apr 17 17:38:41.308732 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:41.308683 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5895cb458-4gd9v_01879656-f233-4f71-af88-26fb36ea40f9/kube-rbac-proxy-metrics/0.log"
Apr 17 17:38:43.803228 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.803195 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"]
Apr 17 17:38:43.803645 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.803478 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kube-rbac-proxy"
Apr 17 17:38:43.803645 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.803488 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kube-rbac-proxy"
Apr 17 17:38:43.803645 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.803499 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="storage-initializer"
Apr 17 17:38:43.803645 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.803505 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="storage-initializer"
Apr 17 17:38:43.803645 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.803511 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container"
Apr 17 17:38:43.803645 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.803517 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container"
Apr 17 17:38:43.803645 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.803561 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kserve-container"
Apr 17 17:38:43.803645 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.803569 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f9206ea-f0a1-4d5c-8217-c43d749d9309" containerName="kube-rbac-proxy"
Apr 17 17:38:43.806321 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.806305 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:43.808618 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.808596 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5lbcd\"/\"default-dockercfg-r4zks\""
Apr 17 17:38:43.808737 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.808722 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lbcd\"/\"openshift-service-ca.crt\""
Apr 17 17:38:43.809604 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.809591 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lbcd\"/\"kube-root-ca.crt\""
Apr 17 17:38:43.816971 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.816938 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"]
Apr 17 17:38:43.910453 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.910415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-podres\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:43.910453 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.910458 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-sys\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:43.910675 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.910494 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-lib-modules\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:43.910675 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.910529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-proc\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:43.910675 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:43.910564 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn9g8\" (UniqueName: \"kubernetes.io/projected/4c273472-950c-4072-9822-bff2c78edd64-kube-api-access-tn9g8\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.011908 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.011869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-proc\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.012092 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.011919 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tn9g8\" (UniqueName: \"kubernetes.io/projected/4c273472-950c-4072-9822-bff2c78edd64-kube-api-access-tn9g8\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.012092 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.011961 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-podres\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.012092 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.011982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-sys\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.012092 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.012000 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-proc\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.012092 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.012009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-lib-modules\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.012092 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.012059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-sys\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.012290 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.012119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-podres\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.012290 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.012160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c273472-950c-4072-9822-bff2c78edd64-lib-modules\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.022435 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.022402 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn9g8\" (UniqueName: \"kubernetes.io/projected/4c273472-950c-4072-9822-bff2c78edd64-kube-api-access-tn9g8\") pod \"perf-node-gather-daemonset-v44z4\" (UID: \"4c273472-950c-4072-9822-bff2c78edd64\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.117472 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.117382 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.237970 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.237904 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"]
Apr 17 17:38:44.240701 ip-10-0-128-217 kubenswrapper[2573]: W0417 17:38:44.240662 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4c273472_950c_4072_9822_bff2c78edd64.slice/crio-d1eacb254546a9fee1ed1efea57a8001b2d3af530274a6fb3e8742e0146d9996 WatchSource:0}: Error finding container d1eacb254546a9fee1ed1efea57a8001b2d3af530274a6fb3e8742e0146d9996: Status 404 returned error can't find the container with id d1eacb254546a9fee1ed1efea57a8001b2d3af530274a6fb3e8742e0146d9996
Apr 17 17:38:44.242281 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.242262 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:38:44.453447 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.453415 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jdmrz_fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c/dns/0.log"
Apr 17 17:38:44.460186 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.460159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4" event={"ID":"4c273472-950c-4072-9822-bff2c78edd64","Type":"ContainerStarted","Data":"3a219edf236bccef943bff55eff1d989fce132153f4740bbd920d424483d4ad4"}
Apr 17 17:38:44.460186 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.460190 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4" event={"ID":"4c273472-950c-4072-9822-bff2c78edd64","Type":"ContainerStarted","Data":"d1eacb254546a9fee1ed1efea57a8001b2d3af530274a6fb3e8742e0146d9996"}
Apr 17 17:38:44.460417 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.460274 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:44.475224 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.475179 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4" podStartSLOduration=1.475165514 podStartE2EDuration="1.475165514s" podCreationTimestamp="2026-04-17 17:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:38:44.474567331 +0000 UTC m=+4041.806068233" watchObservedRunningTime="2026-04-17 17:38:44.475165514 +0000 UTC m=+4041.806666437"
Apr 17 17:38:44.475993 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.475972 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jdmrz_fed429ae-fb2e-4ed3-9dc4-7d01dec6bb2c/kube-rbac-proxy/0.log"
Apr 17 17:38:44.545037 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:44.545009 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hqjjj_a4e3fb1c-519b-4c02-9326-fd056001ad1b/dns-node-resolver/0.log"
Apr 17 17:38:45.041763 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:45.041731 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6598b846c9-zgtcv_bfb79489-ea31-453d-8166-9a00d7bf66e1/registry/0.log"
Apr 17 17:38:45.115255 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:45.115224 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wtvtv_66fec0a1-a09d-4a76-b857-2877ab654053/node-ca/0.log"
Apr 17 17:38:46.179404 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:46.179373 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zb6kc_20e90cc0-23dd-4714-b716-64ca208935e2/serve-healthcheck-canary/0.log"
Apr 17 17:38:46.630918 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:46.630892 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7b9dc_11a75458-c09d-4d9f-8d73-5085ca8421a0/kube-rbac-proxy/0.log"
Apr 17 17:38:46.653735 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:46.653707 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7b9dc_11a75458-c09d-4d9f-8d73-5085ca8421a0/exporter/0.log"
Apr 17 17:38:46.677452 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:46.677422 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7b9dc_11a75458-c09d-4d9f-8d73-5085ca8421a0/extractor/0.log"
Apr 17 17:38:48.913103 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:48.913074 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-6x29f_c07f2a87-1031-4c35-890f-056f15117ced/manager/0.log"
Apr 17 17:38:49.417301 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:49.417257 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-5wr9s_71949cb1-8366-4768-b562-c0473ec01812/seaweedfs/0.log"
Apr 17 17:38:49.442969 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:49.442938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-kq5nt_6e391735-dd05-49a1-9116-d67840d417a9/seaweedfs-tls-custom/0.log"
Apr 17 17:38:50.471916 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:50.471879 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-v44z4"
Apr 17 17:38:53.385109 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:53.385077 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zkjjb_d1b9bfc8-7736-49f6-8463-ca6a7796d051/migrator/0.log"
Apr 17 17:38:53.414963 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:53.414929 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zkjjb_d1b9bfc8-7736-49f6-8463-ca6a7796d051/graceful-termination/0.log"
Apr 17 17:38:54.628209 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:54.628178 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5kfdr_7238545f-7cf1-4c61-a1dc-f5a458a0c5ed/kube-multus/0.log"
Apr 17 17:38:54.829044 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:54.829014 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9j69g_6e5430f1-c021-4f4d-bedc-fafa1ec4d260/kube-multus-additional-cni-plugins/0.log"
Apr 17 17:38:54.850505 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:54.850481 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9j69g_6e5430f1-c021-4f4d-bedc-fafa1ec4d260/egress-router-binary-copy/0.log"
Apr 17 17:38:54.870819 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:54.870792 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9j69g_6e5430f1-c021-4f4d-bedc-fafa1ec4d260/cni-plugins/0.log"
Apr 17 17:38:54.890843 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:54.890806 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9j69g_6e5430f1-c021-4f4d-bedc-fafa1ec4d260/bond-cni-plugin/0.log"
Apr 17 17:38:54.910917 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:54.910896 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9j69g_6e5430f1-c021-4f4d-bedc-fafa1ec4d260/routeoverride-cni/0.log"
Apr 17 17:38:54.930635 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:54.930612 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9j69g_6e5430f1-c021-4f4d-bedc-fafa1ec4d260/whereabouts-cni-bincopy/0.log"
Apr 17 17:38:54.951686 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:54.951661 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9j69g_6e5430f1-c021-4f4d-bedc-fafa1ec4d260/whereabouts-cni/0.log"
Apr 17 17:38:55.344549 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:55.344524 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zkmq8_0d9d52ff-d172-4b74-90ce-5ef0ac75662c/network-metrics-daemon/0.log"
Apr 17 17:38:55.364509 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:55.364428 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zkmq8_0d9d52ff-d172-4b74-90ce-5ef0ac75662c/kube-rbac-proxy/0.log"
Apr 17 17:38:56.969795 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:56.969768 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-controller/0.log"
Apr 17 17:38:56.986772 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:56.986746 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/0.log"
Apr 17 17:38:57.021154 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:57.021109 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovn-acl-logging/1.log"
Apr 17 17:38:57.049878 ip-10-0-128-217 kubenswrapper[2573]: I0417
17:38:57.049853 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/kube-rbac-proxy-node/0.log" Apr 17 17:38:57.072244 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:57.072210 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:38:57.088799 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:57.088777 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/northd/0.log" Apr 17 17:38:57.111529 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:57.111507 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/nbdb/0.log" Apr 17 17:38:57.136409 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:57.136382 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/sbdb/0.log" Apr 17 17:38:57.302504 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:57.302423 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4s682_1d463d09-7ae3-4a07-b80e-6078d9f0801d/ovnkube-controller/0.log" Apr 17 17:38:58.323368 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:58.319073 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-z2wfh_10fbceca-37d7-4803-b22d-19039688034a/network-check-target-container/0.log" Apr 17 17:38:59.187198 ip-10-0-128-217 kubenswrapper[2573]: I0417 17:38:59.187173 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-n5p68_74a720b2-f2e9-4ae6-98cf-494d329dd9e7/iptables-alerter/0.log" Apr 17 17:38:59.858003 ip-10-0-128-217 
kubenswrapper[2573]: I0417 17:38:59.857973 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9dd68_34116767-97f7-4597-bf99-9ab932940d12/tuned/0.log"