Apr 23 17:55:48.068078 ip-10-0-137-157 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 17:55:48.068091 ip-10-0-137-157 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 17:55:48.068101 ip-10-0-137-157 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 17:55:48.068458 ip-10-0-137-157 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 17:55:58.241603 ip-10-0-137-157 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 17:55:58.241621 ip-10-0-137-157 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ad2e606f17a34e79be58768450f3cfd3 --
Apr 23 17:58:30.718817 ip-10-0-137-157 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:58:31.154914 ip-10-0-137-157 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:58:31.154914 ip-10-0-137-157 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:58:31.154914 ip-10-0-137-157 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:58:31.154914 ip-10-0-137-157 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:58:31.154914 ip-10-0-137-157 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:58:31.156513 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.156402 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:58:31.161386 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161362 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:58:31.161386 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161383 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:58:31.161386 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161390 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161395 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161398 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161408 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161411 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161414 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161416 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161419 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161422 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161425 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161428 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161430 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161433 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161436 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161439 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161442 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161444 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161447 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161449 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161452 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:58:31.161555 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161454 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161470 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161474 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161476 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161479 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161489 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161491 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161494 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161496 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161499 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161501 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161504 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161506 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161509 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161512 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161515 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161518 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161522 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161524 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161527 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:58:31.162034 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161530 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161532 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161535 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161540 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161543 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161546 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161549 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161552 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161554 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161557 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161560 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161562 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161565 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161568 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161571 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161573 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161576 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161578 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161580 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:58:31.162533 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161583 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161585 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161588 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161590 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161593 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161596 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161598 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161601 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161604 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161607 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161610 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161613 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161616 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161618 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161621 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161623 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161627 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161631 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161634 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161636 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:58:31.163017 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161639 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161642 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161645 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161647 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.161654 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162049 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162053 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162056 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162059 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162062 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162065 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162067 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162070 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162072 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162075 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162078 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162080 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162083 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162086 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162089 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:58:31.163502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162091 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162094 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162097 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162099 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162102 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162105 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162107 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162110 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162113 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162116 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162118 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162121 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162125 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162128 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162131 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162134 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162137 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162140 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162142 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:58:31.163975 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162145 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162147 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162150 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162152 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162155 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162157 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162160 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162162 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162164 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162167 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162169 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162171 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162176 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162179 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162181 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162184 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162186 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162189 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162192 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162194 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:58:31.164429 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162197 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162199 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162201 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162204 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162207 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162209 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162211 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162214 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162216 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162219 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162222 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162224 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162227 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162230 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162234 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162237 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162240 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162242 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162245 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:58:31.164946 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162248 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162251 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162253 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162256 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162258 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162261 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162264 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162266 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162269 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162271 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162273 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162276 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162279 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162355 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162362 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162369 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162373 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162378 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162381 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162385 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162389 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:58:31.165446 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162393 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162396 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162400 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162403 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162406 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162409 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162412 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162415 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162418 2576 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162420 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162423 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162426 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162429 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162432 2576 flags.go:64] FLAG: --config-dir=""
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162435 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162439 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162443 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162447 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162450 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162453 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162456 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162474 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162477 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162480 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162483 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:58:31.166017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162487 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162490 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162493 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162495 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162499 2576 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162502 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162507 2576 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162510 2576 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162512 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162515 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162518 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162522 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162525 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162528 2576 flags.go:64] FLAG: 
--eviction-pressure-transition-period="5m0s" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162531 2576 flags.go:64] FLAG: --eviction-soft="" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162534 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162537 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162540 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162543 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162546 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162549 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162552 2576 flags.go:64] FLAG: --feature-gates="" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162556 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162560 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162563 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 17:58:31.166636 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162566 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162569 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162572 2576 flags.go:64] FLAG: --help="false" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: 
I0423 17:58:31.162575 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162578 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162582 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162584 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162588 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162591 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162594 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162597 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162600 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162603 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162606 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162609 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162612 2576 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162615 2576 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162617 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162620 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162623 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162626 2576 flags.go:64] FLAG: --lock-file="" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162629 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162631 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162634 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:58:31.167221 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162640 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162643 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162646 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162648 2576 flags.go:64] FLAG: --logging-format="text" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162651 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162658 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162662 2576 flags.go:64] FLAG: --manifest-url="" Apr 23 17:58:31.167808 ip-10-0-137-157 
kubenswrapper[2576]: I0423 17:58:31.162665 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162669 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162672 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162676 2576 flags.go:64] FLAG: --max-pods="110" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162679 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162682 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162685 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162688 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162691 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162694 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162697 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162704 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162707 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162710 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162714 
2576 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162717 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:58:31.167808 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162723 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162726 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162729 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162732 2576 flags.go:64] FLAG: --port="10250" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162735 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162737 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-077dba1f439b7794a" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162740 2576 flags.go:64] FLAG: --qos-reserved="" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162743 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162746 2576 flags.go:64] FLAG: --register-node="true" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162749 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162752 2576 flags.go:64] FLAG: --register-with-taints="" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162756 2576 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162759 2576 flags.go:64] FLAG: --registry-qps="5" 
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162762 2576 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162764 2576 flags.go:64] FLAG: --reserved-memory=""
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162770 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162773 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162776 2576 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162779 2576 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162782 2576 flags.go:64] FLAG: --runonce="false"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162785 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162787 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162791 2576 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162793 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162796 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162799 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 17:58:31.168352 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162802 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162805 2576 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162807 2576 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162810 2576 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162813 2576 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162816 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162819 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162822 2576 flags.go:64] FLAG: --system-cgroups=""
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162825 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162830 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162833 2576 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162835 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162839 2576 flags.go:64] FLAG: --tls-min-version=""
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162841 2576 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162844 2576 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162847 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162850 2576 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162853 2576 flags.go:64] FLAG: --v="2"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162857 2576 flags.go:64] FLAG: --version="false"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162861 2576 flags.go:64] FLAG: --vmodule=""
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162865 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.162870 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162970 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162975 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:58:31.169051 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162978 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162980 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162985 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162988 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162991 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162994 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.162998 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163001 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163006 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163008 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163011 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163014 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163017 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163019 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163022 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163025 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163028 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163030 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163033 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:58:31.169661 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163035 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163038 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163040 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163042 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163044 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163047 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163049 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163052 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163054 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163057 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163060 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163063 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163066 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163068 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163071 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163073 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163075 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163078 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163080 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163082 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:58:31.170177 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163085 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163088 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163091 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163093 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163096 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163099 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163101 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163104 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163107 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163109 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163111 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163114 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163116 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163119 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163121 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163123 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163126 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163128 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163131 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163133 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:58:31.170689 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163135 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163138 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163141 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163144 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163147 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163149 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163152 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163154 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163157 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163159 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163162 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163164 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163166 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163171 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163174 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163178 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163181 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163184 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163186 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163189 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:58:31.171175 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163191 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163194 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163196 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163199 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.163201 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.163723 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.170925 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.170944 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.170992 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.170998 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171001 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171004 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171007 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171010 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171013 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171017 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:58:31.171685 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171019 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171022 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171025 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171028 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171030 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171033 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171036 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171040 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171044 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171047 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171050 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171052 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171055 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171058 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171060 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171063 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171065 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171068 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171070 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:58:31.172089 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171073 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171075 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171078 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171080 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171084 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171087 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171090 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171092 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171095 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171098 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171100 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171103 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171105 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171108 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171110 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171113 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171115 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171117 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171120 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:58:31.172649 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171122 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171125 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171127 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171130 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171132 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171135 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171137 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171140 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171142 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171144 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171147 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171149 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171151 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171154 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171156 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171159 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171161 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171165 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171168 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171170 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:58:31.173131 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171173 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171177 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171181 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171184 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171187 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171189 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171192 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171195 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171197 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171200 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171202 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171204 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171207 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171209 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171212 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171214 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171217 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171219 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171221 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:58:31.173637 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171224 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.171229 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171336 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171340 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171343 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171346 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171348 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171351 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171354 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171357 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171360 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171363 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171367 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171369 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171372 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171375 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:58:31.174097 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171377 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171380 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171383 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171385 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171387 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171390 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171392 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171395 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171397 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171399 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171402 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171404 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171407 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171409 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171413 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171416 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171419 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171422 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171424 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:58:31.174557 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171427 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171429 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171432 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171435 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171437 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171440 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171442 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171445 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171448 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171451 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171454 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171476 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171480 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171483 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171485 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171488 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171491 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171493 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171495 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171498 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:58:31.175029 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171500 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171503 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171506 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171508 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171511 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171513 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171516 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171518 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171521 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171523 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171526 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171528 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171531 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171533 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171535 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171538 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171540 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171542 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171545 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171547 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:58:31.175531 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171550 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171553 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171555 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171558 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171561 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171564 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171568 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171570 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171573 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171575 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171577 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171580 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:31.171582 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.171587 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:58:31.176021 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.172280 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 17:58:31.176358 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.174424 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 17:58:31.176358 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.175315 2576 server.go:1019] "Starting client certificate rotation"
Apr 23 17:58:31.176358 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.175410 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:58:31.176358 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.175451 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:58:31.201244 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.201223 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:58:31.203938 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.203916 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:58:31.220378 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.220356 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 23 17:58:31.225621 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.225601 2576 log.go:25] "Validated CRI v1 image API"
Apr 23 17:58:31.226850 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.226832 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 17:58:31.231190 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.231165 2576 fs.go:135] Filesystem UUIDs: map[2a3b3802-eb45-4109-b553-c51fef6ec60f:/dev/nvme0n1p3 57873d7a-dcff-459d-82e7-f63d5acfaf3a:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 23 17:58:31.231277 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.231189 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 17:58:31.232242 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.232224 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:58:31.237141 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.237012 2576 manager.go:217] Machine: {Timestamp:2026-04-23 17:58:31.235119668 +0000 UTC m=+0.402446978 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3088023 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c9d6c6c2d8523910431cc0920fae3 SystemUUID:ec2c9d6c-6c2d-8523-9104-31cc0920fae3 BootID:ad2e606f-17a3-4e79-be58-768450f3cfd3 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6f:36:64:0c:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6f:36:64:0c:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:1f:42:bf:e3:81 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 17:58:31.237141 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.237130 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 17:58:31.237307 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.237234 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 17:58:31.239805 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.239775 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 17:58:31.239975 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.239807 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-157.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 17:58:31.240063 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.239989 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 17:58:31.240063 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.240001 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 17:58:31.240063 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.240019 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:58:31.240848 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.240835 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:58:31.241617 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.241604 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:58:31.241910 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.241897 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 17:58:31.244166 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.244153 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 23 17:58:31.244234 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.244173 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 17:58:31.244234 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.244189 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 17:58:31.244234 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.244203 2576 kubelet.go:397] "Adding apiserver pod source" Apr 23 17:58:31.244234 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.244217 2576 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 23 17:58:31.245217 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.245204 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:58:31.245283 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.245228 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:58:31.248085 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.248066 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 17:58:31.249377 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.249363 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 17:58:31.251007 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.250993 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 17:58:31.251068 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251019 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 17:58:31.251068 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251025 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 17:58:31.251068 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251030 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 17:58:31.251068 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251036 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 17:58:31.251068 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251041 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 17:58:31.251068 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251047 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 23 17:58:31.251068 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251053 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 17:58:31.251068 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251059 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 17:58:31.251068 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251065 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 17:58:31.251300 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251077 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 17:58:31.251300 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.251086 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 17:58:31.252654 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.252643 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 17:58:31.252654 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.252654 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 17:58:31.254885 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.254853 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-157.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:58:31.255093 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.255070 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-157.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:58:31.255301 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.255272 2576 
reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 17:58:31.256526 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.256513 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 17:58:31.256573 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.256549 2576 server.go:1295] "Started kubelet" Apr 23 17:58:31.256657 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.256626 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 17:58:31.257097 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.256893 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 17:58:31.257247 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.257226 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 17:58:31.257527 ip-10-0-137-157 systemd[1]: Started Kubernetes Kubelet. Apr 23 17:58:31.258547 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.258518 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 17:58:31.259559 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.259547 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 23 17:58:31.266090 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.266064 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 17:58:31.266562 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.266533 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 17:58:31.266964 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.266932 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 17:58:31.267288 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267268 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 17:58:31.267288 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267270 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 17:58:31.267408 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267302 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 17:58:31.267408 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267367 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 23 17:58:31.267408 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267376 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 23 17:58:31.267644 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.267602 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:31.267644 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267645 2576 factory.go:153] Registering CRI-O factory Apr 23 17:58:31.267798 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267665 2576 factory.go:223] Registration of the crio container factory successfully Apr 23 17:58:31.267798 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267712 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 17:58:31.267798 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267722 2576 factory.go:55] Registering systemd factory Apr 23 17:58:31.267798 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267731 2576 factory.go:223] Registration of the systemd container factory successfully Apr 23 
17:58:31.267798 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267751 2576 factory.go:103] Registering Raw factory Apr 23 17:58:31.267798 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.267765 2576 manager.go:1196] Started watching for new ooms in manager Apr 23 17:58:31.268079 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.268058 2576 manager.go:319] Starting recovery of all containers Apr 23 17:58:31.271606 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.271558 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-157.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 23 17:58:31.271898 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.271876 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 17:58:31.272332 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.271394 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-157.ec2.internal.18a90e2c43ff4a45 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-157.ec2.internal,UID:ip-10-0-137-157.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-157.ec2.internal,},FirstTimestamp:2026-04-23 17:58:31.256525381 +0000 UTC m=+0.423852690,LastTimestamp:2026-04-23 17:58:31.256525381 
+0000 UTC m=+0.423852690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-157.ec2.internal,}" Apr 23 17:58:31.276673 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.276538 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r9zlb" Apr 23 17:58:31.278789 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.278775 2576 manager.go:324] Recovery completed Apr 23 17:58:31.282803 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.282791 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:31.284067 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.284052 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r9zlb" Apr 23 17:58:31.285336 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.285322 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:31.285400 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.285365 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:31.285400 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.285376 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:31.285946 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.285931 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 17:58:31.285946 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.285946 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 17:58:31.286041 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.285963 2576 state_mem.go:36] "Initialized new 
in-memory state store" Apr 23 17:58:31.286812 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.286741 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-157.ec2.internal.18a90e2c45b6ecae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-157.ec2.internal,UID:ip-10-0-137-157.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-157.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-157.ec2.internal,},FirstTimestamp:2026-04-23 17:58:31.285337262 +0000 UTC m=+0.452664572,LastTimestamp:2026-04-23 17:58:31.285337262 +0000 UTC m=+0.452664572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-157.ec2.internal,}" Apr 23 17:58:31.288326 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.288313 2576 policy_none.go:49] "None policy: Start" Apr 23 17:58:31.288402 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.288330 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 17:58:31.288402 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.288341 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 23 17:58:31.327043 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.327027 2576 manager.go:341] "Starting Device Plugin manager" Apr 23 17:58:31.340810 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.327077 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 17:58:31.340810 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.327093 2576 server.go:85] "Starting device plugin registration 
server" Apr 23 17:58:31.340810 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.327775 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 17:58:31.340810 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.327796 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 17:58:31.340810 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.328068 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 17:58:31.340810 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.328193 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 17:58:31.340810 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.328206 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 17:58:31.340810 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.328891 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 17:58:31.340810 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.328959 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:31.408866 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.408787 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 17:58:31.410110 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.410094 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 23 17:58:31.410180 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.410125 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 17:58:31.410180 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.410144 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 23 17:58:31.410180 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.410152 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 17:58:31.410328 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.410186 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 17:58:31.412684 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.412665 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:31.428764 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.428746 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:31.429710 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.429695 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:31.429769 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.429724 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:31.429769 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.429735 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:31.429769 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.429756 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.438701 ip-10-0-137-157 kubenswrapper[2576]: I0423 
17:58:31.438684 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.438765 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.438705 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-157.ec2.internal\": node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:31.465219 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.465197 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:31.510726 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.510686 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal"] Apr 23 17:58:31.510820 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.510807 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:31.512535 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.512520 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:31.512592 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.512550 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:31.512592 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.512560 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:31.513879 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.513867 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:31.514024 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.514009 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.514076 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.514041 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:31.514608 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.514579 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:31.514696 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.514589 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:31.514696 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.514616 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:31.514696 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.514630 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:31.514696 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.514632 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:31.514696 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.514646 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:31.515916 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.515898 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.515997 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.515929 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:58:31.516578 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.516562 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:58:31.516650 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.516592 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:58:31.516650 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.516618 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:58:31.539129 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.539111 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-157.ec2.internal\" not found" node="ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.543563 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.543546 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-157.ec2.internal\" not found" node="ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.565866 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.565842 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:31.568671 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.568654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e0be1704db5e6b55a9c3869587898cb5-config\") pod 
\"kube-apiserver-proxy-ip-10-0-137-157.ec2.internal\" (UID: \"e0be1704db5e6b55a9c3869587898cb5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.568745 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.568680 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/51c48410f0a83abd5140e158944b1f13-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal\" (UID: \"51c48410f0a83abd5140e158944b1f13\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.568745 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.568697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51c48410f0a83abd5140e158944b1f13-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal\" (UID: \"51c48410f0a83abd5140e158944b1f13\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.666999 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.666931 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:31.669198 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.669171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/51c48410f0a83abd5140e158944b1f13-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal\" (UID: \"51c48410f0a83abd5140e158944b1f13\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.669265 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.669209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/51c48410f0a83abd5140e158944b1f13-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal\" (UID: \"51c48410f0a83abd5140e158944b1f13\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.669265 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.669228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e0be1704db5e6b55a9c3869587898cb5-config\") pod \"kube-apiserver-proxy-ip-10-0-137-157.ec2.internal\" (UID: \"e0be1704db5e6b55a9c3869587898cb5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.669265 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.669256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51c48410f0a83abd5140e158944b1f13-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal\" (UID: \"51c48410f0a83abd5140e158944b1f13\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.669368 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.669259 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/51c48410f0a83abd5140e158944b1f13-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal\" (UID: \"51c48410f0a83abd5140e158944b1f13\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.669368 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.669291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e0be1704db5e6b55a9c3869587898cb5-config\") pod \"kube-apiserver-proxy-ip-10-0-137-157.ec2.internal\" (UID: \"e0be1704db5e6b55a9c3869587898cb5\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.767683 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.767656 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:31.841017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.840986 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.846637 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:31.846618 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal" Apr 23 17:58:31.868214 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.868183 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:31.968694 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:31.968608 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:32.068975 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:32.068950 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:32.070563 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.070542 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:32.136249 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.136223 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:32.169746 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:32.169722 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" 
not found" Apr 23 17:58:32.175000 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.174989 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 17:58:32.175115 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.175101 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 17:58:32.175157 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.175137 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 17:58:32.175186 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.175144 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 17:58:32.266899 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.266872 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 17:58:32.269799 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:32.269780 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:32.276225 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.276205 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 17:58:32.286772 ip-10-0-137-157 
kubenswrapper[2576]: I0423 17:58:32.286736 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:53:31 +0000 UTC" deadline="2027-12-06 19:37:04.64024047 +0000 UTC" Apr 23 17:58:32.286772 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.286767 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14209h38m32.353476765s" Apr 23 17:58:32.302548 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.302531 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zdkh5" Apr 23 17:58:32.310233 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:32.310187 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0be1704db5e6b55a9c3869587898cb5.slice/crio-9c659c7559d392f41c78906f8b6c59fadb9fecb2b7466f0ed3ebadebff30c7d3 WatchSource:0}: Error finding container 9c659c7559d392f41c78906f8b6c59fadb9fecb2b7466f0ed3ebadebff30c7d3: Status 404 returned error can't find the container with id 9c659c7559d392f41c78906f8b6c59fadb9fecb2b7466f0ed3ebadebff30c7d3 Apr 23 17:58:32.310309 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.310232 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zdkh5" Apr 23 17:58:32.310908 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:32.310889 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c48410f0a83abd5140e158944b1f13.slice/crio-27b69b76ee8d8a0bd42a043b47d13ebca826ec1ad15f7248a3f8d3a13fc5f861 WatchSource:0}: Error finding container 27b69b76ee8d8a0bd42a043b47d13ebca826ec1ad15f7248a3f8d3a13fc5f861: Status 404 returned error can't find the container with id 
27b69b76ee8d8a0bd42a043b47d13ebca826ec1ad15f7248a3f8d3a13fc5f861 Apr 23 17:58:32.316920 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.316903 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:58:32.370901 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:32.370874 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-157.ec2.internal\" not found" Apr 23 17:58:32.413359 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.413299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal" event={"ID":"e0be1704db5e6b55a9c3869587898cb5","Type":"ContainerStarted","Data":"9c659c7559d392f41c78906f8b6c59fadb9fecb2b7466f0ed3ebadebff30c7d3"} Apr 23 17:58:32.414242 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.414218 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" event={"ID":"51c48410f0a83abd5140e158944b1f13","Type":"ContainerStarted","Data":"27b69b76ee8d8a0bd42a043b47d13ebca826ec1ad15f7248a3f8d3a13fc5f861"} Apr 23 17:58:32.425649 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.425630 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:58:32.467398 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.467369 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" Apr 23 17:58:32.480587 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.480530 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:58:32.481453 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.481429 2576 kubelet.go:3340] "Creating a mirror 
pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal" Apr 23 17:58:32.489454 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:32.489437 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:58:33.245428 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.245398 2576 apiserver.go:52] "Watching apiserver" Apr 23 17:58:33.252331 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.252306 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 17:58:33.252747 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.252721 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6","openshift-image-registry/node-ca-k8ncf","openshift-multus/multus-additional-cni-plugins-xznrt","openshift-network-diagnostics/network-check-target-58nsw","openshift-network-operator/iptables-alerter-fhvhd","kube-system/konnectivity-agent-7kq2r","openshift-cluster-node-tuning-operator/tuned-jpgtk","openshift-dns/node-resolver-x7kw5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal","openshift-multus/multus-vrgrj","openshift-multus/network-metrics-daemon-q98mx","openshift-ovn-kubernetes/ovnkube-node-7xtlp","kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal"] Apr 23 17:58:33.255119 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.255098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.255227 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.255209 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-k8ncf" Apr 23 17:58:33.258490 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.257793 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 17:58:33.258490 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.257852 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:58:33.258490 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.258094 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-przhk\"" Apr 23 17:58:33.258490 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.258137 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 17:58:33.258490 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.258187 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wpjkn\"" Apr 23 17:58:33.258490 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.258322 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 17:58:33.258490 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.258400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 17:58:33.259527 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.259496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:33.259893 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.259874 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.259959 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.259827 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:58:33.261557 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.261529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fhvhd" Apr 23 17:58:33.262109 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.262088 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 17:58:33.262291 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.262275 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 17:58:33.262367 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.262298 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 17:58:33.262503 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.262485 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2wb4q\"" Apr 23 17:58:33.262629 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.262561 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 17:58:33.262844 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.262825 2576 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 17:58:33.263664 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.263645 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 17:58:33.263742 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.263685 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:58:33.263806 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.263753 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zljhc\"" Apr 23 17:58:33.263806 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.263761 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:33.263986 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.263970 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 17:58:33.265663 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.265427 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 17:58:33.265663 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.265614 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j66km\"" Apr 23 17:58:33.265663 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.265621 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 17:58:33.266549 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.266529 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" Apr 23 17:58:33.266824 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.266682 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x7kw5" Apr 23 17:58:33.267913 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.267894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.268307 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.268286 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 17:58:33.268445 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.268291 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 17:58:33.268549 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.268531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mm2md\"" Apr 23 17:58:33.268665 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.268291 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 17:58:33.268848 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.268831 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 17:58:33.268967 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.268833 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 17:58:33.269042 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.268983 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-5gfcv\"" Apr 23 17:58:33.269154 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.269139 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:33.269249 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.269218 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:58:33.269918 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.269900 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 17:58:33.270008 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.269987 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mqqwn\"" Apr 23 17:58:33.270586 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.270569 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.272425 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.272409 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 17:58:33.272534 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.272445 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-75hp2\"" Apr 23 17:58:33.272713 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.272692 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 17:58:33.273026 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.272999 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 17:58:33.273026 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.273014 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 17:58:33.273279 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.273231 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 17:58:33.273418 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.273232 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 17:58:33.278232 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-system-cni-dir\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" 
Apr 23 17:58:33.278330 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-socket-dir-parent\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.278330 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-kubernetes\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.278330 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-run\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.278330 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/365c5763-fac0-4121-9de2-0a669a25bc8c-serviceca\") pod \"node-ca-k8ncf\" (UID: \"365c5763-fac0-4121-9de2-0a669a25bc8c\") " pod="openshift-image-registry/node-ca-k8ncf" Apr 23 17:58:33.278535 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqdxt\" (UniqueName: \"kubernetes.io/projected/3ce8a6d4-d062-4813-b21e-b06b4d147b13-kube-api-access-wqdxt\") pod \"node-resolver-x7kw5\" (UID: 
\"3ce8a6d4-d062-4813-b21e-b06b4d147b13\") " pod="openshift-dns/node-resolver-x7kw5" Apr 23 17:58:33.278535 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:33.278535 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a-konnectivity-ca\") pod \"konnectivity-agent-7kq2r\" (UID: \"60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a\") " pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:33.278535 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-host\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.278535 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ce8a6d4-d062-4813-b21e-b06b4d147b13-hosts-file\") pod \"node-resolver-x7kw5\" (UID: \"3ce8a6d4-d062-4813-b21e-b06b4d147b13\") " pod="openshift-dns/node-resolver-x7kw5" Apr 23 17:58:33.278535 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-run-netns\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.278757 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:33.278757 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-systemd\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.278757 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278594 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b7dafa22-aab9-4274-a348-a27afa11e470-iptables-alerter-script\") pod \"iptables-alerter-fhvhd\" (UID: \"b7dafa22-aab9-4274-a348-a27afa11e470\") " pod="openshift-network-operator/iptables-alerter-fhvhd" Apr 23 17:58:33.278757 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-cni-dir\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.278757 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278671 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-var-lib-cni-bin\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.278757 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-sysconfig\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.278757 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29f27201-1f58-4672-a5e6-64fe7aef7d2e-tmp\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.279011 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr5hw\" (UniqueName: \"kubernetes.io/projected/b7dafa22-aab9-4274-a348-a27afa11e470-kube-api-access-xr5hw\") pod \"iptables-alerter-fhvhd\" (UID: \"b7dafa22-aab9-4274-a348-a27afa11e470\") " pod="openshift-network-operator/iptables-alerter-fhvhd" Apr 23 17:58:33.279011 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52klg\" (UniqueName: \"kubernetes.io/projected/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-kube-api-access-52klg\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " 
pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:33.279011 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f66c5d99-c18e-417a-b05d-439bf68ddbff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.279011 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-var-lib-kubelet\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.279011 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.278986 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f66c5d99-c18e-417a-b05d-439bf68ddbff-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.279011 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-lib-modules\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.279211 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ce8a6d4-d062-4813-b21e-b06b4d147b13-tmp-dir\") pod \"node-resolver-x7kw5\" (UID: \"3ce8a6d4-d062-4813-b21e-b06b4d147b13\") " pod="openshift-dns/node-resolver-x7kw5"
Apr 23 17:58:33.279211 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-cnibin\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.279211 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-run-k8s-cni-cncf-io\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.279211 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-run-multus-certs\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.279211 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-os-release\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.279211 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f66c5d99-c18e-417a-b05d-439bf68ddbff-cni-binary-copy\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.279211 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.279447 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-sysctl-d\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.279447 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-sys\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.279447 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-tuned\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.279447 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-socket-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.279447 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-var-lib-kubelet\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.279447 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279406 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-sysctl-conf\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.279447 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssf6w\" (UniqueName: \"kubernetes.io/projected/29f27201-1f58-4672-a5e6-64fe7aef7d2e-kube-api-access-ssf6w\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.279894 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx7bx\" (UniqueName: \"kubernetes.io/projected/365c5763-fac0-4121-9de2-0a669a25bc8c-kube-api-access-fx7bx\") pod \"node-ca-k8ncf\" (UID: \"365c5763-fac0-4121-9de2-0a669a25bc8c\") " pod="openshift-image-registry/node-ca-k8ncf"
Apr 23 17:58:33.279894 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-sys-fs\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.279894 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bcz\" (UniqueName: \"kubernetes.io/projected/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-kube-api-access-f2bcz\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.279894 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-device-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.279894 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-etc-selinux\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.279894 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-etc-kubernetes\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.280170 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-registration-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.280170 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.280170 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.279993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-var-lib-cni-multus\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.280170 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-system-cni-dir\") pod 
\"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.280170 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d8z8\" (UniqueName: \"kubernetes.io/projected/f66c5d99-c18e-417a-b05d-439bf68ddbff-kube-api-access-6d8z8\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.280170 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a-agent-certs\") pod \"konnectivity-agent-7kq2r\" (UID: \"60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a\") " pod="kube-system/konnectivity-agent-7kq2r"
Apr 23 17:58:33.280386 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-modprobe-d\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.280386 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-daemon-config\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.280386 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/365c5763-fac0-4121-9de2-0a669a25bc8c-host\") pod \"node-ca-k8ncf\" (UID: \"365c5763-fac0-4121-9de2-0a669a25bc8c\") " pod="openshift-image-registry/node-ca-k8ncf"
Apr 23 17:58:33.280386 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-os-release\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.280554 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-cni-binary-copy\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.280554 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-cnibin\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.280554 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7dafa22-aab9-4274-a348-a27afa11e470-host-slash\") pod \"iptables-alerter-fhvhd\" (UID: \"b7dafa22-aab9-4274-a348-a27afa11e470\") " pod="openshift-network-operator/iptables-alerter-fhvhd"
Apr 23 17:58:33.280554 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6n88\" (UniqueName: \"kubernetes.io/projected/c190cfdd-0e26-48e7-b404-b2097e05ec6e-kube-api-access-c6n88\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.280554 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-hostroot\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.280554 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.280538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-conf-dir\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.311142 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.311116 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:53:32 +0000 UTC" deadline="2027-12-15 17:23:55.626619231 +0000 UTC"
Apr 23 17:58:33.311142 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.311139 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14423h25m22.315482778s"
Apr 23 17:58:33.316474 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.316418 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:33.368946 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.368916 2576 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 17:58:33.381009 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.380978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-sysctl-conf\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.381161 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssf6w\" (UniqueName: \"kubernetes.io/projected/29f27201-1f58-4672-a5e6-64fe7aef7d2e-kube-api-access-ssf6w\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.381161 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fx7bx\" (UniqueName: \"kubernetes.io/projected/365c5763-fac0-4121-9de2-0a669a25bc8c-kube-api-access-fx7bx\") pod \"node-ca-k8ncf\" (UID: \"365c5763-fac0-4121-9de2-0a669a25bc8c\") " pod="openshift-image-registry/node-ca-k8ncf"
Apr 23 17:58:33.381161 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-sys-fs\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.381161 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bcz\" (UniqueName: \"kubernetes.io/projected/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-kube-api-access-f2bcz\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.381161 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-ovn-node-metrics-cert\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-device-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-sysctl-conf\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-etc-selinux\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-sys-fs\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-etc-kubernetes\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381247 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-slash\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-run-ovn\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-etc-kubernetes\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381298 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-registration-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-device-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-registration-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-var-lib-cni-multus\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.381403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-etc-selinux\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-var-lib-cni-multus\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-system-cni-dir\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6d8z8\" (UniqueName: \"kubernetes.io/projected/f66c5d99-c18e-417a-b05d-439bf68ddbff-kube-api-access-6d8z8\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a-agent-certs\") pod \"konnectivity-agent-7kq2r\" (UID: \"60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a\") " pod="kube-system/konnectivity-agent-7kq2r"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-modprobe-d\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-daemon-config\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-system-cni-dir\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/365c5763-fac0-4121-9de2-0a669a25bc8c-host\") pod \"node-ca-k8ncf\" (UID: \"365c5763-fac0-4121-9de2-0a669a25bc8c\") " pod="openshift-image-registry/node-ca-k8ncf"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-os-release\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-cni-binary-copy\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-cnibin\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-modprobe-d\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-run-openvswitch\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-os-release\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwjrt\" (UniqueName: \"kubernetes.io/projected/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-kube-api-access-qwjrt\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381746 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/365c5763-fac0-4121-9de2-0a669a25bc8c-host\") pod \"node-ca-k8ncf\" (UID: \"365c5763-fac0-4121-9de2-0a669a25bc8c\") " pod="openshift-image-registry/node-ca-k8ncf"
Apr 23 17:58:33.382066 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7dafa22-aab9-4274-a348-a27afa11e470-host-slash\") pod \"iptables-alerter-fhvhd\" (UID: \"b7dafa22-aab9-4274-a348-a27afa11e470\") " pod="openshift-network-operator/iptables-alerter-fhvhd"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7dafa22-aab9-4274-a348-a27afa11e470-host-slash\") pod \"iptables-alerter-fhvhd\" (UID: \"b7dafa22-aab9-4274-a348-a27afa11e470\") " pod="openshift-network-operator/iptables-alerter-fhvhd"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-cnibin\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6n88\" (UniqueName: \"kubernetes.io/projected/c190cfdd-0e26-48e7-b404-b2097e05ec6e-kube-api-access-c6n88\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-hostroot\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-conf-dir\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-var-lib-openvswitch\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381924 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-ovnkube-script-lib\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-system-cni-dir\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.381980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-socket-dir-parent\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-kubernetes\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-hostroot\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-run\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-conf-dir\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-daemon-config\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/365c5763-fac0-4121-9de2-0a669a25bc8c-serviceca\") pod \"node-ca-k8ncf\" (UID: \"365c5763-fac0-4121-9de2-0a669a25bc8c\") " pod="openshift-image-registry/node-ca-k8ncf"
Apr 23 17:58:33.382828 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqdxt\" (UniqueName: \"kubernetes.io/projected/3ce8a6d4-d062-4813-b21e-b06b4d147b13-kube-api-access-wqdxt\") pod \"node-resolver-x7kw5\" (UID: \"3ce8a6d4-d062-4813-b21e-b06b4d147b13\") " pod="openshift-dns/node-resolver-x7kw5"
Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-system-cni-dir\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj"
Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382143 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-run\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a-konnectivity-ca\") pod \"konnectivity-agent-7kq2r\" (UID: \"60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a\") " pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-host\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.382300 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 
17:58:33.382347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-socket-dir-parent\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.382380 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs podName:ab8ad387-1bfb-42ab-ad18-c8ea20362f8f nodeName:}" failed. No retries permitted until 2026-04-23 17:58:33.882340847 +0000 UTC m=+3.049668144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs") pod "network-metrics-daemon-q98mx" (UID: "ab8ad387-1bfb-42ab-ad18-c8ea20362f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ce8a6d4-d062-4813-b21e-b06b4d147b13-hosts-file\") pod \"node-resolver-x7kw5\" (UID: \"3ce8a6d4-d062-4813-b21e-b06b4d147b13\") " pod="openshift-dns/node-resolver-x7kw5" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-run-netns\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-run-systemd\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-etc-openvswitch\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ce8a6d4-d062-4813-b21e-b06b4d147b13-hosts-file\") pod \"node-resolver-x7kw5\" (UID: \"3ce8a6d4-d062-4813-b21e-b06b4d147b13\") " pod="openshift-dns/node-resolver-x7kw5" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-ovnkube-config\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382580 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/365c5763-fac0-4121-9de2-0a669a25bc8c-serviceca\") pod \"node-ca-k8ncf\" (UID: \"365c5763-fac0-4121-9de2-0a669a25bc8c\") " pod="openshift-image-registry/node-ca-k8ncf" Apr 23 17:58:33.383634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-run-netns\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-kubernetes\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-systemd\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-host\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/b7dafa22-aab9-4274-a348-a27afa11e470-iptables-alerter-script\") pod \"iptables-alerter-fhvhd\" (UID: \"b7dafa22-aab9-4274-a348-a27afa11e470\") " pod="openshift-network-operator/iptables-alerter-fhvhd" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382783 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-cni-binary-copy\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-systemd\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a-konnectivity-ca\") pod \"konnectivity-agent-7kq2r\" (UID: \"60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a\") " pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-cni-dir\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382940 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-var-lib-cni-bin\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382982 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-systemd-units\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-multus-cni-dir\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.382985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-var-lib-cni-bin\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-cni-netd\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-sysconfig\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29f27201-1f58-4672-a5e6-64fe7aef7d2e-tmp\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr5hw\" (UniqueName: \"kubernetes.io/projected/b7dafa22-aab9-4274-a348-a27afa11e470-kube-api-access-xr5hw\") pod \"iptables-alerter-fhvhd\" (UID: \"b7dafa22-aab9-4274-a348-a27afa11e470\") " pod="openshift-network-operator/iptables-alerter-fhvhd" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-sysconfig\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.384269 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52klg\" (UniqueName: \"kubernetes.io/projected/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-kube-api-access-52klg\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f66c5d99-c18e-417a-b05d-439bf68ddbff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-log-socket\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-var-lib-kubelet\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f66c5d99-c18e-417a-b05d-439bf68ddbff-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b7dafa22-aab9-4274-a348-a27afa11e470-iptables-alerter-script\") pod \"iptables-alerter-fhvhd\" (UID: \"b7dafa22-aab9-4274-a348-a27afa11e470\") " pod="openshift-network-operator/iptables-alerter-fhvhd" Apr 23 17:58:33.385152 
ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-run-ovn-kubernetes\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-lib-modules\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ce8a6d4-d062-4813-b21e-b06b4d147b13-tmp-dir\") pod \"node-resolver-x7kw5\" (UID: \"3ce8a6d4-d062-4813-b21e-b06b4d147b13\") " pod="openshift-dns/node-resolver-x7kw5" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-var-lib-kubelet\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-cnibin\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.385152 ip-10-0-137-157 
kubenswrapper[2576]: I0423 17:58:33.383390 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-run-k8s-cni-cncf-io\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-run-multus-certs\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-os-release\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-lib-modules\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f66c5d99-c18e-417a-b05d-439bf68ddbff-cni-binary-copy\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " 
pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-sysctl-d\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-sys\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-tuned\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-socket-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: 
\"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f66c5d99-c18e-417a-b05d-439bf68ddbff-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ce8a6d4-d062-4813-b21e-b06b4d147b13-tmp-dir\") pod \"node-resolver-x7kw5\" (UID: \"3ce8a6d4-d062-4813-b21e-b06b4d147b13\") " pod="openshift-dns/node-resolver-x7kw5" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-kubelet\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f66c5d99-c18e-417a-b05d-439bf68ddbff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-sysctl-d\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-run-netns\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.383993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384022 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-run-multus-certs\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-host-run-k8s-cni-cncf-io\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-node-log\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f66c5d99-c18e-417a-b05d-439bf68ddbff-cni-binary-copy\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-sys\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f66c5d99-c18e-417a-b05d-439bf68ddbff-os-release\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.385944 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-cnibin\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.386710 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-cni-bin\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.386710 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c190cfdd-0e26-48e7-b404-b2097e05ec6e-socket-dir\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" Apr 23 17:58:33.386710 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-var-lib-kubelet\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.386710 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-env-overrides\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.386710 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.384193 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29f27201-1f58-4672-a5e6-64fe7aef7d2e-var-lib-kubelet\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.386710 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.386430 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29f27201-1f58-4672-a5e6-64fe7aef7d2e-tmp\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.386710 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.386547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/29f27201-1f58-4672-a5e6-64fe7aef7d2e-etc-tuned\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") " pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.387034 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.386819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a-agent-certs\") pod \"konnectivity-agent-7kq2r\" (UID: \"60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a\") " pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:33.394176 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.394152 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:58:33.394176 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.394174 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:58:33.394347 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.394184 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x9wkp for pod openshift-network-diagnostics/network-check-target-58nsw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:33.394347 ip-10-0-137-157 kubenswrapper[2576]: E0423 
17:58:33.394246 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp podName:8f6f3666-ed32-45c7-8804-c3a7951d4815 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:33.894231116 +0000 UTC m=+3.061558417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x9wkp" (UniqueName: "kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp") pod "network-check-target-58nsw" (UID: "8f6f3666-ed32-45c7-8804-c3a7951d4815") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:33.395820 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.395798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bcz\" (UniqueName: \"kubernetes.io/projected/f6a1e63d-4ac5-4843-8b2f-4842157bdd00-kube-api-access-f2bcz\") pod \"multus-vrgrj\" (UID: \"f6a1e63d-4ac5-4843-8b2f-4842157bdd00\") " pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.398021 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.397994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d8z8\" (UniqueName: \"kubernetes.io/projected/f66c5d99-c18e-417a-b05d-439bf68ddbff-kube-api-access-6d8z8\") pod \"multus-additional-cni-plugins-xznrt\" (UID: \"f66c5d99-c18e-417a-b05d-439bf68ddbff\") " pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.398938 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.398907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr5hw\" (UniqueName: \"kubernetes.io/projected/b7dafa22-aab9-4274-a348-a27afa11e470-kube-api-access-xr5hw\") pod \"iptables-alerter-fhvhd\" (UID: \"b7dafa22-aab9-4274-a348-a27afa11e470\") " pod="openshift-network-operator/iptables-alerter-fhvhd" Apr 23 
17:58:33.398938 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.398902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6n88\" (UniqueName: \"kubernetes.io/projected/c190cfdd-0e26-48e7-b404-b2097e05ec6e-kube-api-access-c6n88\") pod \"aws-ebs-csi-driver-node-jbnf6\" (UID: \"c190cfdd-0e26-48e7-b404-b2097e05ec6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" Apr 23 17:58:33.399323 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.399304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx7bx\" (UniqueName: \"kubernetes.io/projected/365c5763-fac0-4121-9de2-0a669a25bc8c-kube-api-access-fx7bx\") pod \"node-ca-k8ncf\" (UID: \"365c5763-fac0-4121-9de2-0a669a25bc8c\") " pod="openshift-image-registry/node-ca-k8ncf" Apr 23 17:58:33.401247 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.401230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqdxt\" (UniqueName: \"kubernetes.io/projected/3ce8a6d4-d062-4813-b21e-b06b4d147b13-kube-api-access-wqdxt\") pod \"node-resolver-x7kw5\" (UID: \"3ce8a6d4-d062-4813-b21e-b06b4d147b13\") " pod="openshift-dns/node-resolver-x7kw5" Apr 23 17:58:33.406831 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.406812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52klg\" (UniqueName: \"kubernetes.io/projected/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-kube-api-access-52klg\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:33.433580 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.433546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssf6w\" (UniqueName: \"kubernetes.io/projected/29f27201-1f58-4672-a5e6-64fe7aef7d2e-kube-api-access-ssf6w\") pod \"tuned-jpgtk\" (UID: \"29f27201-1f58-4672-a5e6-64fe7aef7d2e\") 
" pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.484520 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-run-openvswitch\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.484520 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwjrt\" (UniqueName: \"kubernetes.io/projected/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-kube-api-access-qwjrt\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.484760 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-var-lib-openvswitch\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.484760 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-run-openvswitch\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.484760 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-var-lib-openvswitch\") pod \"ovnkube-node-7xtlp\" 
(UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.484760 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-ovnkube-script-lib\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.484760 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.484760 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484734 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-run-systemd\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.484760 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-etc-openvswitch\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-ovnkube-config\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-systemd-units\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-cni-netd\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-run-systemd\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-log-socket\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484891 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-systemd-units\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-run-ovn-kubernetes\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-cni-netd\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-kubelet\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-log-socket\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-run-netns\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-etc-openvswitch\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-kubelet\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-node-log\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-run-netns\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.484966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-run-ovn-kubernetes\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-node-log\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-cni-bin\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-cni-bin\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-env-overrides\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-ovn-node-metrics-cert\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-slash\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-run-ovn\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-host-slash\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-run-ovn\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-ovnkube-script-lib\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-ovnkube-config\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.485794 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.485623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-env-overrides\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.488000 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.487972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-ovn-node-metrics-cert\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.493982 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.493959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwjrt\" (UniqueName: 
\"kubernetes.io/projected/07f7ffd8-67f5-4806-a74c-8c1b4961ac85-kube-api-access-qwjrt\") pod \"ovnkube-node-7xtlp\" (UID: \"07f7ffd8-67f5-4806-a74c-8c1b4961ac85\") " pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.568987 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.568954 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" Apr 23 17:58:33.576864 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.576839 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-k8ncf" Apr 23 17:58:33.583590 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.583555 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xznrt" Apr 23 17:58:33.592704 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.592686 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fhvhd" Apr 23 17:58:33.598253 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.598236 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:33.605836 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.605816 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" Apr 23 17:58:33.613373 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.613354 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x7kw5" Apr 23 17:58:33.620053 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.620036 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vrgrj" Apr 23 17:58:33.624695 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.624671 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:33.887681 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.887590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:33.887843 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.887735 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:33.887843 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.887802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs podName:ab8ad387-1bfb-42ab-ad18-c8ea20362f8f nodeName:}" failed. No retries permitted until 2026-04-23 17:58:34.887785868 +0000 UTC m=+4.055113169 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs") pod "network-metrics-daemon-q98mx" (UID: "ab8ad387-1bfb-42ab-ad18-c8ea20362f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:33.907304 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:33.907278 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f27201_1f58_4672_a5e6_64fe7aef7d2e.slice/crio-c14624ef96f1d4ea0064c4c22f626715c1b37094b09e09a427e18e5adcb916dd WatchSource:0}: Error finding container c14624ef96f1d4ea0064c4c22f626715c1b37094b09e09a427e18e5adcb916dd: Status 404 returned error can't find the container with id c14624ef96f1d4ea0064c4c22f626715c1b37094b09e09a427e18e5adcb916dd Apr 23 17:58:33.908773 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:33.908750 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf66c5d99_c18e_417a_b05d_439bf68ddbff.slice/crio-7d76fca9cc7bf00a623b1b2493ff672a5313bdc6288d3bec9ee08081f8227042 WatchSource:0}: Error finding container 7d76fca9cc7bf00a623b1b2493ff672a5313bdc6288d3bec9ee08081f8227042: Status 404 returned error can't find the container with id 7d76fca9cc7bf00a623b1b2493ff672a5313bdc6288d3bec9ee08081f8227042 Apr 23 17:58:33.910020 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:33.909997 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07f7ffd8_67f5_4806_a74c_8c1b4961ac85.slice/crio-d98d90f56b5ee8e664d6395a8d1c80198fbe77f014ed476c741e1ebe2aaa3a42 WatchSource:0}: Error finding container d98d90f56b5ee8e664d6395a8d1c80198fbe77f014ed476c741e1ebe2aaa3a42: Status 404 returned error can't find the container with id d98d90f56b5ee8e664d6395a8d1c80198fbe77f014ed476c741e1ebe2aaa3a42 Apr 23 17:58:33.919114 
ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:33.919086 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod365c5763_fac0_4121_9de2_0a669a25bc8c.slice/crio-45cd27c59365055783e7c755e9faf356f21d77bad81b2950c0d34a16755f1b54 WatchSource:0}: Error finding container 45cd27c59365055783e7c755e9faf356f21d77bad81b2950c0d34a16755f1b54: Status 404 returned error can't find the container with id 45cd27c59365055783e7c755e9faf356f21d77bad81b2950c0d34a16755f1b54 Apr 23 17:58:33.920915 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:58:33.920869 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7dafa22_aab9_4274_a348_a27afa11e470.slice/crio-c6926df3d255d590e013c0a94c3bf4e1b6793f81940a18ca4e741d3dc7a46519 WatchSource:0}: Error finding container c6926df3d255d590e013c0a94c3bf4e1b6793f81940a18ca4e741d3dc7a46519: Status 404 returned error can't find the container with id c6926df3d255d590e013c0a94c3bf4e1b6793f81940a18ca4e741d3dc7a46519 Apr 23 17:58:33.989047 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:33.988901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:33.989130 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.989086 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:58:33.989130 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.989109 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:33.989130 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.989121 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x9wkp for pod openshift-network-diagnostics/network-check-target-58nsw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:33.989247 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:33.989166 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp podName:8f6f3666-ed32-45c7-8804-c3a7951d4815 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:34.989152418 +0000 UTC m=+4.156479715 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9wkp" (UniqueName: "kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp") pod "network-check-target-58nsw" (UID: "8f6f3666-ed32-45c7-8804-c3a7951d4815") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:34.311530 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.311382 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:53:32 +0000 UTC" deadline="2027-12-05 17:15:21.420068616 +0000 UTC"
Apr 23 17:58:34.311530 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.311423 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14183h16m47.108649195s"
Apr 23 17:58:34.410993 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.410638 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:58:34.410993 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:34.410769 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815"
Apr 23 17:58:34.418605 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.418571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" event={"ID":"29f27201-1f58-4672-a5e6-64fe7aef7d2e","Type":"ContainerStarted","Data":"c14624ef96f1d4ea0064c4c22f626715c1b37094b09e09a427e18e5adcb916dd"}
Apr 23 17:58:34.423346 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.423292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" event={"ID":"c190cfdd-0e26-48e7-b404-b2097e05ec6e","Type":"ContainerStarted","Data":"66ccbcb4ab9da559c731102a2163a8c506e7bc6cf428af99656de277ae6ad932"}
Apr 23 17:58:34.427774 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.427721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xznrt" event={"ID":"f66c5d99-c18e-417a-b05d-439bf68ddbff","Type":"ContainerStarted","Data":"7d76fca9cc7bf00a623b1b2493ff672a5313bdc6288d3bec9ee08081f8227042"}
Apr 23 17:58:34.429401 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.429362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerStarted","Data":"d98d90f56b5ee8e664d6395a8d1c80198fbe77f014ed476c741e1ebe2aaa3a42"}
Apr 23 17:58:34.434306 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.434272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal" event={"ID":"e0be1704db5e6b55a9c3869587898cb5","Type":"ContainerStarted","Data":"48a16a33cb1900f7386a00526e8f9fe64157046cc452a8904eb6aff43989413d"}
Apr 23 17:58:34.436826 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.436796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fhvhd" event={"ID":"b7dafa22-aab9-4274-a348-a27afa11e470","Type":"ContainerStarted","Data":"c6926df3d255d590e013c0a94c3bf4e1b6793f81940a18ca4e741d3dc7a46519"}
Apr 23 17:58:34.438784 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.438759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k8ncf" event={"ID":"365c5763-fac0-4121-9de2-0a669a25bc8c","Type":"ContainerStarted","Data":"45cd27c59365055783e7c755e9faf356f21d77bad81b2950c0d34a16755f1b54"}
Apr 23 17:58:34.440644 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.440622 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vrgrj" event={"ID":"f6a1e63d-4ac5-4843-8b2f-4842157bdd00","Type":"ContainerStarted","Data":"716a5260c5d1bbd46eac2f67135ed9ca19ebfc3990be9ec83d7f70e6f8f8c516"}
Apr 23 17:58:34.442330 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.442304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7kq2r" event={"ID":"60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a","Type":"ContainerStarted","Data":"29aef69bc47319c885b240dded7d7b54e0669c6ed56569a2dcb0acbb6da8d1de"}
Apr 23 17:58:34.443471 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.443432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x7kw5" event={"ID":"3ce8a6d4-d062-4813-b21e-b06b4d147b13","Type":"ContainerStarted","Data":"4eab0d16ada268c34302a4746e52c54594d39fdc5ca494f906e3231735bd7127"}
Apr 23 17:58:34.739634 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.739603 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:34.906048 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:34.906009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:34.906197 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:34.906169 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:34.906325 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:34.906234 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs podName:ab8ad387-1bfb-42ab-ad18-c8ea20362f8f nodeName:}" failed. No retries permitted until 2026-04-23 17:58:36.906214407 +0000 UTC m=+6.073541720 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs") pod "network-metrics-daemon-q98mx" (UID: "ab8ad387-1bfb-42ab-ad18-c8ea20362f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:35.007240 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:35.007149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:58:35.007402 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:35.007355 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:58:35.007402 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:35.007376 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:35.007402 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:35.007389 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x9wkp for pod openshift-network-diagnostics/network-check-target-58nsw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:35.007594 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:35.007448 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp podName:8f6f3666-ed32-45c7-8804-c3a7951d4815 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:37.007427966 +0000 UTC m=+6.174755265 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9wkp" (UniqueName: "kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp") pod "network-check-target-58nsw" (UID: "8f6f3666-ed32-45c7-8804-c3a7951d4815") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:35.034772 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:35.034733 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:58:35.411146 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:35.410610 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:35.411146 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:35.410753 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f"
Apr 23 17:58:35.475856 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:35.475816 2576 generic.go:358] "Generic (PLEG): container finished" podID="51c48410f0a83abd5140e158944b1f13" containerID="d9600eb33908b7ba3c66a328e546aefbfcdd77a6dc333757ffe82991a8af962d" exitCode=0
Apr 23 17:58:35.476566 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:35.476539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" event={"ID":"51c48410f0a83abd5140e158944b1f13","Type":"ContainerDied","Data":"d9600eb33908b7ba3c66a328e546aefbfcdd77a6dc333757ffe82991a8af962d"}
Apr 23 17:58:35.494154 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:35.493276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-157.ec2.internal" podStartSLOduration=3.493259595 podStartE2EDuration="3.493259595s" podCreationTimestamp="2026-04-23 17:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:58:34.450497327 +0000 UTC m=+3.617824648" watchObservedRunningTime="2026-04-23 17:58:35.493259595 +0000 UTC m=+4.660586915"
Apr 23 17:58:36.411189 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.411124 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:58:36.411648 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:36.411320 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815"
Apr 23 17:58:36.482215 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.482149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" event={"ID":"51c48410f0a83abd5140e158944b1f13","Type":"ContainerStarted","Data":"34cc69572592baeab73fe62b19c9406fd32932598960a5600e7451bc5c4a441c"}
Apr 23 17:58:36.499622 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.499563 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-157.ec2.internal" podStartSLOduration=4.499544973 podStartE2EDuration="4.499544973s" podCreationTimestamp="2026-04-23 17:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:58:36.499255547 +0000 UTC m=+5.666582866" watchObservedRunningTime="2026-04-23 17:58:36.499544973 +0000 UTC m=+5.666872296"
Apr 23 17:58:36.504252 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.503435 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-62sfd"]
Apr 23 17:58:36.506770 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.506317 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:36.506770 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:36.506391 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998"
Apr 23 17:58:36.622324 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.622287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:36.622503 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.622342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3fad012-152e-4084-a80b-63e1ff5a0998-kubelet-config\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:36.622503 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.622404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3fad012-152e-4084-a80b-63e1ff5a0998-dbus\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:36.723744 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.723637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:36.723744 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.723709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3fad012-152e-4084-a80b-63e1ff5a0998-kubelet-config\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:36.723963 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.723769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3fad012-152e-4084-a80b-63e1ff5a0998-dbus\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:36.723963 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.723949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3fad012-152e-4084-a80b-63e1ff5a0998-dbus\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:36.724063 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:36.724037 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:36.724115 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:36.724082 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret podName:a3fad012-152e-4084-a80b-63e1ff5a0998 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:37.224068608 +0000 UTC m=+6.391395905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret") pod "global-pull-secret-syncer-62sfd" (UID: "a3fad012-152e-4084-a80b-63e1ff5a0998") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:36.724303 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.724284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3fad012-152e-4084-a80b-63e1ff5a0998-kubelet-config\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:36.925483 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:36.925426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:36.925666 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:36.925630 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:36.925734 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:36.925695 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs podName:ab8ad387-1bfb-42ab-ad18-c8ea20362f8f nodeName:}" failed. No retries permitted until 2026-04-23 17:58:40.92567735 +0000 UTC m=+10.093004661 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs") pod "network-metrics-daemon-q98mx" (UID: "ab8ad387-1bfb-42ab-ad18-c8ea20362f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:37.026521 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:37.026423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:58:37.026674 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:37.026606 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:58:37.026674 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:37.026626 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:37.026674 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:37.026640 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x9wkp for pod openshift-network-diagnostics/network-check-target-58nsw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:37.026835 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:37.026703 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp podName:8f6f3666-ed32-45c7-8804-c3a7951d4815 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:41.026682654 +0000 UTC m=+10.194009955 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9wkp" (UniqueName: "kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp") pod "network-check-target-58nsw" (UID: "8f6f3666-ed32-45c7-8804-c3a7951d4815") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:37.228901 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:37.228862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:37.229309 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:37.229063 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:37.229309 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:37.229151 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret podName:a3fad012-152e-4084-a80b-63e1ff5a0998 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:38.229130417 +0000 UTC m=+7.396457723 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret") pod "global-pull-secret-syncer-62sfd" (UID: "a3fad012-152e-4084-a80b-63e1ff5a0998") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:37.411930 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:37.411393 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:37.411930 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:37.411565 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f"
Apr 23 17:58:38.238274 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:38.238151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:38.238485 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:38.238330 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:38.238485 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:38.238402 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret podName:a3fad012-152e-4084-a80b-63e1ff5a0998 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:40.238388211 +0000 UTC m=+9.405715512 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret") pod "global-pull-secret-syncer-62sfd" (UID: "a3fad012-152e-4084-a80b-63e1ff5a0998") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:38.411241 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:38.411209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:58:38.411418 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:38.411332 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815"
Apr 23 17:58:38.411747 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:38.411727 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:38.411842 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:38.411818 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998"
Apr 23 17:58:39.410403 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:39.410366 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:39.410825 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:39.410528 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f"
Apr 23 17:58:40.256228 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:40.256190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:40.256413 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:40.256338 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:40.256413 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:40.256388 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret podName:a3fad012-152e-4084-a80b-63e1ff5a0998 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:44.256375064 +0000 UTC m=+13.423702361 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret") pod "global-pull-secret-syncer-62sfd" (UID: "a3fad012-152e-4084-a80b-63e1ff5a0998") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:40.411193 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:40.410671 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:40.411193 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:40.410790 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998"
Apr 23 17:58:40.411193 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:40.410867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:58:40.411193 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:40.410926 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815"
Apr 23 17:58:40.962586 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:40.962541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:40.962822 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:40.962745 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:40.962822 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:40.962810 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs podName:ab8ad387-1bfb-42ab-ad18-c8ea20362f8f nodeName:}" failed. No retries permitted until 2026-04-23 17:58:48.962791432 +0000 UTC m=+18.130118731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs") pod "network-metrics-daemon-q98mx" (UID: "ab8ad387-1bfb-42ab-ad18-c8ea20362f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:58:41.064472 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:41.063806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:58:41.064472 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:41.064000 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:58:41.064472 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:41.064016 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:58:41.064472 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:41.064029 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x9wkp for pod openshift-network-diagnostics/network-check-target-58nsw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:41.064472 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:41.064085 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp podName:8f6f3666-ed32-45c7-8804-c3a7951d4815 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:49.064068364 +0000 UTC m=+18.231395683 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9wkp" (UniqueName: "kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp") pod "network-check-target-58nsw" (UID: "8f6f3666-ed32-45c7-8804-c3a7951d4815") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:58:41.411668 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:41.411630 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:41.412093 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:41.411769 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f"
Apr 23 17:58:42.410768 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:42.410732 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:58:42.410950 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:42.410782 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:42.410950 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:42.410885 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815"
Apr 23 17:58:42.411055 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:42.411015 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998"
Apr 23 17:58:43.414239 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:43.414204 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:43.414660 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:43.414318 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f"
Apr 23 17:58:44.289697 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:44.289648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:44.289888 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:44.289854 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:44.289996 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:44.289908 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret podName:a3fad012-152e-4084-a80b-63e1ff5a0998 nodeName:}" failed. No retries permitted until 2026-04-23 17:58:52.289889652 +0000 UTC m=+21.457216949 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret") pod "global-pull-secret-syncer-62sfd" (UID: "a3fad012-152e-4084-a80b-63e1ff5a0998") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:58:44.410904 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:44.410867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:58:44.411067 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:44.410867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:58:44.411067 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:44.411013 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998"
Apr 23 17:58:44.411152 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:44.411104 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815"
Apr 23 17:58:45.411205 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:45.411165 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:58:45.411671 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:45.411304 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f"
Apr 23 17:58:46.410536 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:46.410510 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:46.410665 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:46.410550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:58:46.410665 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:46.410642 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:58:46.410817 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:46.410784 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:58:47.411346 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:47.411308 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:47.411818 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:47.411445 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:58:48.411067 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:48.411027 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:48.411262 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:48.411041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:58:48.411262 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:48.411174 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:58:48.411262 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:48.411247 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:58:49.023136 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:49.023097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:49.023597 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:49.023230 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:49.023597 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:49.023288 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs podName:ab8ad387-1bfb-42ab-ad18-c8ea20362f8f nodeName:}" failed. No retries permitted until 2026-04-23 17:59:05.023273277 +0000 UTC m=+34.190600575 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs") pod "network-metrics-daemon-q98mx" (UID: "ab8ad387-1bfb-42ab-ad18-c8ea20362f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:58:49.124154 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:49.124113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:49.124322 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:49.124293 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:58:49.124322 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:49.124313 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:58:49.124322 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:49.124322 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x9wkp for pod openshift-network-diagnostics/network-check-target-58nsw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:49.124523 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:49.124373 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp podName:8f6f3666-ed32-45c7-8804-c3a7951d4815 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:59:05.124359546 +0000 UTC m=+34.291686846 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9wkp" (UniqueName: "kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp") pod "network-check-target-58nsw" (UID: "8f6f3666-ed32-45c7-8804-c3a7951d4815") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:58:49.411404 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:49.411366 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:49.411688 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:49.411536 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:58:50.410524 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:50.410487 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:58:50.410971 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:50.410487 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:50.410971 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:50.410623 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:58:50.410971 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:50.410675 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:58:51.411599 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:51.411573 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:51.411904 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:51.411699 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:58:52.349776 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.349520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:58:52.349957 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:52.349701 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:58:52.349957 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:52.349879 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret podName:a3fad012-152e-4084-a80b-63e1ff5a0998 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:08.34985794 +0000 UTC m=+37.517185252 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret") pod "global-pull-secret-syncer-62sfd" (UID: "a3fad012-152e-4084-a80b-63e1ff5a0998") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:58:52.410579 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.410493 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:58:52.410730 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.410504 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:52.410730 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:52.410639 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:58:52.410730 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:52.410707 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:58:52.511002 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.510963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" event={"ID":"c190cfdd-0e26-48e7-b404-b2097e05ec6e","Type":"ContainerStarted","Data":"a8ee3264a81a5ee771c8c14d46a63a2839ab45a40f69dd744c054b0773a36f3e"} Apr 23 17:58:52.512386 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.512355 2576 generic.go:358] "Generic (PLEG): container finished" podID="f66c5d99-c18e-417a-b05d-439bf68ddbff" containerID="f3e263ddda9edc7ec0db1d14bcd72c94d02fe95b85929876e91bf2ee0f193daf" exitCode=0 Apr 23 17:58:52.512550 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.512453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xznrt" 
event={"ID":"f66c5d99-c18e-417a-b05d-439bf68ddbff","Type":"ContainerDied","Data":"f3e263ddda9edc7ec0db1d14bcd72c94d02fe95b85929876e91bf2ee0f193daf"} Apr 23 17:58:52.517049 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.517030 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 17:58:52.517803 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.517777 2576 generic.go:358] "Generic (PLEG): container finished" podID="07f7ffd8-67f5-4806-a74c-8c1b4961ac85" containerID="62acb8f686d9d4940aad9a1f32297a6248b410486aa2e121fc7c8ff91b4a71ee" exitCode=1 Apr 23 17:58:52.517897 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.517854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerStarted","Data":"a0f3fe18fad848fce48b924db4fe0388375743e4bcd5a41ee880406908ade517"} Apr 23 17:58:52.517897 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.517886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerStarted","Data":"18149badfdb079c807583abf997b9c3d272f740bf0c318a0d77ce9c42c32282a"} Apr 23 17:58:52.518002 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.517902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerStarted","Data":"dd3bb336be5eb52dfe6385b6459831bfae31a8c5a0e07fb728a0dd6d7743f2c2"} Apr 23 17:58:52.518002 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.517915 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" 
event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerStarted","Data":"4168f639039b094a7ae50fe498183c4ff550c067c8600d52a1a50db14d2eacf1"} Apr 23 17:58:52.518002 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.517928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerDied","Data":"62acb8f686d9d4940aad9a1f32297a6248b410486aa2e121fc7c8ff91b4a71ee"} Apr 23 17:58:52.518002 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.517945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerStarted","Data":"76bcf462a341da2647c754eed1f88ea4c8762cb0d26be274b0409bb05c4a87d5"} Apr 23 17:58:52.519292 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.519271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k8ncf" event={"ID":"365c5763-fac0-4121-9de2-0a669a25bc8c","Type":"ContainerStarted","Data":"7b0ee67d75157665d9bf0e4fd971f2ac558dc3817b3ccf6151ec330d7b4fc9ce"} Apr 23 17:58:52.520715 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.520694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vrgrj" event={"ID":"f6a1e63d-4ac5-4843-8b2f-4842157bdd00","Type":"ContainerStarted","Data":"ac5d101a0c32b96cbf6bcc081f9338df1e81c480d13e7dc22cfbcf9bb9a941eb"} Apr 23 17:58:52.522031 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.521999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7kq2r" event={"ID":"60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a","Type":"ContainerStarted","Data":"817349cd22e0d26a50c6b6e4c51827aa8ffb32cf8f95daf18b8148728a3a427a"} Apr 23 17:58:52.523268 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.523248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x7kw5" 
event={"ID":"3ce8a6d4-d062-4813-b21e-b06b4d147b13","Type":"ContainerStarted","Data":"90a53233cec06abec1b1bf7a4d713ea715e25b5afea78fded70b6e6f332b781c"} Apr 23 17:58:52.524865 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.524831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" event={"ID":"29f27201-1f58-4672-a5e6-64fe7aef7d2e","Type":"ContainerStarted","Data":"7f76c71880620736d5a1ed74615ba1906214d0f26483a6d14fbb62f190c0fb8a"} Apr 23 17:58:52.565496 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.564674 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-k8ncf" podStartSLOduration=4.118756913 podStartE2EDuration="21.564657867s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:58:33.924060758 +0000 UTC m=+3.091388059" lastFinishedPulling="2026-04-23 17:58:51.369961716 +0000 UTC m=+20.537289013" observedRunningTime="2026-04-23 17:58:52.549450446 +0000 UTC m=+21.716777766" watchObservedRunningTime="2026-04-23 17:58:52.564657867 +0000 UTC m=+21.731985173" Apr 23 17:58:52.565496 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.564967 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jpgtk" podStartSLOduration=4.104399358 podStartE2EDuration="21.564960323s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:58:33.909500266 +0000 UTC m=+3.076827562" lastFinishedPulling="2026-04-23 17:58:51.370061229 +0000 UTC m=+20.537388527" observedRunningTime="2026-04-23 17:58:52.564775764 +0000 UTC m=+21.732103082" watchObservedRunningTime="2026-04-23 17:58:52.564960323 +0000 UTC m=+21.732287643" Apr 23 17:58:52.600229 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.600174 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7kq2r" 
podStartSLOduration=4.147552311 podStartE2EDuration="21.600160397s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:58:33.917378285 +0000 UTC m=+3.084705583" lastFinishedPulling="2026-04-23 17:58:51.369986367 +0000 UTC m=+20.537313669" observedRunningTime="2026-04-23 17:58:52.579593311 +0000 UTC m=+21.746920629" watchObservedRunningTime="2026-04-23 17:58:52.600160397 +0000 UTC m=+21.767487694" Apr 23 17:58:52.600367 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.600320 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-x7kw5" podStartSLOduration=4.174854481 podStartE2EDuration="21.600316104s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:58:33.916834041 +0000 UTC m=+3.084161344" lastFinishedPulling="2026-04-23 17:58:51.342295656 +0000 UTC m=+20.509622967" observedRunningTime="2026-04-23 17:58:52.600272442 +0000 UTC m=+21.767599760" watchObservedRunningTime="2026-04-23 17:58:52.600316104 +0000 UTC m=+21.767643422" Apr 23 17:58:52.626281 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.624721 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vrgrj" podStartSLOduration=4.137192591 podStartE2EDuration="21.624705631s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:58:33.918636511 +0000 UTC m=+3.085963811" lastFinishedPulling="2026-04-23 17:58:51.406149547 +0000 UTC m=+20.573476851" observedRunningTime="2026-04-23 17:58:52.624615848 +0000 UTC m=+21.791943168" watchObservedRunningTime="2026-04-23 17:58:52.624705631 +0000 UTC m=+21.792032951" Apr 23 17:58:52.991709 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:52.991675 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 17:58:53.348058 ip-10-0-137-157 kubenswrapper[2576]: 
I0423 17:58:53.347944 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:58:52.99170219Z","UUID":"cbf5c615-a7e5-4da5-9100-4aa9b216e83c","Handler":null,"Name":"","Endpoint":""} Apr 23 17:58:53.350178 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:53.350154 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 17:58:53.350313 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:53.350187 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 17:58:53.411064 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:53.411031 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:53.411234 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:53.411155 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:58:53.527630 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:53.527595 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fhvhd" event={"ID":"b7dafa22-aab9-4274-a348-a27afa11e470","Type":"ContainerStarted","Data":"fd790a66347ef846ff66929980c8e62970c51e2437637b7fbd7669c8d5a25bc6"} Apr 23 17:58:53.529594 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:53.529561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" event={"ID":"c190cfdd-0e26-48e7-b404-b2097e05ec6e","Type":"ContainerStarted","Data":"8b6f59b2af40868451d35755df554a618cf584cfc2837e7b237dfb97228a5415"} Apr 23 17:58:53.542152 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:53.542103 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fhvhd" podStartSLOduration=5.097776542 podStartE2EDuration="22.542087509s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:58:33.925685813 +0000 UTC m=+3.093013115" lastFinishedPulling="2026-04-23 17:58:51.369996778 +0000 UTC m=+20.537324082" observedRunningTime="2026-04-23 17:58:53.542055143 +0000 UTC m=+22.709382482" watchObservedRunningTime="2026-04-23 17:58:53.542087509 +0000 UTC m=+22.709414822" Apr 23 17:58:54.411299 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:54.411117 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:54.411447 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:54.411117 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:58:54.411447 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:54.411393 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:58:54.411554 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:54.411513 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:58:54.534086 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:54.534053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" event={"ID":"c190cfdd-0e26-48e7-b404-b2097e05ec6e","Type":"ContainerStarted","Data":"e6d40f80c1af875518f7d21badd56709eee6e682f977ded98ea74ae6f68b7630"} Apr 23 17:58:54.537154 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:54.537134 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 17:58:54.537499 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:54.537455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerStarted","Data":"c23cf7f5aa9c3f81545e7fc79b3bef2533eaa539deb0ea3bd753b0b418510a91"} Apr 23 
17:58:54.552681 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:54.552627 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jbnf6" podStartSLOduration=3.4053114190000002 podStartE2EDuration="23.552606963s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:58:33.914763423 +0000 UTC m=+3.082090723" lastFinishedPulling="2026-04-23 17:58:54.062058954 +0000 UTC m=+23.229386267" observedRunningTime="2026-04-23 17:58:54.551910668 +0000 UTC m=+23.719237987" watchObservedRunningTime="2026-04-23 17:58:54.552606963 +0000 UTC m=+23.719934287" Apr 23 17:58:55.255419 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:55.255391 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:55.256123 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:55.256096 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:55.410701 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:55.410669 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:55.410871 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:55.410799 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:58:55.539280 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:55.539201 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:55.539866 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:55.539792 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7kq2r" Apr 23 17:58:56.411127 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:56.411092 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:58:56.411310 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:56.411140 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:56.411310 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:56.411247 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:58:56.411635 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:56.411599 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:58:57.410921 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:57.410739 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:57.411430 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:57.411011 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:58:57.545126 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:57.545094 2576 generic.go:358] "Generic (PLEG): container finished" podID="f66c5d99-c18e-417a-b05d-439bf68ddbff" containerID="bf66692a92dbe79b49f1fbc24fa62facea3bcd65c77060d99f59280f0a6a4a89" exitCode=0 Apr 23 17:58:57.545294 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:57.545161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xznrt" event={"ID":"f66c5d99-c18e-417a-b05d-439bf68ddbff","Type":"ContainerDied","Data":"bf66692a92dbe79b49f1fbc24fa62facea3bcd65c77060d99f59280f0a6a4a89"} Apr 23 17:58:57.548106 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:57.548090 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 17:58:57.548494 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:57.548453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerStarted","Data":"a6c475c83dff21c10588731cd9af8d177f6f92f15efd58baf530900f4ee6623c"} Apr 
23 17:58:57.548739 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:57.548722 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:57.548815 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:57.548747 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:57.548991 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:57.548972 2576 scope.go:117] "RemoveContainer" containerID="62acb8f686d9d4940aad9a1f32297a6248b410486aa2e121fc7c8ff91b4a71ee" Apr 23 17:58:57.564063 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:57.564041 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:58.410870 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.410800 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:58.410870 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.410863 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:58:58.411272 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:58.410950 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:58:58.411272 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:58.411066 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:58:58.522900 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.522865 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-62sfd"] Apr 23 17:58:58.525436 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.525409 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q98mx"] Apr 23 17:58:58.525570 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.525563 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:58:58.525696 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:58.525673 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:58:58.527802 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.527783 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-58nsw"] Apr 23 17:58:58.552401 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.552376 2576 generic.go:358] "Generic (PLEG): container finished" podID="f66c5d99-c18e-417a-b05d-439bf68ddbff" containerID="b56a99b2c7b7ef11699e480c99070773239d1697a24c5adf86c77fdc4b8b50b4" exitCode=0 Apr 23 17:58:58.552553 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.552405 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xznrt" event={"ID":"f66c5d99-c18e-417a-b05d-439bf68ddbff","Type":"ContainerDied","Data":"b56a99b2c7b7ef11699e480c99070773239d1697a24c5adf86c77fdc4b8b50b4"} Apr 23 17:58:58.556339 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.556321 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 17:58:58.556774 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.556748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" event={"ID":"07f7ffd8-67f5-4806-a74c-8c1b4961ac85","Type":"ContainerStarted","Data":"eb49255bc579956dd72a5e9ae58362ed4fc760b9fc7d919fea1d38e7a32ed535"} Apr 23 17:58:58.556869 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.556757 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:58:58.556869 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.556801 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:58:58.556988 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:58.556876 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:58:58.557059 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:58:58.557039 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:58:58.557107 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.557071 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:58.573223 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.573199 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:58:58.601573 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:58.601531 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" podStartSLOduration=10.070653857 podStartE2EDuration="27.601518914s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:58:33.914412672 +0000 UTC m=+3.081739980" lastFinishedPulling="2026-04-23 17:58:51.445277739 +0000 UTC m=+20.612605037" observedRunningTime="2026-04-23 
17:58:58.60116548 +0000 UTC m=+27.768492799" watchObservedRunningTime="2026-04-23 17:58:58.601518914 +0000 UTC m=+27.768846258" Apr 23 17:58:59.560317 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:59.560228 2576 generic.go:358] "Generic (PLEG): container finished" podID="f66c5d99-c18e-417a-b05d-439bf68ddbff" containerID="6e3952377d7bd792e2f649085969d7d52396546c86b0f851253ed07570b40cd8" exitCode=0 Apr 23 17:58:59.560670 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:58:59.560318 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xznrt" event={"ID":"f66c5d99-c18e-417a-b05d-439bf68ddbff","Type":"ContainerDied","Data":"6e3952377d7bd792e2f649085969d7d52396546c86b0f851253ed07570b40cd8"} Apr 23 17:59:00.411326 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:00.411289 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:59:00.411521 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:00.411292 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:59:00.411521 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:00.411292 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:59:00.411521 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:00.411441 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:59:00.411521 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:00.411510 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:59:00.411725 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:00.411601 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:59:02.411281 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:02.411078 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:59:02.411924 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:02.411113 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:59:02.411924 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:02.411384 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:59:02.411924 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:02.411136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:59:02.411924 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:02.411492 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:59:02.411924 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:02.411530 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:59:04.411234 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.411204 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd" Apr 23 17:59:04.411860 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.411240 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:59:04.411860 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.411204 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:59:04.411860 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:04.411327 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-62sfd" podUID="a3fad012-152e-4084-a80b-63e1ff5a0998" Apr 23 17:59:04.411860 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:04.411403 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58nsw" podUID="8f6f3666-ed32-45c7-8804-c3a7951d4815" Apr 23 17:59:04.411860 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:04.411506 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 17:59:04.700404 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.700377 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-157.ec2.internal" event="NodeReady" Apr 23 17:59:04.700588 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.700548 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 17:59:04.736579 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.736549 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-579c67b56b-drvmr"] Apr 23 17:59:04.767967 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.767939 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6pv4g"] Apr 23 17:59:04.768132 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.768098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:04.770076 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.770050 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 17:59:04.770225 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.770135 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 17:59:04.770853 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.770830 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 17:59:04.770989 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.770868 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jxfcc\"" Apr 23 17:59:04.776862 ip-10-0-137-157 
kubenswrapper[2576]: I0423 17:59:04.776837 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 17:59:04.784245 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.784223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-579c67b56b-drvmr"] Apr 23 17:59:04.784245 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.784249 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6pv4g"] Apr 23 17:59:04.784414 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.784363 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6pv4g" Apr 23 17:59:04.786649 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.786628 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tsbqj\"" Apr 23 17:59:04.786757 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.786688 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 17:59:04.786856 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.786840 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 17:59:04.786916 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.786636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 17:59:04.848444 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.848407 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6xhj9"] Apr 23 17:59:04.865728 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.865694 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6xhj9"] Apr 23 
17:59:04.865872 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.865842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6xhj9" Apr 23 17:59:04.867950 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.867927 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sv5q9\"" Apr 23 17:59:04.868115 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.868094 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 17:59:04.868427 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.867927 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 17:59:04.950286 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgqk6\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-kube-api-access-tgqk6\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:04.950286 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g" Apr 23 17:59:04.950531 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-trusted-ca\") pod \"image-registry-579c67b56b-drvmr\" (UID: 
\"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:04.950531 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-image-registry-private-configuration\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:04.950531 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:04.950531 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79008782-6b97-4390-87b3-4fde5c883645-ca-trust-extracted\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:04.950531 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-installation-pull-secrets\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:04.950531 
ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6jm\" (UniqueName: \"kubernetes.io/projected/e929e6ba-de02-4dcb-affd-2772d869c2e0-kube-api-access-pd6jm\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g" Apr 23 17:59:04.950531 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950499 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-bound-sa-token\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:04.950531 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:04.950518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-registry-certificates\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:05.051104 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-bound-sa-token\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:05.051104 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-registry-certificates\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:05.051306 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:59:05.051306 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbb38a01-f704-4497-b3c1-20236e4e4f23-tmp-dir\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9" Apr 23 17:59:05.051306 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgqk6\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-kube-api-access-tgqk6\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:05.051306 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g" Apr 23 17:59:05.051306 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.051236 2576 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:59:05.051306 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-trusted-ca\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:05.051627 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.051316 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs podName:ab8ad387-1bfb-42ab-ad18-c8ea20362f8f nodeName:}" failed. No retries permitted until 2026-04-23 17:59:37.051294914 +0000 UTC m=+66.218622225 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs") pod "network-metrics-daemon-q98mx" (UID: "ab8ad387-1bfb-42ab-ad18-c8ea20362f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:59:05.051627 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ppd4\" (UniqueName: \"kubernetes.io/projected/bbb38a01-f704-4497-b3c1-20236e4e4f23-kube-api-access-5ppd4\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9" Apr 23 17:59:05.051627 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.051361 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:59:05.051627 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-image-registry-private-configuration\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.051627 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.051439 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert podName:e929e6ba-de02-4dcb-affd-2772d869c2e0 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:05.551421104 +0000 UTC m=+34.718748404 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert") pod "ingress-canary-6pv4g" (UID: "e929e6ba-de02-4dcb-affd-2772d869c2e0") : secret "canary-serving-cert" not found
Apr 23 17:59:05.051627 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.051627 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbb38a01-f704-4497-b3c1-20236e4e4f23-config-volume\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.051627 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.052017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79008782-6b97-4390-87b3-4fde5c883645-ca-trust-extracted\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.052017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-installation-pull-secrets\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.052017 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.051653 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:59:05.052017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6jm\" (UniqueName: \"kubernetes.io/projected/e929e6ba-de02-4dcb-affd-2772d869c2e0-kube-api-access-pd6jm\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g"
Apr 23 17:59:05.052017 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.051712 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579c67b56b-drvmr: secret "image-registry-tls" not found
Apr 23 17:59:05.052017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051759 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-registry-certificates\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.052017 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.051802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls podName:79008782-6b97-4390-87b3-4fde5c883645 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:05.551781007 +0000 UTC m=+34.719108306 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls") pod "image-registry-579c67b56b-drvmr" (UID: "79008782-6b97-4390-87b3-4fde5c883645") : secret "image-registry-tls" not found
Apr 23 17:59:05.052017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.051958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79008782-6b97-4390-87b3-4fde5c883645-ca-trust-extracted\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.052251 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.052107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-trusted-ca\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.055668 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.055648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-installation-pull-secrets\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.055668 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.055660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-image-registry-private-configuration\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.058784 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.058762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgqk6\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-kube-api-access-tgqk6\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.061278 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.061256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-bound-sa-token\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.061579 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.061563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6jm\" (UniqueName: \"kubernetes.io/projected/e929e6ba-de02-4dcb-affd-2772d869c2e0-kube-api-access-pd6jm\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g"
Apr 23 17:59:05.152364 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.152328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ppd4\" (UniqueName: \"kubernetes.io/projected/bbb38a01-f704-4497-b3c1-20236e4e4f23-kube-api-access-5ppd4\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.152564 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.152398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbb38a01-f704-4497-b3c1-20236e4e4f23-config-volume\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.152564 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.152420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.152564 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.152516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbb38a01-f704-4497-b3c1-20236e4e4f23-tmp-dir\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.152564 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.152545 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:05.152564 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.152558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:59:05.152833 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.152614 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls podName:bbb38a01-f704-4497-b3c1-20236e4e4f23 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:05.652596547 +0000 UTC m=+34.819923844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls") pod "dns-default-6xhj9" (UID: "bbb38a01-f704-4497-b3c1-20236e4e4f23") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:05.152833 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.152661 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:59:05.152833 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.152675 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:59:05.152833 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.152683 2576 projected.go:194] Error preparing data for projected volume kube-api-access-x9wkp for pod openshift-network-diagnostics/network-check-target-58nsw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:59:05.152833 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.152725 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp podName:8f6f3666-ed32-45c7-8804-c3a7951d4815 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:37.152709103 +0000 UTC m=+66.320036400 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-x9wkp" (UniqueName: "kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp") pod "network-check-target-58nsw" (UID: "8f6f3666-ed32-45c7-8804-c3a7951d4815") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:59:05.152833 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.152819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bbb38a01-f704-4497-b3c1-20236e4e4f23-tmp-dir\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.153043 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.152979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbb38a01-f704-4497-b3c1-20236e4e4f23-config-volume\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.160243 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.160223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ppd4\" (UniqueName: \"kubernetes.io/projected/bbb38a01-f704-4497-b3c1-20236e4e4f23-kube-api-access-5ppd4\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.555230 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.555205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g"
Apr 23 17:59:05.555625 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.555274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:05.555625 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.555356 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:05.555625 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.555389 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:59:05.555625 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.555404 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579c67b56b-drvmr: secret "image-registry-tls" not found
Apr 23 17:59:05.555625 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.555430 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert podName:e929e6ba-de02-4dcb-affd-2772d869c2e0 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:06.55541135 +0000 UTC m=+35.722738650 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert") pod "ingress-canary-6pv4g" (UID: "e929e6ba-de02-4dcb-affd-2772d869c2e0") : secret "canary-serving-cert" not found
Apr 23 17:59:05.555625 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.555446 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls podName:79008782-6b97-4390-87b3-4fde5c883645 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:06.55544012 +0000 UTC m=+35.722767418 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls") pod "image-registry-579c67b56b-drvmr" (UID: "79008782-6b97-4390-87b3-4fde5c883645") : secret "image-registry-tls" not found
Apr 23 17:59:05.656282 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:05.656250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:05.656430 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.656411 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:05.656509 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:05.656494 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls podName:bbb38a01-f704-4497-b3c1-20236e4e4f23 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:06.656477755 +0000 UTC m=+35.823805065 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls") pod "dns-default-6xhj9" (UID: "bbb38a01-f704-4497-b3c1-20236e4e4f23") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:06.410884 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.410708 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw"
Apr 23 17:59:06.411054 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.410765 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:59:06.411054 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.410791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 17:59:06.413561 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.413543 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 17:59:06.413687 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.413657 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 17:59:06.413687 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.413661 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qjtm5\""
Apr 23 17:59:06.413792 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.413716 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7tv5t\""
Apr 23 17:59:06.413792 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.413734 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 17:59:06.413792 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.413782 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 17:59:06.562872 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.562837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g"
Apr 23 17:59:06.563239 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.562901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:06.563239 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:06.563001 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:06.563239 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:06.563081 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert podName:e929e6ba-de02-4dcb-affd-2772d869c2e0 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:08.563059983 +0000 UTC m=+37.730387284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert") pod "ingress-canary-6pv4g" (UID: "e929e6ba-de02-4dcb-affd-2772d869c2e0") : secret "canary-serving-cert" not found
Apr 23 17:59:06.563239 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:06.563003 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:59:06.563239 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:06.563112 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579c67b56b-drvmr: secret "image-registry-tls" not found
Apr 23 17:59:06.563239 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:06.563157 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls podName:79008782-6b97-4390-87b3-4fde5c883645 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:08.563145582 +0000 UTC m=+37.730472879 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls") pod "image-registry-579c67b56b-drvmr" (UID: "79008782-6b97-4390-87b3-4fde5c883645") : secret "image-registry-tls" not found
Apr 23 17:59:06.580225 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.580199 2576 generic.go:358] "Generic (PLEG): container finished" podID="f66c5d99-c18e-417a-b05d-439bf68ddbff" containerID="a5ba63df1df8850588ecf65acc9d2da816a25ca0e6bff2a54bb952c3cf4878c6" exitCode=0
Apr 23 17:59:06.580387 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.580241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xznrt" event={"ID":"f66c5d99-c18e-417a-b05d-439bf68ddbff","Type":"ContainerDied","Data":"a5ba63df1df8850588ecf65acc9d2da816a25ca0e6bff2a54bb952c3cf4878c6"}
Apr 23 17:59:06.664387 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:06.664267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:06.664560 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:06.664430 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:06.664560 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:06.664551 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls podName:bbb38a01-f704-4497-b3c1-20236e4e4f23 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:08.66452799 +0000 UTC m=+37.831855290 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls") pod "dns-default-6xhj9" (UID: "bbb38a01-f704-4497-b3c1-20236e4e4f23") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:07.584333 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:07.584297 2576 generic.go:358] "Generic (PLEG): container finished" podID="f66c5d99-c18e-417a-b05d-439bf68ddbff" containerID="a4d5999496b536513fda5e673fc022b885357b78ae7b858c9f48eee48a2dcd65" exitCode=0
Apr 23 17:59:07.584707 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:07.584360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xznrt" event={"ID":"f66c5d99-c18e-417a-b05d-439bf68ddbff","Type":"ContainerDied","Data":"a4d5999496b536513fda5e673fc022b885357b78ae7b858c9f48eee48a2dcd65"}
Apr 23 17:59:08.377970 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:08.377883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:59:08.381163 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:08.381133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3fad012-152e-4084-a80b-63e1ff5a0998-original-pull-secret\") pod \"global-pull-secret-syncer-62sfd\" (UID: \"a3fad012-152e-4084-a80b-63e1ff5a0998\") " pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:59:08.526158 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:08.526122 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-62sfd"
Apr 23 17:59:08.579597 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:08.579569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g"
Apr 23 17:59:08.579757 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:08.579621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:08.579757 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:08.579718 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:59:08.579757 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:08.579722 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:08.579757 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:08.579727 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579c67b56b-drvmr: secret "image-registry-tls" not found
Apr 23 17:59:08.579997 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:08.579780 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert podName:e929e6ba-de02-4dcb-affd-2772d869c2e0 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:12.579765685 +0000 UTC m=+41.747092985 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert") pod "ingress-canary-6pv4g" (UID: "e929e6ba-de02-4dcb-affd-2772d869c2e0") : secret "canary-serving-cert" not found
Apr 23 17:59:08.579997 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:08.579796 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls podName:79008782-6b97-4390-87b3-4fde5c883645 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:12.579789957 +0000 UTC m=+41.747117254 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls") pod "image-registry-579c67b56b-drvmr" (UID: "79008782-6b97-4390-87b3-4fde5c883645") : secret "image-registry-tls" not found
Apr 23 17:59:08.588480 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:08.588444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xznrt" event={"ID":"f66c5d99-c18e-417a-b05d-439bf68ddbff","Type":"ContainerStarted","Data":"dc35a8a6eb418dedbf547299145a7c7a1a53f9f56e0a9e2aabbf8317619f3146"}
Apr 23 17:59:08.628335 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:08.628242 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xznrt" podStartSLOduration=6.1077358 podStartE2EDuration="37.628226708s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:58:33.910945525 +0000 UTC m=+3.078272834" lastFinishedPulling="2026-04-23 17:59:05.431436445 +0000 UTC m=+34.598763742" observedRunningTime="2026-04-23 17:59:08.627890832 +0000 UTC m=+37.795218152" watchObservedRunningTime="2026-04-23 17:59:08.628226708 +0000 UTC m=+37.795554026"
Apr 23 17:59:08.680960 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:08.680929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:08.681114 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:08.681081 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:08.681168 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:08.681148 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls podName:bbb38a01-f704-4497-b3c1-20236e4e4f23 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:12.681129453 +0000 UTC m=+41.848456768 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls") pod "dns-default-6xhj9" (UID: "bbb38a01-f704-4497-b3c1-20236e4e4f23") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:08.706683 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:08.706651 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-62sfd"]
Apr 23 17:59:08.711252 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:59:08.711216 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3fad012_152e_4084_a80b_63e1ff5a0998.slice/crio-ea5308dd92a372b23c0361e45595115ae4dbe06b3cf4bfa32ebcba1af128eb33 WatchSource:0}: Error finding container ea5308dd92a372b23c0361e45595115ae4dbe06b3cf4bfa32ebcba1af128eb33: Status 404 returned error can't find the container with id ea5308dd92a372b23c0361e45595115ae4dbe06b3cf4bfa32ebcba1af128eb33
Apr 23 17:59:09.591732 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:09.591498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-62sfd" event={"ID":"a3fad012-152e-4084-a80b-63e1ff5a0998","Type":"ContainerStarted","Data":"ea5308dd92a372b23c0361e45595115ae4dbe06b3cf4bfa32ebcba1af128eb33"}
Apr 23 17:59:12.598786 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:12.598757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-62sfd" event={"ID":"a3fad012-152e-4084-a80b-63e1ff5a0998","Type":"ContainerStarted","Data":"7d4add4d6b23fc1b6a425b4342a84905d08d18026341fdb67d97976b9059ffcd"}
Apr 23 17:59:12.614629 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:12.614607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:12.614716 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:12.614675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g"
Apr 23 17:59:12.614813 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:12.614743 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:59:12.614813 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:12.614757 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:12.614813 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:12.614759 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579c67b56b-drvmr: secret "image-registry-tls" not found
Apr 23 17:59:12.614813 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:12.614802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert podName:e929e6ba-de02-4dcb-affd-2772d869c2e0 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:20.614789406 +0000 UTC m=+49.782116703 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert") pod "ingress-canary-6pv4g" (UID: "e929e6ba-de02-4dcb-affd-2772d869c2e0") : secret "canary-serving-cert" not found
Apr 23 17:59:12.614950 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:12.614816 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls podName:79008782-6b97-4390-87b3-4fde5c883645 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:20.614808559 +0000 UTC m=+49.782135856 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls") pod "image-registry-579c67b56b-drvmr" (UID: "79008782-6b97-4390-87b3-4fde5c883645") : secret "image-registry-tls" not found
Apr 23 17:59:12.618081 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:12.618043 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-62sfd" podStartSLOduration=33.153929989 podStartE2EDuration="36.618029928s" podCreationTimestamp="2026-04-23 17:58:36 +0000 UTC" firstStartedPulling="2026-04-23 17:59:08.713134131 +0000 UTC m=+37.880461442" lastFinishedPulling="2026-04-23 17:59:12.177234083 +0000 UTC m=+41.344561381" observedRunningTime="2026-04-23 17:59:12.616684899 +0000 UTC m=+41.784012238" watchObservedRunningTime="2026-04-23 17:59:12.618029928 +0000 UTC m=+41.785357246"
Apr 23 17:59:12.714954 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:12.714920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9"
Apr 23 17:59:12.715076 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:12.715032 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:59:12.715110 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:12.715085 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls podName:bbb38a01-f704-4497-b3c1-20236e4e4f23 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:20.715070999 +0000 UTC m=+49.882398300 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls") pod "dns-default-6xhj9" (UID: "bbb38a01-f704-4497-b3c1-20236e4e4f23") : secret "dns-default-metrics-tls" not found
Apr 23 17:59:20.670728 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:20.670683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g"
Apr 23 17:59:20.671216 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:20.670757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 17:59:20.671216 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:20.670835 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:59:20.671216 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:20.670891 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert podName:e929e6ba-de02-4dcb-affd-2772d869c2e0 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:36.670876846 +0000 UTC m=+65.838204143 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert") pod "ingress-canary-6pv4g" (UID: "e929e6ba-de02-4dcb-affd-2772d869c2e0") : secret "canary-serving-cert" not found
Apr 23 17:59:20.671216 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:20.670891 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:59:20.671216 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:20.670909 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579c67b56b-drvmr: secret "image-registry-tls" not found
Apr 23 17:59:20.671216 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:20.670971 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls podName:79008782-6b97-4390-87b3-4fde5c883645 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:36.670954518 +0000 UTC m=+65.838281832 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls") pod "image-registry-579c67b56b-drvmr" (UID: "79008782-6b97-4390-87b3-4fde5c883645") : secret "image-registry-tls" not found Apr 23 17:59:20.771879 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:20.771840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9" Apr 23 17:59:20.772046 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:20.771977 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:59:20.772046 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:20.772040 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls podName:bbb38a01-f704-4497-b3c1-20236e4e4f23 nodeName:}" failed. No retries permitted until 2026-04-23 17:59:36.772024672 +0000 UTC m=+65.939351968 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls") pod "dns-default-6xhj9" (UID: "bbb38a01-f704-4497-b3c1-20236e4e4f23") : secret "dns-default-metrics-tls" not found Apr 23 17:59:30.573768 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:30.573734 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7xtlp" Apr 23 17:59:36.687358 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:36.687316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g" Apr 23 17:59:36.687838 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:36.687387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 17:59:36.687838 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:36.687490 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:59:36.687838 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:36.687531 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:59:36.687838 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:36.687543 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579c67b56b-drvmr: secret "image-registry-tls" not found Apr 23 17:59:36.687838 ip-10-0-137-157 
kubenswrapper[2576]: E0423 17:59:36.687571 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert podName:e929e6ba-de02-4dcb-affd-2772d869c2e0 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:08.687551057 +0000 UTC m=+97.854878377 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert") pod "ingress-canary-6pv4g" (UID: "e929e6ba-de02-4dcb-affd-2772d869c2e0") : secret "canary-serving-cert" not found Apr 23 17:59:36.687838 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:36.687591 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls podName:79008782-6b97-4390-87b3-4fde5c883645 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:08.687579604 +0000 UTC m=+97.854906918 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls") pod "image-registry-579c67b56b-drvmr" (UID: "79008782-6b97-4390-87b3-4fde5c883645") : secret "image-registry-tls" not found Apr 23 17:59:36.788303 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:36.788263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9" Apr 23 17:59:36.788450 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:36.788409 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:59:36.788512 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:36.788491 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls podName:bbb38a01-f704-4497-b3c1-20236e4e4f23 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:08.788475283 +0000 UTC m=+97.955802593 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls") pod "dns-default-6xhj9" (UID: "bbb38a01-f704-4497-b3c1-20236e4e4f23") : secret "dns-default-metrics-tls" not found Apr 23 17:59:37.090319 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.090278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 17:59:37.092358 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.092339 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:59:37.101482 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:37.101450 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:59:37.101563 ip-10-0-137-157 kubenswrapper[2576]: E0423 17:59:37.101531 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs podName:ab8ad387-1bfb-42ab-ad18-c8ea20362f8f nodeName:}" failed. No retries permitted until 2026-04-23 18:00:41.101517437 +0000 UTC m=+130.268844740 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs") pod "network-metrics-daemon-q98mx" (UID: "ab8ad387-1bfb-42ab-ad18-c8ea20362f8f") : secret "metrics-daemon-secret" not found Apr 23 17:59:37.190663 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.190618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:59:37.192991 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.192973 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:59:37.204017 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.204000 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:59:37.216052 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.216018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9wkp\" (UniqueName: \"kubernetes.io/projected/8f6f3666-ed32-45c7-8804-c3a7951d4815-kube-api-access-x9wkp\") pod \"network-check-target-58nsw\" (UID: \"8f6f3666-ed32-45c7-8804-c3a7951d4815\") " pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:59:37.323481 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.323440 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qjtm5\"" Apr 23 17:59:37.331686 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.331666 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 17:59:37.444935 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.444896 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-58nsw"] Apr 23 17:59:37.448502 ip-10-0-137-157 kubenswrapper[2576]: W0423 17:59:37.448472 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6f3666_ed32_45c7_8804_c3a7951d4815.slice/crio-2d014dd357d37aabfdc5ac655a70db159ef48a33a2db825b88234c85276ace02 WatchSource:0}: Error finding container 2d014dd357d37aabfdc5ac655a70db159ef48a33a2db825b88234c85276ace02: Status 404 returned error can't find the container with id 2d014dd357d37aabfdc5ac655a70db159ef48a33a2db825b88234c85276ace02 Apr 23 17:59:37.646561 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:37.646474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-58nsw" event={"ID":"8f6f3666-ed32-45c7-8804-c3a7951d4815","Type":"ContainerStarted","Data":"2d014dd357d37aabfdc5ac655a70db159ef48a33a2db825b88234c85276ace02"} Apr 23 17:59:40.653225 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:40.653193 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-58nsw" event={"ID":"8f6f3666-ed32-45c7-8804-c3a7951d4815","Type":"ContainerStarted","Data":"041776d1de5dacf16efddd9aed3baa3066d5ab7e8ef2e71779c9d74764d8a5c1"} Apr 23 17:59:40.653612 ip-10-0-137-157 kubenswrapper[2576]: I0423 17:59:40.653323 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 18:00:08.719432 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:08.719395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g" Apr 23 18:00:08.719861 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:08.719442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 18:00:08.719861 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:08.719550 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 18:00:08.719861 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:08.719551 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 18:00:08.719861 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:08.719562 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579c67b56b-drvmr: secret "image-registry-tls" not found Apr 23 18:00:08.719861 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:08.719612 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert podName:e929e6ba-de02-4dcb-affd-2772d869c2e0 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:12.719596794 +0000 UTC m=+161.886924092 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert") pod "ingress-canary-6pv4g" (UID: "e929e6ba-de02-4dcb-affd-2772d869c2e0") : secret "canary-serving-cert" not found Apr 23 18:00:08.719861 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:08.719627 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls podName:79008782-6b97-4390-87b3-4fde5c883645 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:12.719620367 +0000 UTC m=+161.886947663 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls") pod "image-registry-579c67b56b-drvmr" (UID: "79008782-6b97-4390-87b3-4fde5c883645") : secret "image-registry-tls" not found Apr 23 18:00:08.820647 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:08.820602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9" Apr 23 18:00:08.820809 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:08.820748 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 18:00:08.820809 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:08.820807 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls podName:bbb38a01-f704-4497-b3c1-20236e4e4f23 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:12.820793622 +0000 UTC m=+161.988120919 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls") pod "dns-default-6xhj9" (UID: "bbb38a01-f704-4497-b3c1-20236e4e4f23") : secret "dns-default-metrics-tls" not found Apr 23 18:00:11.657888 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:11.657860 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-58nsw" Apr 23 18:00:11.673555 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:11.673507 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-58nsw" podStartSLOduration=98.028495337 podStartE2EDuration="1m40.67349238s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 17:59:37.450403629 +0000 UTC m=+66.617730927" lastFinishedPulling="2026-04-23 17:59:40.095400673 +0000 UTC m=+69.262727970" observedRunningTime="2026-04-23 17:59:40.671072864 +0000 UTC m=+69.838400185" watchObservedRunningTime="2026-04-23 18:00:11.67349238 +0000 UTC m=+100.840819698" Apr 23 18:00:36.624769 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.624734 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl"] Apr 23 18:00:36.628910 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.628895 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" Apr 23 18:00:36.630828 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.630801 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 23 18:00:36.630828 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.630816 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 23 18:00:36.631312 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.631293 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 23 18:00:36.631312 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.631301 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-l6dn9\"" Apr 23 18:00:36.637681 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.637661 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl"] Apr 23 18:00:36.721574 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.721538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: \"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" Apr 23 18:00:36.721756 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.721651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2529c\" (UniqueName: 
\"kubernetes.io/projected/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-kube-api-access-2529c\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: \"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" Apr 23 18:00:36.730583 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.730547 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mw95w"] Apr 23 18:00:36.733431 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.733409 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-85zc7"] Apr 23 18:00:36.733582 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.733566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" Apr 23 18:00:36.735572 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.735553 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2tc65\"" Apr 23 18:00:36.735692 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.735600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 18:00:36.735692 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.735622 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 18:00:36.736349 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.736331 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-85zc7" Apr 23 18:00:36.738053 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.738036 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 18:00:36.738142 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.738067 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 18:00:36.738323 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.738310 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 18:00:36.738433 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.738415 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2rdxl\"" Apr 23 18:00:36.738741 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.738719 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 18:00:36.744123 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.744105 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mw95w"] Apr 23 18:00:36.745034 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.745015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-85zc7"] Apr 23 18:00:36.746103 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.746084 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 18:00:36.822888 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.822855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kngn2\" (UniqueName: 
\"kubernetes.io/projected/23e40766-68ea-4cb5-b83f-5a64e8740c67-kube-api-access-kngn2\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7" Apr 23 18:00:36.822888 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.822891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed805c43-a1a5-4865-9b28-5ecd8393eece-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" Apr 23 18:00:36.823088 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.822912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e40766-68ea-4cb5-b83f-5a64e8740c67-service-ca-bundle\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7" Apr 23 18:00:36.823396 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.823290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2529c\" (UniqueName: \"kubernetes.io/projected/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-kube-api-access-2529c\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: \"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" Apr 23 18:00:36.823562 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.823511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: 
\"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" Apr 23 18:00:36.823633 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:36.823589 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 18:00:36.823633 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.823592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e40766-68ea-4cb5-b83f-5a64e8740c67-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7" Apr 23 18:00:36.823729 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.823637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e40766-68ea-4cb5-b83f-5a64e8740c67-serving-cert\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7" Apr 23 18:00:36.823729 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:36.823672 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls podName:9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:37.323651855 +0000 UTC m=+126.490979156 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qtzgl" (UID: "9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283") : secret "samples-operator-tls" not found Apr 23 18:00:36.823828 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.823738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e40766-68ea-4cb5-b83f-5a64e8740c67-tmp\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7" Apr 23 18:00:36.823828 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.823775 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/23e40766-68ea-4cb5-b83f-5a64e8740c67-snapshots\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7" Apr 23 18:00:36.824030 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.823920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" Apr 23 18:00:36.834615 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.834584 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"] Apr 23 18:00:36.837405 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.837386 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"] Apr 23 18:00:36.837575 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.837556 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t" Apr 23 18:00:36.839413 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.839387 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 23 18:00:36.840056 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.840040 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp" Apr 23 18:00:36.840500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.840453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 23 18:00:36.840647 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.840628 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 18:00:36.840718 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.840706 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 18:00:36.840782 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.840770 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-d4tqm\"" Apr 23 18:00:36.841373 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.841352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2529c\" (UniqueName: \"kubernetes.io/projected/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-kube-api-access-2529c\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: 
\"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl"
Apr 23 18:00:36.843801 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.843781 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 18:00:36.843898 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.843816 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 18:00:36.843898 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.843819 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 18:00:36.844141 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.844126 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 18:00:36.844364 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.844349 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-tzgqs\""
Apr 23 18:00:36.847615 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.847596 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"]
Apr 23 18:00:36.850526 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.850507 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"]
Apr 23 18:00:36.924774 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.924688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:36.924774 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.924725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcsdf\" (UniqueName: \"kubernetes.io/projected/275ce576-f50c-4c52-aa60-875645871e66-kube-api-access-pcsdf\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:36.924966 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.924804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e40766-68ea-4cb5-b83f-5a64e8740c67-serving-cert\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.924966 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.924844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/275ce576-f50c-4c52-aa60-875645871e66-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:36.924966 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.924877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e40766-68ea-4cb5-b83f-5a64e8740c67-tmp\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.924966 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.924911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e40766-68ea-4cb5-b83f-5a64e8740c67-service-ca-bundle\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.925134 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.925051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kngn2\" (UniqueName: \"kubernetes.io/projected/23e40766-68ea-4cb5-b83f-5a64e8740c67-kube-api-access-kngn2\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.925134 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.925093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e40766-68ea-4cb5-b83f-5a64e8740c67-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.925134 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.925117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w"
Apr 23 18:00:36.925266 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.925136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/23e40766-68ea-4cb5-b83f-5a64e8740c67-snapshots\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.925266 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.925163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed805c43-a1a5-4865-9b28-5ecd8393eece-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w"
Apr 23 18:00:36.925266 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:36.925226 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 18:00:36.925393 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:36.925302 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert podName:ed805c43-a1a5-4865-9b28-5ecd8393eece nodeName:}" failed. No retries permitted until 2026-04-23 18:00:37.425282548 +0000 UTC m=+126.592609845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mw95w" (UID: "ed805c43-a1a5-4865-9b28-5ecd8393eece") : secret "networking-console-plugin-cert" not found
Apr 23 18:00:36.925393 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.925353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e40766-68ea-4cb5-b83f-5a64e8740c67-tmp\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.925630 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.925607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e40766-68ea-4cb5-b83f-5a64e8740c67-service-ca-bundle\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.925773 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.925755 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed805c43-a1a5-4865-9b28-5ecd8393eece-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w"
Apr 23 18:00:36.925858 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.925843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e40766-68ea-4cb5-b83f-5a64e8740c67-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.926117 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.926096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/23e40766-68ea-4cb5-b83f-5a64e8740c67-snapshots\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.927363 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.927340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e40766-68ea-4cb5-b83f-5a64e8740c67-serving-cert\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:36.933534 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:36.933506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kngn2\" (UniqueName: \"kubernetes.io/projected/23e40766-68ea-4cb5-b83f-5a64e8740c67-kube-api-access-kngn2\") pod \"insights-operator-585dfdc468-85zc7\" (UID: \"23e40766-68ea-4cb5-b83f-5a64e8740c67\") " pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:37.026473 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.026420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e4a548-d326-40aa-879e-60213d8f6fc5-config\") pod \"service-ca-operator-d6fc45fc5-fkmzp\" (UID: \"69e4a548-d326-40aa-879e-60213d8f6fc5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.026654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.026510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:37.026654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.026531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcsdf\" (UniqueName: \"kubernetes.io/projected/275ce576-f50c-4c52-aa60-875645871e66-kube-api-access-pcsdf\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:37.026654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.026570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/275ce576-f50c-4c52-aa60-875645871e66-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:37.026654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.026609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dj5\" (UniqueName: \"kubernetes.io/projected/69e4a548-d326-40aa-879e-60213d8f6fc5-kube-api-access-k4dj5\") pod \"service-ca-operator-d6fc45fc5-fkmzp\" (UID: \"69e4a548-d326-40aa-879e-60213d8f6fc5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.026654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.026627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e4a548-d326-40aa-879e-60213d8f6fc5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fkmzp\" (UID: \"69e4a548-d326-40aa-879e-60213d8f6fc5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.026810 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:37.026657 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 18:00:37.026810 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:37.026760 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls podName:275ce576-f50c-4c52-aa60-875645871e66 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:37.526740215 +0000 UTC m=+126.694067699 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f8b7t" (UID: "275ce576-f50c-4c52-aa60-875645871e66") : secret "cluster-monitoring-operator-tls" not found
Apr 23 18:00:37.027300 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.027278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/275ce576-f50c-4c52-aa60-875645871e66-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:37.037851 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.037815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcsdf\" (UniqueName: \"kubernetes.io/projected/275ce576-f50c-4c52-aa60-875645871e66-kube-api-access-pcsdf\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:37.049493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.049451 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-85zc7"
Apr 23 18:00:37.128032 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.127999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dj5\" (UniqueName: \"kubernetes.io/projected/69e4a548-d326-40aa-879e-60213d8f6fc5-kube-api-access-k4dj5\") pod \"service-ca-operator-d6fc45fc5-fkmzp\" (UID: \"69e4a548-d326-40aa-879e-60213d8f6fc5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.128149 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.128041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e4a548-d326-40aa-879e-60213d8f6fc5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fkmzp\" (UID: \"69e4a548-d326-40aa-879e-60213d8f6fc5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.128209 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.128194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e4a548-d326-40aa-879e-60213d8f6fc5-config\") pod \"service-ca-operator-d6fc45fc5-fkmzp\" (UID: \"69e4a548-d326-40aa-879e-60213d8f6fc5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.129143 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.129117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e4a548-d326-40aa-879e-60213d8f6fc5-config\") pod \"service-ca-operator-d6fc45fc5-fkmzp\" (UID: \"69e4a548-d326-40aa-879e-60213d8f6fc5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.130613 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.130593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e4a548-d326-40aa-879e-60213d8f6fc5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fkmzp\" (UID: \"69e4a548-d326-40aa-879e-60213d8f6fc5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.137685 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.137664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dj5\" (UniqueName: \"kubernetes.io/projected/69e4a548-d326-40aa-879e-60213d8f6fc5-kube-api-access-k4dj5\") pod \"service-ca-operator-d6fc45fc5-fkmzp\" (UID: \"69e4a548-d326-40aa-879e-60213d8f6fc5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.157839 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.157814 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"
Apr 23 18:00:37.164497 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.164474 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-85zc7"]
Apr 23 18:00:37.168776 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:00:37.168747 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e40766_68ea_4cb5_b83f_5a64e8740c67.slice/crio-557845d033e21524327662fa6ef2a9e974bad711428a34226a85bf69ac5e4070 WatchSource:0}: Error finding container 557845d033e21524327662fa6ef2a9e974bad711428a34226a85bf69ac5e4070: Status 404 returned error can't find the container with id 557845d033e21524327662fa6ef2a9e974bad711428a34226a85bf69ac5e4070
Apr 23 18:00:37.272793 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.272764 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp"]
Apr 23 18:00:37.275496 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:00:37.275443 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e4a548_d326_40aa_879e_60213d8f6fc5.slice/crio-fb3e9bc3660e5e2b5634522147490fd8c29529bd725cd46b67a14ac4a3de820e WatchSource:0}: Error finding container fb3e9bc3660e5e2b5634522147490fd8c29529bd725cd46b67a14ac4a3de820e: Status 404 returned error can't find the container with id fb3e9bc3660e5e2b5634522147490fd8c29529bd725cd46b67a14ac4a3de820e
Apr 23 18:00:37.330681 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.330637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: \"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl"
Apr 23 18:00:37.330850 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:37.330797 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 18:00:37.330903 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:37.330871 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls podName:9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:38.330851238 +0000 UTC m=+127.498178535 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qtzgl" (UID: "9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283") : secret "samples-operator-tls" not found
Apr 23 18:00:37.431565 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.431525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w"
Apr 23 18:00:37.431739 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:37.431654 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 18:00:37.431739 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:37.431733 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert podName:ed805c43-a1a5-4865-9b28-5ecd8393eece nodeName:}" failed. No retries permitted until 2026-04-23 18:00:38.431714822 +0000 UTC m=+127.599042122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mw95w" (UID: "ed805c43-a1a5-4865-9b28-5ecd8393eece") : secret "networking-console-plugin-cert" not found
Apr 23 18:00:37.532621 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.532533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:37.532766 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:37.532676 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 18:00:37.532766 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:37.532743 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls podName:275ce576-f50c-4c52-aa60-875645871e66 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:38.532727682 +0000 UTC m=+127.700054978 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f8b7t" (UID: "275ce576-f50c-4c52-aa60-875645871e66") : secret "cluster-monitoring-operator-tls" not found
Apr 23 18:00:37.761000 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.760952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp" event={"ID":"69e4a548-d326-40aa-879e-60213d8f6fc5","Type":"ContainerStarted","Data":"fb3e9bc3660e5e2b5634522147490fd8c29529bd725cd46b67a14ac4a3de820e"}
Apr 23 18:00:37.762372 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:37.762345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-85zc7" event={"ID":"23e40766-68ea-4cb5-b83f-5a64e8740c67","Type":"ContainerStarted","Data":"557845d033e21524327662fa6ef2a9e974bad711428a34226a85bf69ac5e4070"}
Apr 23 18:00:38.339961 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:38.339918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: \"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl"
Apr 23 18:00:38.340182 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:38.340080 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 18:00:38.340182 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:38.340159 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls podName:9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:40.340139315 +0000 UTC m=+129.507466619 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qtzgl" (UID: "9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283") : secret "samples-operator-tls" not found
Apr 23 18:00:38.440705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:38.440667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w"
Apr 23 18:00:38.440894 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:38.440838 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 18:00:38.440957 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:38.440939 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert podName:ed805c43-a1a5-4865-9b28-5ecd8393eece nodeName:}" failed. No retries permitted until 2026-04-23 18:00:40.440917012 +0000 UTC m=+129.608244320 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mw95w" (UID: "ed805c43-a1a5-4865-9b28-5ecd8393eece") : secret "networking-console-plugin-cert" not found
Apr 23 18:00:38.541709 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:38.541670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:38.541885 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:38.541850 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 18:00:38.541943 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:38.541932 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls podName:275ce576-f50c-4c52-aa60-875645871e66 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:40.54191083 +0000 UTC m=+129.709238143 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f8b7t" (UID: "275ce576-f50c-4c52-aa60-875645871e66") : secret "cluster-monitoring-operator-tls" not found
Apr 23 18:00:40.358734 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.358696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: \"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl"
Apr 23 18:00:40.359188 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:40.358869 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 18:00:40.359188 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:40.358942 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls podName:9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:44.358925382 +0000 UTC m=+133.526252680 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qtzgl" (UID: "9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283") : secret "samples-operator-tls" not found
Apr 23 18:00:40.459736 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.459698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w"
Apr 23 18:00:40.459928 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:40.459865 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 18:00:40.459986 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:40.459933 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert podName:ed805c43-a1a5-4865-9b28-5ecd8393eece nodeName:}" failed. No retries permitted until 2026-04-23 18:00:44.459915109 +0000 UTC m=+133.627242419 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mw95w" (UID: "ed805c43-a1a5-4865-9b28-5ecd8393eece") : secret "networking-console-plugin-cert" not found
Apr 23 18:00:40.560829 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.560794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"
Apr 23 18:00:40.561015 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:40.560978 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 18:00:40.561083 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:40.561071 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls podName:275ce576-f50c-4c52-aa60-875645871e66 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:44.561050541 +0000 UTC m=+133.728377852 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f8b7t" (UID: "275ce576-f50c-4c52-aa60-875645871e66") : secret "cluster-monitoring-operator-tls" not found
Apr 23 18:00:40.770258 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.770172 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp" event={"ID":"69e4a548-d326-40aa-879e-60213d8f6fc5","Type":"ContainerStarted","Data":"c30a67fc4401c3a58e4f6c5a315e6fca2820934a7bf4fe9b466e9f4b57817bff"}
Apr 23 18:00:40.771535 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.771508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-85zc7" event={"ID":"23e40766-68ea-4cb5-b83f-5a64e8740c67","Type":"ContainerStarted","Data":"483ee6f9495db7928dc5685339e0f4a852d6115c2a6111227148ccd083c9c30f"}
Apr 23 18:00:40.789294 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.789251 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp" podStartSLOduration=2.051872755 podStartE2EDuration="4.789237338s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="2026-04-23 18:00:37.277259824 +0000 UTC m=+126.444587121" lastFinishedPulling="2026-04-23 18:00:40.0146244 +0000 UTC m=+129.181951704" observedRunningTime="2026-04-23 18:00:40.787498588 +0000 UTC m=+129.954825906" watchObservedRunningTime="2026-04-23 18:00:40.789237338 +0000 UTC m=+129.956564656"
Apr 23 18:00:40.808883 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.808829 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-85zc7" podStartSLOduration=1.9674016060000001 podStartE2EDuration="4.808811819s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="2026-04-23 18:00:37.170366855 +0000 UTC m=+126.337694152" lastFinishedPulling="2026-04-23 18:00:40.011777068 +0000 UTC m=+129.179104365" observedRunningTime="2026-04-23 18:00:40.806964612 +0000 UTC m=+129.974291937" watchObservedRunningTime="2026-04-23 18:00:40.808811819 +0000 UTC m=+129.976139139"
Apr 23 18:00:40.954339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.954300 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb"]
Apr 23 18:00:40.957788 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.957772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb"
Apr 23 18:00:40.960578 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.960553 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 23 18:00:40.961111 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.961093 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 23 18:00:40.961209 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.961188 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-gf94q\""
Apr 23 18:00:40.982111 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:40.982090 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb"]
Apr 23 18:00:41.066026 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:41.065996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvjrf\" (UniqueName: \"kubernetes.io/projected/68703c38-8262-4cea-8aa5-15f64ae66fa1-kube-api-access-dvjrf\") pod \"migrator-74bb7799d9-d4pdb\" (UID: \"68703c38-8262-4cea-8aa5-15f64ae66fa1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb"
Apr 23 18:00:41.166792 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:41.166758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 18:00:41.166945 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:41.166826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvjrf\" (UniqueName: \"kubernetes.io/projected/68703c38-8262-4cea-8aa5-15f64ae66fa1-kube-api-access-dvjrf\") pod \"migrator-74bb7799d9-d4pdb\" (UID: \"68703c38-8262-4cea-8aa5-15f64ae66fa1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb"
Apr 23 18:00:41.166945 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:41.166911 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 18:00:41.167027 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:41.166981 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs podName:ab8ad387-1bfb-42ab-ad18-c8ea20362f8f nodeName:}" failed. No retries permitted until 2026-04-23 18:02:43.166962583 +0000 UTC m=+252.334289899 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs") pod "network-metrics-daemon-q98mx" (UID: "ab8ad387-1bfb-42ab-ad18-c8ea20362f8f") : secret "metrics-daemon-secret" not found Apr 23 18:00:41.175068 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:41.175039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvjrf\" (UniqueName: \"kubernetes.io/projected/68703c38-8262-4cea-8aa5-15f64ae66fa1-kube-api-access-dvjrf\") pod \"migrator-74bb7799d9-d4pdb\" (UID: \"68703c38-8262-4cea-8aa5-15f64ae66fa1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb" Apr 23 18:00:41.267442 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:41.267405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb" Apr 23 18:00:41.391666 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:41.391637 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb"] Apr 23 18:00:41.396330 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:00:41.396301 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68703c38_8262_4cea_8aa5_15f64ae66fa1.slice/crio-ce8155684704975c8dbd5d704efa07dbae468b7d9a37b5a892655ba49f31bc4a WatchSource:0}: Error finding container ce8155684704975c8dbd5d704efa07dbae468b7d9a37b5a892655ba49f31bc4a: Status 404 returned error can't find the container with id ce8155684704975c8dbd5d704efa07dbae468b7d9a37b5a892655ba49f31bc4a Apr 23 18:00:41.774389 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:41.774310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb" 
event={"ID":"68703c38-8262-4cea-8aa5-15f64ae66fa1","Type":"ContainerStarted","Data":"ce8155684704975c8dbd5d704efa07dbae468b7d9a37b5a892655ba49f31bc4a"} Apr 23 18:00:42.780185 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:42.780141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb" event={"ID":"68703c38-8262-4cea-8aa5-15f64ae66fa1","Type":"ContainerStarted","Data":"7eb60d2a0985bf44d2cde981d92e41e211a15b738b4704cf6e84a19dd60302c0"} Apr 23 18:00:42.780185 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:42.780182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb" event={"ID":"68703c38-8262-4cea-8aa5-15f64ae66fa1","Type":"ContainerStarted","Data":"b7381b8bb95dcaad558373496f5a04fb7afec8b040012e294383da6222b56faf"} Apr 23 18:00:42.801544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:42.801493 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-d4pdb" podStartSLOduration=1.579432969 podStartE2EDuration="2.801454063s" podCreationTimestamp="2026-04-23 18:00:40 +0000 UTC" firstStartedPulling="2026-04-23 18:00:41.398651125 +0000 UTC m=+130.565978425" lastFinishedPulling="2026-04-23 18:00:42.620672222 +0000 UTC m=+131.787999519" observedRunningTime="2026-04-23 18:00:42.801292637 +0000 UTC m=+131.968619959" watchObservedRunningTime="2026-04-23 18:00:42.801454063 +0000 UTC m=+131.968781384" Apr 23 18:00:44.187816 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.187781 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ltdx2"] Apr 23 18:00:44.190868 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.190846 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.193112 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.193082 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 18:00:44.193112 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.193093 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 18:00:44.193112 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.193104 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-24pbt\"" Apr 23 18:00:44.193673 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.193654 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 18:00:44.193817 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.193701 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 18:00:44.200377 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.200357 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ltdx2"] Apr 23 18:00:44.289800 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.289761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6b6572e0-def6-469c-82e4-6a0f6176a8fc-signing-cabundle\") pod \"service-ca-865cb79987-ltdx2\" (UID: \"6b6572e0-def6-469c-82e4-6a0f6176a8fc\") " pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.289800 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.289801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/6b6572e0-def6-469c-82e4-6a0f6176a8fc-signing-key\") pod \"service-ca-865cb79987-ltdx2\" (UID: \"6b6572e0-def6-469c-82e4-6a0f6176a8fc\") " pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.289994 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.289920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nskrj\" (UniqueName: \"kubernetes.io/projected/6b6572e0-def6-469c-82e4-6a0f6176a8fc-kube-api-access-nskrj\") pod \"service-ca-865cb79987-ltdx2\" (UID: \"6b6572e0-def6-469c-82e4-6a0f6176a8fc\") " pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.390359 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.390323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nskrj\" (UniqueName: \"kubernetes.io/projected/6b6572e0-def6-469c-82e4-6a0f6176a8fc-kube-api-access-nskrj\") pod \"service-ca-865cb79987-ltdx2\" (UID: \"6b6572e0-def6-469c-82e4-6a0f6176a8fc\") " pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.390579 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.390384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6b6572e0-def6-469c-82e4-6a0f6176a8fc-signing-cabundle\") pod \"service-ca-865cb79987-ltdx2\" (UID: \"6b6572e0-def6-469c-82e4-6a0f6176a8fc\") " pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.390579 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.390408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6b6572e0-def6-469c-82e4-6a0f6176a8fc-signing-key\") pod \"service-ca-865cb79987-ltdx2\" (UID: \"6b6572e0-def6-469c-82e4-6a0f6176a8fc\") " pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.390579 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:00:44.390430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: \"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" Apr 23 18:00:44.390579 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:44.390539 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 18:00:44.390786 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:44.390602 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls podName:9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:52.390586827 +0000 UTC m=+141.557914124 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qtzgl" (UID: "9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283") : secret "samples-operator-tls" not found Apr 23 18:00:44.391034 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.391015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6b6572e0-def6-469c-82e4-6a0f6176a8fc-signing-cabundle\") pod \"service-ca-865cb79987-ltdx2\" (UID: \"6b6572e0-def6-469c-82e4-6a0f6176a8fc\") " pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.393051 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.393029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6b6572e0-def6-469c-82e4-6a0f6176a8fc-signing-key\") pod \"service-ca-865cb79987-ltdx2\" (UID: \"6b6572e0-def6-469c-82e4-6a0f6176a8fc\") " pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.398499 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.398478 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nskrj\" (UniqueName: \"kubernetes.io/projected/6b6572e0-def6-469c-82e4-6a0f6176a8fc-kube-api-access-nskrj\") pod \"service-ca-865cb79987-ltdx2\" (UID: \"6b6572e0-def6-469c-82e4-6a0f6176a8fc\") " pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.491665 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.491587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" Apr 23 18:00:44.491804 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:44.491685 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 18:00:44.491804 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:44.491738 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert podName:ed805c43-a1a5-4865-9b28-5ecd8393eece nodeName:}" failed. No retries permitted until 2026-04-23 18:00:52.491724552 +0000 UTC m=+141.659051848 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mw95w" (UID: "ed805c43-a1a5-4865-9b28-5ecd8393eece") : secret "networking-console-plugin-cert" not found Apr 23 18:00:44.500052 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.500025 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ltdx2" Apr 23 18:00:44.592679 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.592651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t" Apr 23 18:00:44.592811 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:44.592795 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 18:00:44.592877 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:44.592867 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls podName:275ce576-f50c-4c52-aa60-875645871e66 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:52.592846599 +0000 UTC m=+141.760173896 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f8b7t" (UID: "275ce576-f50c-4c52-aa60-875645871e66") : secret "cluster-monitoring-operator-tls" not found Apr 23 18:00:44.616106 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.616079 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ltdx2"] Apr 23 18:00:44.618979 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:00:44.618938 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b6572e0_def6_469c_82e4_6a0f6176a8fc.slice/crio-7d51a11d3764b2772b135842e4a756f36fa49d015a503f8e6777027f1c6b2073 WatchSource:0}: Error finding container 7d51a11d3764b2772b135842e4a756f36fa49d015a503f8e6777027f1c6b2073: Status 404 returned error can't find the container with id 7d51a11d3764b2772b135842e4a756f36fa49d015a503f8e6777027f1c6b2073 Apr 23 18:00:44.786126 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.786038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ltdx2" event={"ID":"6b6572e0-def6-469c-82e4-6a0f6176a8fc","Type":"ContainerStarted","Data":"1a3db4fe44c6ad21e6c48b352ba149ee023a937c30753aced1d760106ee457a9"} Apr 23 18:00:44.786126 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.786077 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ltdx2" event={"ID":"6b6572e0-def6-469c-82e4-6a0f6176a8fc","Type":"ContainerStarted","Data":"7d51a11d3764b2772b135842e4a756f36fa49d015a503f8e6777027f1c6b2073"} Apr 23 18:00:44.800755 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.800710 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-ltdx2" 
podStartSLOduration=0.800696599 podStartE2EDuration="800.696599ms" podCreationTimestamp="2026-04-23 18:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:00:44.800592465 +0000 UTC m=+133.967919783" watchObservedRunningTime="2026-04-23 18:00:44.800696599 +0000 UTC m=+133.968023918" Apr 23 18:00:44.908656 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:44.908624 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-x7kw5_3ce8a6d4-d062-4813-b21e-b06b4d147b13/dns-node-resolver/0.log" Apr 23 18:00:45.709769 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:45.709741 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-k8ncf_365c5763-fac0-4121-9de2-0a669a25bc8c/node-ca/0.log" Apr 23 18:00:46.909394 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:46.909360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d4pdb_68703c38-8262-4cea-8aa5-15f64ae66fa1/migrator/0.log" Apr 23 18:00:47.113146 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:47.113110 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d4pdb_68703c38-8262-4cea-8aa5-15f64ae66fa1/graceful-termination/0.log" Apr 23 18:00:52.459698 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:52.459658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: \"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" Apr 23 18:00:52.462284 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:52.462256 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qtzgl\" (UID: \"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" Apr 23 18:00:52.538423 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:52.538390 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" Apr 23 18:00:52.560811 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:52.560775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" Apr 23 18:00:52.560969 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:52.560944 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 18:00:52.561054 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:52.561041 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert podName:ed805c43-a1a5-4865-9b28-5ecd8393eece nodeName:}" failed. No retries permitted until 2026-04-23 18:01:08.56101454 +0000 UTC m=+157.728341837 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mw95w" (UID: "ed805c43-a1a5-4865-9b28-5ecd8393eece") : secret "networking-console-plugin-cert" not found Apr 23 18:00:52.658654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:52.658626 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl"] Apr 23 18:00:52.661231 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:52.661187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t" Apr 23 18:00:52.661397 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:52.661375 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 18:00:52.661524 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:00:52.661480 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls podName:275ce576-f50c-4c52-aa60-875645871e66 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:08.661439572 +0000 UTC m=+157.828766889 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f8b7t" (UID: "275ce576-f50c-4c52-aa60-875645871e66") : secret "cluster-monitoring-operator-tls" not found Apr 23 18:00:52.805703 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:52.805610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" event={"ID":"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283","Type":"ContainerStarted","Data":"1ef8d95ab9dbb3260b6940fb7c27d31332521bb24c31bb768d1ce610b63d7db0"} Apr 23 18:00:54.813590 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:54.813550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" event={"ID":"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283","Type":"ContainerStarted","Data":"95bf8091686fa9ec717270c957451f46ebea06c138d111c550aed6f102d2128c"} Apr 23 18:00:54.813590 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:54.813590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" event={"ID":"9b9d7bc1-90b9-4ed7-9beb-f4cbfc548283","Type":"ContainerStarted","Data":"b4aa7fbac870aaa29f2bcc96b65aa04fc4ab980b7372934fe20f03645146660b"} Apr 23 18:00:54.831025 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:00:54.830977 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qtzgl" podStartSLOduration=17.236806199 podStartE2EDuration="18.830963237s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="2026-04-23 18:00:52.705676797 +0000 UTC m=+141.873004094" lastFinishedPulling="2026-04-23 18:00:54.299833821 +0000 UTC m=+143.467161132" observedRunningTime="2026-04-23 
18:00:54.830070677 +0000 UTC m=+143.997397997" watchObservedRunningTime="2026-04-23 18:00:54.830963237 +0000 UTC m=+143.998290557" Apr 23 18:01:04.800989 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.800959 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9gqzb"] Apr 23 18:01:04.803647 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.803627 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.805662 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.805637 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 18:01:04.805779 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.805762 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 18:01:04.805862 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.805848 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rmsv4\"" Apr 23 18:01:04.815026 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.815004 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rcfgz"] Apr 23 18:01:04.817376 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.817356 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9gqzb"] Apr 23 18:01:04.817512 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.817496 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rcfgz" Apr 23 18:01:04.819405 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.819383 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 18:01:04.819530 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.819453 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-lkl7m\"" Apr 23 18:01:04.819530 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.819453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 18:01:04.827729 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.827707 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rcfgz"] Apr 23 18:01:04.868396 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.868363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/881b7aa3-8be8-4f22-9712-03c2de339ea4-data-volume\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.868578 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.868447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/881b7aa3-8be8-4f22-9712-03c2de339ea4-crio-socket\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.868578 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.868491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crh6m\" (UniqueName: 
\"kubernetes.io/projected/5a9ac9a5-f676-42f2-9af1-a148cd6302d8-kube-api-access-crh6m\") pod \"downloads-6bcc868b7-rcfgz\" (UID: \"5a9ac9a5-f676-42f2-9af1-a148cd6302d8\") " pod="openshift-console/downloads-6bcc868b7-rcfgz" Apr 23 18:01:04.868578 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.868533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/881b7aa3-8be8-4f22-9712-03c2de339ea4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.868578 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.868555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/881b7aa3-8be8-4f22-9712-03c2de339ea4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.868710 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.868600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh887\" (UniqueName: \"kubernetes.io/projected/881b7aa3-8be8-4f22-9712-03c2de339ea4-kube-api-access-mh887\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.969507 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.969449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/881b7aa3-8be8-4f22-9712-03c2de339ea4-crio-socket\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " 
pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.969507 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.969506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crh6m\" (UniqueName: \"kubernetes.io/projected/5a9ac9a5-f676-42f2-9af1-a148cd6302d8-kube-api-access-crh6m\") pod \"downloads-6bcc868b7-rcfgz\" (UID: \"5a9ac9a5-f676-42f2-9af1-a148cd6302d8\") " pod="openshift-console/downloads-6bcc868b7-rcfgz" Apr 23 18:01:04.969762 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.969540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/881b7aa3-8be8-4f22-9712-03c2de339ea4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.969762 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.969564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/881b7aa3-8be8-4f22-9712-03c2de339ea4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.969762 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.969582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh887\" (UniqueName: \"kubernetes.io/projected/881b7aa3-8be8-4f22-9712-03c2de339ea4-kube-api-access-mh887\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.969762 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.969581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/881b7aa3-8be8-4f22-9712-03c2de339ea4-crio-socket\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.969762 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.969656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/881b7aa3-8be8-4f22-9712-03c2de339ea4-data-volume\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.970044 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.970025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/881b7aa3-8be8-4f22-9712-03c2de339ea4-data-volume\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.970106 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.970086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/881b7aa3-8be8-4f22-9712-03c2de339ea4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.972167 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.972144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/881b7aa3-8be8-4f22-9712-03c2de339ea4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:04.981045 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:01:04.981017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crh6m\" (UniqueName: \"kubernetes.io/projected/5a9ac9a5-f676-42f2-9af1-a148cd6302d8-kube-api-access-crh6m\") pod \"downloads-6bcc868b7-rcfgz\" (UID: \"5a9ac9a5-f676-42f2-9af1-a148cd6302d8\") " pod="openshift-console/downloads-6bcc868b7-rcfgz" Apr 23 18:01:04.981135 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:04.981087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh887\" (UniqueName: \"kubernetes.io/projected/881b7aa3-8be8-4f22-9712-03c2de339ea4-kube-api-access-mh887\") pod \"insights-runtime-extractor-9gqzb\" (UID: \"881b7aa3-8be8-4f22-9712-03c2de339ea4\") " pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:05.112473 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:05.112425 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9gqzb" Apr 23 18:01:05.126395 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:05.126364 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rcfgz" Apr 23 18:01:05.263736 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:05.263708 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9gqzb"] Apr 23 18:01:05.266858 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:05.266821 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod881b7aa3_8be8_4f22_9712_03c2de339ea4.slice/crio-699b0a6764acff9462f313647cf02372a20312cd5750a18f8c33a60209483ced WatchSource:0}: Error finding container 699b0a6764acff9462f313647cf02372a20312cd5750a18f8c33a60209483ced: Status 404 returned error can't find the container with id 699b0a6764acff9462f313647cf02372a20312cd5750a18f8c33a60209483ced Apr 23 18:01:05.279128 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:05.279108 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rcfgz"] Apr 23 18:01:05.281943 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:05.281912 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9ac9a5_f676_42f2_9af1_a148cd6302d8.slice/crio-a7125aaf3701bbddd36021879e13eb4f046eac4a1b4547825114a8b4bd23222f WatchSource:0}: Error finding container a7125aaf3701bbddd36021879e13eb4f046eac4a1b4547825114a8b4bd23222f: Status 404 returned error can't find the container with id a7125aaf3701bbddd36021879e13eb4f046eac4a1b4547825114a8b4bd23222f Apr 23 18:01:05.840494 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:05.840443 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rcfgz" event={"ID":"5a9ac9a5-f676-42f2-9af1-a148cd6302d8","Type":"ContainerStarted","Data":"a7125aaf3701bbddd36021879e13eb4f046eac4a1b4547825114a8b4bd23222f"} Apr 23 18:01:05.841727 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:05.841700 
2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gqzb" event={"ID":"881b7aa3-8be8-4f22-9712-03c2de339ea4","Type":"ContainerStarted","Data":"88e17866eb67c4fe7199942995979d107a01534317484ddee146915d0b56b6d2"} Apr 23 18:01:05.841809 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:05.841738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gqzb" event={"ID":"881b7aa3-8be8-4f22-9712-03c2de339ea4","Type":"ContainerStarted","Data":"699b0a6764acff9462f313647cf02372a20312cd5750a18f8c33a60209483ced"} Apr 23 18:01:06.846990 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:06.846947 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gqzb" event={"ID":"881b7aa3-8be8-4f22-9712-03c2de339ea4","Type":"ContainerStarted","Data":"78825c29a364cfcf9fa776f5dcc7ad8e45d0e6533e9915ce270c1d0058ce2f48"} Apr 23 18:01:07.781925 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:07.781848 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" podUID="79008782-6b97-4390-87b3-4fde5c883645" Apr 23 18:01:07.794061 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:07.794022 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6pv4g" podUID="e929e6ba-de02-4dcb-affd-2772d869c2e0" Apr 23 18:01:07.851440 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:07.851403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gqzb" 
event={"ID":"881b7aa3-8be8-4f22-9712-03c2de339ea4","Type":"ContainerStarted","Data":"691dca97e29d34600f91c4c8ef337eafdc65f1de42aac6832ae50ec81cc654ab"} Apr 23 18:01:07.851440 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:07.851439 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 18:01:07.869139 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:07.869087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9gqzb" podStartSLOduration=1.855521445 podStartE2EDuration="3.86907254s" podCreationTimestamp="2026-04-23 18:01:04 +0000 UTC" firstStartedPulling="2026-04-23 18:01:05.318310612 +0000 UTC m=+154.485637911" lastFinishedPulling="2026-04-23 18:01:07.331861698 +0000 UTC m=+156.499189006" observedRunningTime="2026-04-23 18:01:07.867804717 +0000 UTC m=+157.035132035" watchObservedRunningTime="2026-04-23 18:01:07.86907254 +0000 UTC m=+157.036399895" Apr 23 18:01:07.876575 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:07.876545 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6xhj9" podUID="bbb38a01-f704-4497-b3c1-20236e4e4f23" Apr 23 18:01:08.617964 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:08.617916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" Apr 23 18:01:08.620724 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:08.620687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed805c43-a1a5-4865-9b28-5ecd8393eece-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mw95w\" (UID: \"ed805c43-a1a5-4865-9b28-5ecd8393eece\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" Apr 23 18:01:08.719367 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:08.719325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t" Apr 23 18:01:08.722339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:08.722312 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/275ce576-f50c-4c52-aa60-875645871e66-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f8b7t\" (UID: \"275ce576-f50c-4c52-aa60-875645871e66\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t" Apr 23 18:01:08.843608 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:08.843569 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" Apr 23 18:01:08.953827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:08.953781 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t" Apr 23 18:01:08.981863 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:08.981831 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mw95w"] Apr 23 18:01:08.984331 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:08.984297 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded805c43_a1a5_4865_9b28_5ecd8393eece.slice/crio-dd9673faa4e65cf09fb3e7940c043df93e02d12b39af0f8e812ec2b271ddbc1e WatchSource:0}: Error finding container dd9673faa4e65cf09fb3e7940c043df93e02d12b39af0f8e812ec2b271ddbc1e: Status 404 returned error can't find the container with id dd9673faa4e65cf09fb3e7940c043df93e02d12b39af0f8e812ec2b271ddbc1e Apr 23 18:01:09.049431 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.049400 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-799cfbb46b-jtprt"] Apr 23 18:01:09.083454 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.083422 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799cfbb46b-jtprt"] Apr 23 18:01:09.083620 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.083612 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.086101 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.086076 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 18:01:09.086230 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.086101 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 18:01:09.086230 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.086145 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 18:01:09.086759 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.086734 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jm8xd\"" Apr 23 18:01:09.086759 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.086762 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 18:01:09.086939 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.086895 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 18:01:09.089261 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.089228 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t"] Apr 23 18:01:09.094061 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:09.094037 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod275ce576_f50c_4c52_aa60_875645871e66.slice/crio-221b3379ce3b6c62f61625c3f4d4776db3c35b68dac2ff066bb259fede600f61 WatchSource:0}: Error finding container 221b3379ce3b6c62f61625c3f4d4776db3c35b68dac2ff066bb259fede600f61: Status 404 returned error can't find 
the container with id 221b3379ce3b6c62f61625c3f4d4776db3c35b68dac2ff066bb259fede600f61 Apr 23 18:01:09.224385 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.224302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-config\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.224385 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.224363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-oauth-config\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.224627 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.224410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-service-ca\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.224627 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.224505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-oauth-serving-cert\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.224627 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.224563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-serving-cert\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.224627 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.224592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcngl\" (UniqueName: \"kubernetes.io/projected/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-kube-api-access-pcngl\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.325388 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.325341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-service-ca\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.325582 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.325399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-oauth-serving-cert\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.325654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.325587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-serving-cert\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.325654 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:01:09.325631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcngl\" (UniqueName: \"kubernetes.io/projected/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-kube-api-access-pcngl\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.325864 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.325842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-config\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.325939 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.325914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-oauth-config\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.326220 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.326194 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-oauth-serving-cert\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.326335 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.326220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-service-ca\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 
18:01:09.326500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.326454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-config\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.328699 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.328681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-serving-cert\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.332987 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.332964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-oauth-config\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.335526 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.335500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcngl\" (UniqueName: \"kubernetes.io/projected/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-kube-api-access-pcngl\") pod \"console-799cfbb46b-jtprt\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") " pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.395055 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.395023 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799cfbb46b-jtprt" Apr 23 18:01:09.430890 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:09.430849 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-q98mx" podUID="ab8ad387-1bfb-42ab-ad18-c8ea20362f8f" Apr 23 18:01:09.531059 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.531028 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799cfbb46b-jtprt"] Apr 23 18:01:09.534097 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:09.534066 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0ad8839_bf4f_402e_9298_46b9b31e1b4c.slice/crio-3356e39d7695deeaf12836b73d620f3994267152053dc876f0d35b79800eaaf7 WatchSource:0}: Error finding container 3356e39d7695deeaf12836b73d620f3994267152053dc876f0d35b79800eaaf7: Status 404 returned error can't find the container with id 3356e39d7695deeaf12836b73d620f3994267152053dc876f0d35b79800eaaf7 Apr 23 18:01:09.859620 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.859578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t" event={"ID":"275ce576-f50c-4c52-aa60-875645871e66","Type":"ContainerStarted","Data":"221b3379ce3b6c62f61625c3f4d4776db3c35b68dac2ff066bb259fede600f61"} Apr 23 18:01:09.860904 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.860870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799cfbb46b-jtprt" event={"ID":"b0ad8839-bf4f-402e-9298-46b9b31e1b4c","Type":"ContainerStarted","Data":"3356e39d7695deeaf12836b73d620f3994267152053dc876f0d35b79800eaaf7"} Apr 23 18:01:09.862068 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:09.862040 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" event={"ID":"ed805c43-a1a5-4865-9b28-5ecd8393eece","Type":"ContainerStarted","Data":"dd9673faa4e65cf09fb3e7940c043df93e02d12b39af0f8e812ec2b271ddbc1e"} Apr 23 18:01:10.868582 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:10.868542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" event={"ID":"ed805c43-a1a5-4865-9b28-5ecd8393eece","Type":"ContainerStarted","Data":"6a6b0069bd4ca1da3ad6fc671f16db50d64c3173a7db8563fc312cd41139169f"} Apr 23 18:01:10.885606 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:10.885161 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mw95w" podStartSLOduration=33.473629815 podStartE2EDuration="34.885146403s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" firstStartedPulling="2026-04-23 18:01:08.986694013 +0000 UTC m=+158.154021326" lastFinishedPulling="2026-04-23 18:01:10.398210603 +0000 UTC m=+159.565537914" observedRunningTime="2026-04-23 18:01:10.884664363 +0000 UTC m=+160.051991683" watchObservedRunningTime="2026-04-23 18:01:10.885146403 +0000 UTC m=+160.052473792" Apr 23 18:01:11.875084 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:11.875024 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t" event={"ID":"275ce576-f50c-4c52-aa60-875645871e66","Type":"ContainerStarted","Data":"af548da698446cb3c166fc65b4a1bb2512e364aaa640197f6ad70b73b9bd2803"} Apr 23 18:01:11.893999 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:11.893590 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f8b7t" podStartSLOduration=33.616265267 podStartE2EDuration="35.893572s" podCreationTimestamp="2026-04-23 18:00:36 +0000 UTC" 
firstStartedPulling="2026-04-23 18:01:09.096292273 +0000 UTC m=+158.263619575" lastFinishedPulling="2026-04-23 18:01:11.373599007 +0000 UTC m=+160.540926308" observedRunningTime="2026-04-23 18:01:11.893230182 +0000 UTC m=+161.060557501" watchObservedRunningTime="2026-04-23 18:01:11.893572 +0000 UTC m=+161.060899321" Apr 23 18:01:12.759794 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:12.759756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g" Apr 23 18:01:12.759990 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:12.759829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 18:01:12.763001 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:12.762974 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e929e6ba-de02-4dcb-affd-2772d869c2e0-cert\") pod \"ingress-canary-6pv4g\" (UID: \"e929e6ba-de02-4dcb-affd-2772d869c2e0\") " pod="openshift-ingress-canary/ingress-canary-6pv4g" Apr 23 18:01:12.763122 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:12.762979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"image-registry-579c67b56b-drvmr\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") " pod="openshift-image-registry/image-registry-579c67b56b-drvmr" Apr 23 18:01:12.861063 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:01:12.861009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9" Apr 23 18:01:12.864159 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:12.864100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bbb38a01-f704-4497-b3c1-20236e4e4f23-metrics-tls\") pod \"dns-default-6xhj9\" (UID: \"bbb38a01-f704-4497-b3c1-20236e4e4f23\") " pod="openshift-dns/dns-default-6xhj9" Apr 23 18:01:12.954171 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:12.954139 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jxfcc\"" Apr 23 18:01:12.962914 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:12.962886 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 18:01:13.110953 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:13.110910 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-579c67b56b-drvmr"]
Apr 23 18:01:13.114511 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:13.114478 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79008782_6b97_4390_87b3_4fde5c883645.slice/crio-df77b44483d3a7cc2a0315ddc8626e1de60c7684bef91d6bcb97a18328ba22c8 WatchSource:0}: Error finding container df77b44483d3a7cc2a0315ddc8626e1de60c7684bef91d6bcb97a18328ba22c8: Status 404 returned error can't find the container with id df77b44483d3a7cc2a0315ddc8626e1de60c7684bef91d6bcb97a18328ba22c8
Apr 23 18:01:13.882763 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:13.882717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799cfbb46b-jtprt" event={"ID":"b0ad8839-bf4f-402e-9298-46b9b31e1b4c","Type":"ContainerStarted","Data":"d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3"}
Apr 23 18:01:13.884128 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:13.884094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" event={"ID":"79008782-6b97-4390-87b3-4fde5c883645","Type":"ContainerStarted","Data":"85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37"}
Apr 23 18:01:13.884128 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:13.884122 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" event={"ID":"79008782-6b97-4390-87b3-4fde5c883645","Type":"ContainerStarted","Data":"df77b44483d3a7cc2a0315ddc8626e1de60c7684bef91d6bcb97a18328ba22c8"}
Apr 23 18:01:13.884313 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:13.884248 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 18:01:13.900894 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:13.900837 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-799cfbb46b-jtprt" podStartSLOduration=1.458704865 podStartE2EDuration="4.900822514s" podCreationTimestamp="2026-04-23 18:01:09 +0000 UTC" firstStartedPulling="2026-04-23 18:01:09.536157282 +0000 UTC m=+158.703484581" lastFinishedPulling="2026-04-23 18:01:12.978274927 +0000 UTC m=+162.145602230" observedRunningTime="2026-04-23 18:01:13.899957944 +0000 UTC m=+163.067285264" watchObservedRunningTime="2026-04-23 18:01:13.900822514 +0000 UTC m=+163.068149833"
Apr 23 18:01:13.917572 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:13.917518 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" podStartSLOduration=161.91750253 podStartE2EDuration="2m41.91750253s" podCreationTimestamp="2026-04-23 17:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:01:13.916754338 +0000 UTC m=+163.084081673" watchObservedRunningTime="2026-04-23 18:01:13.91750253 +0000 UTC m=+163.084829851"
Apr 23 18:01:17.643545 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.643493 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-699c97fd46-t2w9b"]
Apr 23 18:01:17.646381 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.646350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.652828 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.652803 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 18:01:17.659958 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.659933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699c97fd46-t2w9b"]
Apr 23 18:01:17.702668 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.702632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-oauth-config\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.702855 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.702677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-service-ca\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.702855 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.702721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-serving-cert\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.702855 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.702743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-trusted-ca-bundle\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.702855 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.702805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-console-config\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.702855 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.702829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-oauth-serving-cert\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.703040 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.702869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lll5\" (UniqueName: \"kubernetes.io/projected/f59971f7-15e2-4888-8554-87d9fea2a48c-kube-api-access-7lll5\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.803513 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.803475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-serving-cert\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.803705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.803523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-trusted-ca-bundle\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.803705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.803572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-console-config\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.803705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.803599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-oauth-serving-cert\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.803705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.803661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lll5\" (UniqueName: \"kubernetes.io/projected/f59971f7-15e2-4888-8554-87d9fea2a48c-kube-api-access-7lll5\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.803913 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.803722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-oauth-config\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.803913 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.803760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-service-ca\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.804442 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.804411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-console-config\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.804584 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.804515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-service-ca\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.804584 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.804523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-oauth-serving-cert\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.804584 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.804564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-trusted-ca-bundle\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.806782 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.806757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-serving-cert\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.806980 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.806957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-oauth-config\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.815849 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.815829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lll5\" (UniqueName: \"kubernetes.io/projected/f59971f7-15e2-4888-8554-87d9fea2a48c-kube-api-access-7lll5\") pod \"console-699c97fd46-t2w9b\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") " pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:17.957852 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:17.957772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:18.410887 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:18.410846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6pv4g"
Apr 23 18:01:18.413747 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:18.413722 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tsbqj\""
Apr 23 18:01:18.421221 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:18.421192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6pv4g"
Apr 23 18:01:19.395617 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:19.395575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-799cfbb46b-jtprt"
Apr 23 18:01:19.396073 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:19.395634 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-799cfbb46b-jtprt"
Apr 23 18:01:19.397300 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:19.397265 2576 patch_prober.go:28] interesting pod/console-799cfbb46b-jtprt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.18:8443/health\": dial tcp 10.132.0.18:8443: connect: connection refused" start-of-body=
Apr 23 18:01:19.397439 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:19.397318 2576 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-799cfbb46b-jtprt" podUID="b0ad8839-bf4f-402e-9298-46b9b31e1b4c" containerName="console" probeResult="failure" output="Get \"https://10.132.0.18:8443/health\": dial tcp 10.132.0.18:8443: connect: connection refused"
Apr 23 18:01:20.349434 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.349372 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cmn5h"]
Apr 23 18:01:20.362783 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.361802 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.364758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.364258 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 18:01:20.364758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.364265 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 18:01:20.364758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.364554 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sclw6\""
Apr 23 18:01:20.364758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.364591 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 18:01:20.364758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.364719 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 18:01:20.410686 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.410656 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6xhj9"
Apr 23 18:01:20.412937 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.412910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sv5q9\""
Apr 23 18:01:20.421539 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.421514 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6xhj9"
Apr 23 18:01:20.427074 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.427048 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6dhb\" (UniqueName: \"kubernetes.io/projected/03b2d537-2af8-4205-8fec-7db579b30694-kube-api-access-j6dhb\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.427227 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.427207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03b2d537-2af8-4205-8fec-7db579b30694-sys\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.427295 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.427263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03b2d537-2af8-4205-8fec-7db579b30694-metrics-client-ca\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.427354 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.427290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.427406 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.427357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-wtmp\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.427406 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.427387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/03b2d537-2af8-4205-8fec-7db579b30694-root\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.427530 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.427429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-textfile\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.427530 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.427488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.427530 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.427524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-tls\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528322 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6dhb\" (UniqueName: \"kubernetes.io/projected/03b2d537-2af8-4205-8fec-7db579b30694-kube-api-access-j6dhb\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03b2d537-2af8-4205-8fec-7db579b30694-sys\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03b2d537-2af8-4205-8fec-7db579b30694-metrics-client-ca\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-wtmp\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/03b2d537-2af8-4205-8fec-7db579b30694-root\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528897 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-textfile\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528897 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528897 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-tls\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528897 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:20.528782 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 23 18:01:20.528897 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03b2d537-2af8-4205-8fec-7db579b30694-sys\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.528897 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:20.528847 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-tls podName:03b2d537-2af8-4205-8fec-7db579b30694 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:21.028828659 +0000 UTC m=+170.196155971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-tls") pod "node-exporter-cmn5h" (UID: "03b2d537-2af8-4205-8fec-7db579b30694") : secret "node-exporter-tls" not found
Apr 23 18:01:20.529138 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.528957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-wtmp\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.529138 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.529005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/03b2d537-2af8-4205-8fec-7db579b30694-root\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.529273 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.529250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-textfile\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.529332 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.529270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03b2d537-2af8-4205-8fec-7db579b30694-metrics-client-ca\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.529419 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.529349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.532156 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.532133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:20.539326 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:20.539277 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6dhb\" (UniqueName: \"kubernetes.io/projected/03b2d537-2af8-4205-8fec-7db579b30694-kube-api-access-j6dhb\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:21.034419 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.034368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-tls\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:21.037648 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.037615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/03b2d537-2af8-4205-8fec-7db579b30694-node-exporter-tls\") pod \"node-exporter-cmn5h\" (UID: \"03b2d537-2af8-4205-8fec-7db579b30694\") " pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:21.275705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.275671 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cmn5h"
Apr 23 18:01:21.391655 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.391624 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 18:01:21.395112 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.395085 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.397344 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.397321 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 18:01:21.397483 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.397409 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 18:01:21.397546 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.397324 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 18:01:21.397607 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.397323 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 18:01:21.397683 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.397324 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 18:01:21.397853 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.397834 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 18:01:21.398057 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.398039 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 18:01:21.398122 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.398077 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 18:01:21.398122 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.398106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6lhp7\""
Apr 23 18:01:21.398122 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.398115 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 18:01:21.408394 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.408351 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 18:01:21.410617 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.410596 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx"
Apr 23 18:01:21.438693 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.438662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.438710 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.438765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.438798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcfw\" (UniqueName: \"kubernetes.io/projected/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-kube-api-access-jrcfw\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.438830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.438855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-config-volume\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.438877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.438927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-config-out\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.438963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-web-config\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.439004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.439034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.439068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.439151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.439107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.539745 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.539697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-web-config\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.539929 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.539840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.539929 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.539880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 18:01:21.539929 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.539914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName:
\"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.539955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.539995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.540025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.540084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.540121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcfw\" (UniqueName: 
\"kubernetes.io/projected/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-kube-api-access-jrcfw\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.540156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.540185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-config-volume\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.540216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.540272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-config-out\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.540714 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.541936 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:21.541302 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-alertmanager-trusted-ca-bundle podName:30ffb75d-23b2-49d1-912a-c5a2bfb24cbc nodeName:}" failed. No retries permitted until 2026-04-23 18:01:22.041280077 +0000 UTC m=+171.208607379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "30ffb75d-23b2-49d1-912a-c5a2bfb24cbc") : configmap references non-existent config key: ca-bundle.crt Apr 23 18:01:21.542388 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:21.542370 2576 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 23 18:01:21.542570 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:21.542556 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-main-tls podName:30ffb75d-23b2-49d1-912a-c5a2bfb24cbc nodeName:}" failed. No retries permitted until 2026-04-23 18:01:22.042537837 +0000 UTC m=+171.209865136 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "30ffb75d-23b2-49d1-912a-c5a2bfb24cbc") : secret "alertmanager-main-tls" not found Apr 23 18:01:21.543668 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.543640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-web-config\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.544104 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.544027 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-config-out\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.545370 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.545326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.546024 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.545937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.546125 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.546022 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.548144 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.548105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.548245 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.548166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.549440 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.549415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.549746 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:21.549727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-config-volume\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:21.551796 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:01:21.551755 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcfw\" (UniqueName: \"kubernetes.io/projected/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-kube-api-access-jrcfw\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:22.044759 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.044703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:22.044968 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.044813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:22.045877 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.045842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:22.047526 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.047504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/30ffb75d-23b2-49d1-912a-c5a2bfb24cbc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:22.188659 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:22.188614 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b2d537_2af8_4205_8fec_7db579b30694.slice/crio-aa1e8ea5a312f4a305a3b6547948140a85f3ba03283beea80445c5b56c3f3932 WatchSource:0}: Error finding container aa1e8ea5a312f4a305a3b6547948140a85f3ba03283beea80445c5b56c3f3932: Status 404 returned error can't find the container with id aa1e8ea5a312f4a305a3b6547948140a85f3ba03283beea80445c5b56c3f3932 Apr 23 18:01:22.307520 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.307293 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 18:01:22.316418 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.316375 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6xhj9"] Apr 23 18:01:22.321820 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:22.321787 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbb38a01_f704_4497_b3c1_20236e4e4f23.slice/crio-c561e1e0e801ac1ec586187a379243d5f3b78baa5eb42b6c96fbf6a9490494d9 WatchSource:0}: Error finding container c561e1e0e801ac1ec586187a379243d5f3b78baa5eb42b6c96fbf6a9490494d9: Status 404 returned error can't find the container with id c561e1e0e801ac1ec586187a379243d5f3b78baa5eb42b6c96fbf6a9490494d9 Apr 23 18:01:22.450472 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.450432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 18:01:22.452854 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:22.452828 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30ffb75d_23b2_49d1_912a_c5a2bfb24cbc.slice/crio-5c693d670f2f93868e3bfc5aff6b35039a99acb3d3d8629ee82657f0d6717b31 WatchSource:0}: Error finding container 5c693d670f2f93868e3bfc5aff6b35039a99acb3d3d8629ee82657f0d6717b31: Status 404 returned error can't find the container with id 5c693d670f2f93868e3bfc5aff6b35039a99acb3d3d8629ee82657f0d6717b31 Apr 23 18:01:22.542808 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.542775 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6pv4g"] Apr 23 18:01:22.545676 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.545616 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699c97fd46-t2w9b"] Apr 23 18:01:22.547318 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:22.547288 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode929e6ba_de02_4dcb_affd_2772d869c2e0.slice/crio-dae174177cef78918b89c5abaddcd38aba8af402d3fed9ba970fdb2181c5189b WatchSource:0}: Error finding container dae174177cef78918b89c5abaddcd38aba8af402d3fed9ba970fdb2181c5189b: Status 404 returned error can't find the container with id dae174177cef78918b89c5abaddcd38aba8af402d3fed9ba970fdb2181c5189b Apr 23 18:01:22.548848 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:22.548819 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf59971f7_15e2_4888_8554_87d9fea2a48c.slice/crio-142440582f66ac331bab1053fbbf05b2455600a64db7e651e27c20fb40ed639c WatchSource:0}: Error finding container 142440582f66ac331bab1053fbbf05b2455600a64db7e651e27c20fb40ed639c: Status 404 returned error can't find the container with id 142440582f66ac331bab1053fbbf05b2455600a64db7e651e27c20fb40ed639c Apr 23 18:01:22.914315 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.914277 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmn5h" event={"ID":"03b2d537-2af8-4205-8fec-7db579b30694","Type":"ContainerStarted","Data":"aa1e8ea5a312f4a305a3b6547948140a85f3ba03283beea80445c5b56c3f3932"} Apr 23 18:01:22.917365 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.917327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rcfgz" event={"ID":"5a9ac9a5-f676-42f2-9af1-a148cd6302d8","Type":"ContainerStarted","Data":"0adeee20490fbdbba9ac578f94bb3712959f896f69d9dde94ec1f88cd4b6b985"} Apr 23 18:01:22.919695 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.919665 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rcfgz" Apr 23 18:01:22.921718 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.921678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6xhj9" event={"ID":"bbb38a01-f704-4497-b3c1-20236e4e4f23","Type":"ContainerStarted","Data":"c561e1e0e801ac1ec586187a379243d5f3b78baa5eb42b6c96fbf6a9490494d9"} Apr 23 18:01:22.923699 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.923647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699c97fd46-t2w9b" event={"ID":"f59971f7-15e2-4888-8554-87d9fea2a48c","Type":"ContainerStarted","Data":"96349c8183c5c06c56d7eec401652a5cd4dbf0f05e2989185a066ac1ea39435a"} Apr 23 18:01:22.923699 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.923679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699c97fd46-t2w9b" event={"ID":"f59971f7-15e2-4888-8554-87d9fea2a48c","Type":"ContainerStarted","Data":"142440582f66ac331bab1053fbbf05b2455600a64db7e651e27c20fb40ed639c"} Apr 23 18:01:22.926191 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.926129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc","Type":"ContainerStarted","Data":"5c693d670f2f93868e3bfc5aff6b35039a99acb3d3d8629ee82657f0d6717b31"} Apr 23 18:01:22.929702 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.929665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6pv4g" event={"ID":"e929e6ba-de02-4dcb-affd-2772d869c2e0","Type":"ContainerStarted","Data":"dae174177cef78918b89c5abaddcd38aba8af402d3fed9ba970fdb2181c5189b"} Apr 23 18:01:22.932061 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.931968 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-rcfgz" Apr 23 18:01:22.955749 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:22.954583 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rcfgz" podStartSLOduration=1.977427544 podStartE2EDuration="18.954564477s" podCreationTimestamp="2026-04-23 18:01:04 +0000 UTC" firstStartedPulling="2026-04-23 18:01:05.283796788 +0000 UTC m=+154.451124087" lastFinishedPulling="2026-04-23 18:01:22.260933709 +0000 UTC m=+171.428261020" observedRunningTime="2026-04-23 18:01:22.933771876 +0000 UTC m=+172.101099195" watchObservedRunningTime="2026-04-23 18:01:22.954564477 +0000 UTC m=+172.121891818" Apr 23 18:01:23.314025 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.313913 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-699c97fd46-t2w9b" podStartSLOduration=6.313892019 podStartE2EDuration="6.313892019s" podCreationTimestamp="2026-04-23 18:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:01:22.976046449 +0000 UTC m=+172.143373769" watchObservedRunningTime="2026-04-23 18:01:23.313892019 +0000 UTC m=+172.481219339" Apr 23 18:01:23.314185 ip-10-0-137-157 kubenswrapper[2576]: 
I0423 18:01:23.314130 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7c874b45d-sjk2w"] Apr 23 18:01:23.333491 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.333420 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7c874b45d-sjk2w"] Apr 23 18:01:23.333658 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.333622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.345357 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.344890 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 18:01:23.345357 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.345200 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 18:01:23.345539 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.345391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 18:01:23.347395 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.345683 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 18:01:23.347395 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.345793 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-nkzhr\"" Apr 23 18:01:23.347395 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.345894 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-1s7m5lspro1pn\"" Apr 23 18:01:23.347395 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.345683 2576 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 18:01:23.364689 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.361387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.364689 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.361425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvk5s\" (UniqueName: \"kubernetes.io/projected/944258a9-3a02-4539-a1c0-d605fae95404-kube-api-access-bvk5s\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.364689 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.361495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-tls\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.364689 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.361534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-grpc-tls\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.364689 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:01:23.361586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.364689 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.361619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/944258a9-3a02-4539-a1c0-d605fae95404-metrics-client-ca\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.364689 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.361650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.364689 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.361682 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.462429 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.462376 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.462947 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.462437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/944258a9-3a02-4539-a1c0-d605fae95404-metrics-client-ca\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.462947 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.462487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.462947 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.462520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.462947 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.462591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.462947 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.462616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvk5s\" (UniqueName: \"kubernetes.io/projected/944258a9-3a02-4539-a1c0-d605fae95404-kube-api-access-bvk5s\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.462947 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.462663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-tls\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.462947 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.462698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-grpc-tls\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.465664 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.465596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/944258a9-3a02-4539-a1c0-d605fae95404-metrics-client-ca\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.474147 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.474096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.476777 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.476728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.479787 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.479740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.480417 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.480376 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.480529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.480503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-grpc-tls\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.482959 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.482931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/944258a9-3a02-4539-a1c0-d605fae95404-secret-thanos-querier-tls\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.483533 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.483491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvk5s\" (UniqueName: \"kubernetes.io/projected/944258a9-3a02-4539-a1c0-d605fae95404-kube-api-access-bvk5s\") pod \"thanos-querier-7c874b45d-sjk2w\" (UID: \"944258a9-3a02-4539-a1c0-d605fae95404\") " pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.666878 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.666844 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:23.935386 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.935287 2576 generic.go:358] "Generic (PLEG): container finished" podID="03b2d537-2af8-4205-8fec-7db579b30694" containerID="86f412182c8325acd20065e985265cc15667ea62121bf017f5d74c1ad3c3abcb" exitCode=0 Apr 23 18:01:23.935560 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:23.935397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmn5h" event={"ID":"03b2d537-2af8-4205-8fec-7db579b30694","Type":"ContainerDied","Data":"86f412182c8325acd20065e985265cc15667ea62121bf017f5d74c1ad3c3abcb"} Apr 23 18:01:26.124353 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.124323 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7c874b45d-sjk2w"] Apr 23 18:01:26.127698 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:26.127668 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944258a9_3a02_4539_a1c0_d605fae95404.slice/crio-0b5bb482acb14d059a8f18143ebae8ba35c8b9ea3d349d015f1322e7dd5acccc WatchSource:0}: Error finding container 0b5bb482acb14d059a8f18143ebae8ba35c8b9ea3d349d015f1322e7dd5acccc: Status 404 returned error can't find the container with id 0b5bb482acb14d059a8f18143ebae8ba35c8b9ea3d349d015f1322e7dd5acccc Apr 23 18:01:26.947629 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.947585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" event={"ID":"944258a9-3a02-4539-a1c0-d605fae95404","Type":"ContainerStarted","Data":"0b5bb482acb14d059a8f18143ebae8ba35c8b9ea3d349d015f1322e7dd5acccc"} Apr 23 18:01:26.949621 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.949578 2576 generic.go:358] "Generic (PLEG): container finished" podID="30ffb75d-23b2-49d1-912a-c5a2bfb24cbc" 
containerID="37931ed0e370212de9ca482273a2bac2b1798b4d65ab81436855770ea1f8a375" exitCode=0 Apr 23 18:01:26.949925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.949894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc","Type":"ContainerDied","Data":"37931ed0e370212de9ca482273a2bac2b1798b4d65ab81436855770ea1f8a375"} Apr 23 18:01:26.955236 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.955208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6pv4g" event={"ID":"e929e6ba-de02-4dcb-affd-2772d869c2e0","Type":"ContainerStarted","Data":"d177964d6203c76f6fcc6232ed5e3219e3c5a35cccf955bc658c1261fab4309d"} Apr 23 18:01:26.959795 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.959115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmn5h" event={"ID":"03b2d537-2af8-4205-8fec-7db579b30694","Type":"ContainerStarted","Data":"910274217c143c09129b49e0dccb27bf8515e669d34bea2e1ec945ff586309c7"} Apr 23 18:01:26.959795 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.959146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmn5h" event={"ID":"03b2d537-2af8-4205-8fec-7db579b30694","Type":"ContainerStarted","Data":"08aad1ae83172264fb20e69d4568e699eaf980192eb2573ecca28e2453a7f5ba"} Apr 23 18:01:26.962508 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.962482 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6xhj9" event={"ID":"bbb38a01-f704-4497-b3c1-20236e4e4f23","Type":"ContainerStarted","Data":"836d81ee5c2405111d7c328c8e17459583437decd2735c75a87f330db318aa4f"} Apr 23 18:01:26.962614 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.962513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6xhj9" 
event={"ID":"bbb38a01-f704-4497-b3c1-20236e4e4f23","Type":"ContainerStarted","Data":"fb40c11c263c5117b0d77736e781afe04c967140aa610caa3f355a44cf0e537c"} Apr 23 18:01:26.963022 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.963001 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6xhj9" Apr 23 18:01:26.994027 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:26.993975 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6xhj9" podStartSLOduration=139.374641676 podStartE2EDuration="2m22.99395835s" podCreationTimestamp="2026-04-23 17:59:04 +0000 UTC" firstStartedPulling="2026-04-23 18:01:22.324329808 +0000 UTC m=+171.491657108" lastFinishedPulling="2026-04-23 18:01:25.943646471 +0000 UTC m=+175.110973782" observedRunningTime="2026-04-23 18:01:26.992358386 +0000 UTC m=+176.159685705" watchObservedRunningTime="2026-04-23 18:01:26.99395835 +0000 UTC m=+176.161285665" Apr 23 18:01:27.008985 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:27.008931 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6pv4g" podStartSLOduration=139.608666926 podStartE2EDuration="2m23.008910576s" podCreationTimestamp="2026-04-23 17:59:04 +0000 UTC" firstStartedPulling="2026-04-23 18:01:22.549666956 +0000 UTC m=+171.716994266" lastFinishedPulling="2026-04-23 18:01:25.949910618 +0000 UTC m=+175.117237916" observedRunningTime="2026-04-23 18:01:27.007587219 +0000 UTC m=+176.174914541" watchObservedRunningTime="2026-04-23 18:01:27.008910576 +0000 UTC m=+176.176237897" Apr 23 18:01:27.028698 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:27.028646 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cmn5h" podStartSLOduration=6.131877114 podStartE2EDuration="7.028631579s" podCreationTimestamp="2026-04-23 18:01:20 +0000 UTC" firstStartedPulling="2026-04-23 
18:01:22.191066463 +0000 UTC m=+171.358393763" lastFinishedPulling="2026-04-23 18:01:23.087820917 +0000 UTC m=+172.255148228" observedRunningTime="2026-04-23 18:01:27.027841538 +0000 UTC m=+176.195168872" watchObservedRunningTime="2026-04-23 18:01:27.028631579 +0000 UTC m=+176.195958902" Apr 23 18:01:27.957883 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:27.957848 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-699c97fd46-t2w9b" Apr 23 18:01:27.958900 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:27.958865 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-699c97fd46-t2w9b" Apr 23 18:01:27.960252 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:27.960226 2576 patch_prober.go:28] interesting pod/console-699c97fd46-t2w9b container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body= Apr 23 18:01:27.960918 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:27.960885 2576 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-699c97fd46-t2w9b" podUID="f59971f7-15e2-4888-8554-87d9fea2a48c" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" Apr 23 18:01:29.395667 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.395638 2576 patch_prober.go:28] interesting pod/console-799cfbb46b-jtprt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.18:8443/health\": dial tcp 10.132.0.18:8443: connect: connection refused" start-of-body= Apr 23 18:01:29.395971 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.395688 2576 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-799cfbb46b-jtprt" 
podUID="b0ad8839-bf4f-402e-9298-46b9b31e1b4c" containerName="console" probeResult="failure" output="Get \"https://10.132.0.18:8443/health\": dial tcp 10.132.0.18:8443: connect: connection refused" Apr 23 18:01:29.680298 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.680269 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-799cfbb46b-jtprt"] Apr 23 18:01:29.711179 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.711150 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74679f47b-rj2lc"] Apr 23 18:01:29.740688 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.740663 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74679f47b-rj2lc"] Apr 23 18:01:29.740798 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.740774 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.832980 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.832946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-serving-cert\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.833156 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.832988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-config\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.833156 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.833069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z98vn\" (UniqueName: \"kubernetes.io/projected/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-kube-api-access-z98vn\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.833156 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.833126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-oauth-config\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.833295 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.833160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-oauth-serving-cert\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.833295 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.833184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-service-ca\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.833295 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.833207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-trusted-ca-bundle\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.934199 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.934160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-service-ca\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.934352 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.934210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-trusted-ca-bundle\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.934352 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.934329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-serving-cert\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.934501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.934356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-config\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.934501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.934408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z98vn\" (UniqueName: \"kubernetes.io/projected/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-kube-api-access-z98vn\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " 
pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.934501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.934439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-oauth-config\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.934501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.934490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-oauth-serving-cert\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.935042 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.935015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-service-ca\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.935512 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.935484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-oauth-serving-cert\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.935632 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.935571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-config\") pod \"console-74679f47b-rj2lc\" (UID: 
\"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.935762 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.935734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-trusted-ca-bundle\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.937605 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.937584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-oauth-config\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.937805 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.937780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-serving-cert\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.945553 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.945529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z98vn\" (UniqueName: \"kubernetes.io/projected/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-kube-api-access-z98vn\") pod \"console-74679f47b-rj2lc\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:29.978530 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.978493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc","Type":"ContainerStarted","Data":"a6e3bbbc9438ae4bc0641e205dfe15f339231430e2b09e35847209ae1cf160c6"} Apr 23 18:01:29.978530 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.978536 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc","Type":"ContainerStarted","Data":"bf10f443fad1d2e3a3ddaab23f94cd6c1b6d342905240e52e47d1571599dbc1d"} Apr 23 18:01:29.978747 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.978550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc","Type":"ContainerStarted","Data":"e1b529d649e5cbe6e840919ba523785d30c4fac221ed3b1727a3638ab4ea82d1"} Apr 23 18:01:29.978747 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.978562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc","Type":"ContainerStarted","Data":"c63f557160b3d249e94dbe6e7fd9e70ae68a550d6a6bbc1746d7e9cfcae64a10"} Apr 23 18:01:29.978747 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.978576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc","Type":"ContainerStarted","Data":"44d589cc1c79b98649cdadafcd46be8fdaa572e122a1082cd15415541cee34ac"} Apr 23 18:01:29.980803 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.980768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" event={"ID":"944258a9-3a02-4539-a1c0-d605fae95404","Type":"ContainerStarted","Data":"11be1a2f322e88812481fabbf7b868731234e852d32be91263f49427d47109a2"} Apr 23 18:01:29.980927 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.980805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" event={"ID":"944258a9-3a02-4539-a1c0-d605fae95404","Type":"ContainerStarted","Data":"d7dadbad95d5d25056027c8be60a763baf04345295e66e7e2320e34f09b305fd"} Apr 23 18:01:29.980927 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:29.980820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" event={"ID":"944258a9-3a02-4539-a1c0-d605fae95404","Type":"ContainerStarted","Data":"5b44251983790a586405f3dbc7fd57f1a6618f6927326ba386d6e0fb75fe08bf"} Apr 23 18:01:30.051366 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:30.051274 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:01:30.197073 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:30.197038 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74679f47b-rj2lc"] Apr 23 18:01:30.201610 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:01:30.201580 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f511ed0_d1a5_4dc0_9a05_df2bd20dd201.slice/crio-044ac7be47c4c9aa8f516985e996929f797c378b87b8af8794cab19e1a4f800a WatchSource:0}: Error finding container 044ac7be47c4c9aa8f516985e996929f797c378b87b8af8794cab19e1a4f800a: Status 404 returned error can't find the container with id 044ac7be47c4c9aa8f516985e996929f797c378b87b8af8794cab19e1a4f800a Apr 23 18:01:30.988441 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:30.988398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74679f47b-rj2lc" event={"ID":"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201","Type":"ContainerStarted","Data":"ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135"} Apr 23 18:01:30.988441 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:30.988445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-74679f47b-rj2lc" event={"ID":"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201","Type":"ContainerStarted","Data":"044ac7be47c4c9aa8f516985e996929f797c378b87b8af8794cab19e1a4f800a"} Apr 23 18:01:31.006526 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:31.006455 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74679f47b-rj2lc" podStartSLOduration=2.006437736 podStartE2EDuration="2.006437736s" podCreationTimestamp="2026-04-23 18:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:01:31.005419093 +0000 UTC m=+180.172746413" watchObservedRunningTime="2026-04-23 18:01:31.006437736 +0000 UTC m=+180.173765055" Apr 23 18:01:31.899478 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:31.899420 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-699c97fd46-t2w9b"] Apr 23 18:01:31.995331 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:31.995268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" event={"ID":"944258a9-3a02-4539-a1c0-d605fae95404","Type":"ContainerStarted","Data":"81ba1b059c47354dcc0f3735675e896693a36298187a8971e2f82317ebcc4885"} Apr 23 18:01:31.995784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:31.995336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" event={"ID":"944258a9-3a02-4539-a1c0-d605fae95404","Type":"ContainerStarted","Data":"df055efd1d9eb4ccbf2ca64de1555864572a3257c8d3883128fc5b306de0f325"} Apr 23 18:01:31.995784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:31.995353 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" 
event={"ID":"944258a9-3a02-4539-a1c0-d605fae95404","Type":"ContainerStarted","Data":"b8d65e5fe39ef2b40b99316fd09fd9d96dcf6ed6c333b728cc37ae6bd1add563"} Apr 23 18:01:31.995784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:31.995440 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" Apr 23 18:01:31.998599 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:31.998569 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"30ffb75d-23b2-49d1-912a-c5a2bfb24cbc","Type":"ContainerStarted","Data":"c241cc0325c1fddd83024c1f21b41399164498d3dda3ba9880a33f191224e531"} Apr 23 18:01:32.019627 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:32.019567 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w" podStartSLOduration=4.062351336 podStartE2EDuration="9.019549381s" podCreationTimestamp="2026-04-23 18:01:23 +0000 UTC" firstStartedPulling="2026-04-23 18:01:26.130358823 +0000 UTC m=+175.297686140" lastFinishedPulling="2026-04-23 18:01:31.087556877 +0000 UTC m=+180.254884185" observedRunningTime="2026-04-23 18:01:32.018766174 +0000 UTC m=+181.186093506" watchObservedRunningTime="2026-04-23 18:01:32.019549381 +0000 UTC m=+181.186876701" Apr 23 18:01:32.047219 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:32.047165 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.412176845 podStartE2EDuration="11.047151319s" podCreationTimestamp="2026-04-23 18:01:21 +0000 UTC" firstStartedPulling="2026-04-23 18:01:22.454867729 +0000 UTC m=+171.622195040" lastFinishedPulling="2026-04-23 18:01:31.089842209 +0000 UTC m=+180.257169514" observedRunningTime="2026-04-23 18:01:32.045939317 +0000 UTC m=+181.213266635" watchObservedRunningTime="2026-04-23 18:01:32.047151319 +0000 UTC 
m=+181.214478637"
Apr 23 18:01:32.967400 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:32.967361 2576 patch_prober.go:28] interesting pod/image-registry-579c67b56b-drvmr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 18:01:32.967601 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:32.967417 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" podUID="79008782-6b97-4390-87b3-4fde5c883645" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:01:34.892406 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:34.892374 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 18:01:36.732774 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:36.732741 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-579c67b56b-drvmr"]
Apr 23 18:01:37.972545 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:37.972511 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6xhj9"
Apr 23 18:01:38.008505 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:38.008443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7c874b45d-sjk2w"
Apr 23 18:01:40.052163 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:40.052130 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74679f47b-rj2lc"
Apr 23 18:01:40.052581 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:40.052226 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74679f47b-rj2lc"
Apr 23 18:01:40.057098 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:40.057076 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74679f47b-rj2lc"
Apr 23 18:01:41.028611 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:41.028582 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74679f47b-rj2lc"
Apr 23 18:01:42.994612 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:42.994571 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74679f47b-rj2lc"]
Apr 23 18:01:47.042516 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:47.042450 2576 generic.go:358] "Generic (PLEG): container finished" podID="23e40766-68ea-4cb5-b83f-5a64e8740c67" containerID="483ee6f9495db7928dc5685339e0f4a852d6115c2a6111227148ccd083c9c30f" exitCode=0
Apr 23 18:01:47.042923 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:47.042536 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-85zc7" event={"ID":"23e40766-68ea-4cb5-b83f-5a64e8740c67","Type":"ContainerDied","Data":"483ee6f9495db7928dc5685339e0f4a852d6115c2a6111227148ccd083c9c30f"}
Apr 23 18:01:47.042962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:47.042928 2576 scope.go:117] "RemoveContainer" containerID="483ee6f9495db7928dc5685339e0f4a852d6115c2a6111227148ccd083c9c30f"
Apr 23 18:01:47.892726 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:47.892694 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6pv4g_e929e6ba-de02-4dcb-affd-2772d869c2e0/serve-healthcheck-canary/0.log"
Apr 23 18:01:48.047133 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:48.047103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-85zc7" event={"ID":"23e40766-68ea-4cb5-b83f-5a64e8740c67","Type":"ContainerStarted","Data":"df8d20a97e713f3cf6daf3c3656047613102782a308ea9bfbcc1d4a3565ca4ad"}
Apr 23 18:01:54.710265 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:54.710186 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-799cfbb46b-jtprt" podUID="b0ad8839-bf4f-402e-9298-46b9b31e1b4c" containerName="console" containerID="cri-o://d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3" gracePeriod=15
Apr 23 18:01:54.998494 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:54.998451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-799cfbb46b-jtprt_b0ad8839-bf4f-402e-9298-46b9b31e1b4c/console/0.log"
Apr 23 18:01:54.998605 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:54.998539 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-799cfbb46b-jtprt"
Apr 23 18:01:55.067965 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.067934 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-799cfbb46b-jtprt_b0ad8839-bf4f-402e-9298-46b9b31e1b4c/console/0.log"
Apr 23 18:01:55.068142 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.067981 2576 generic.go:358] "Generic (PLEG): container finished" podID="b0ad8839-bf4f-402e-9298-46b9b31e1b4c" containerID="d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3" exitCode=2
Apr 23 18:01:55.068142 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.068043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799cfbb46b-jtprt" event={"ID":"b0ad8839-bf4f-402e-9298-46b9b31e1b4c","Type":"ContainerDied","Data":"d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3"}
Apr 23 18:01:55.068142 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.068050 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-799cfbb46b-jtprt"
Apr 23 18:01:55.068142 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.068080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799cfbb46b-jtprt" event={"ID":"b0ad8839-bf4f-402e-9298-46b9b31e1b4c","Type":"ContainerDied","Data":"3356e39d7695deeaf12836b73d620f3994267152053dc876f0d35b79800eaaf7"}
Apr 23 18:01:55.068142 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.068099 2576 scope.go:117] "RemoveContainer" containerID="d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3"
Apr 23 18:01:55.077855 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.077832 2576 scope.go:117] "RemoveContainer" containerID="d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3"
Apr 23 18:01:55.078141 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:01:55.078120 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3\": container with ID starting with d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3 not found: ID does not exist" containerID="d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3"
Apr 23 18:01:55.078207 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.078150 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3"} err="failed to get container status \"d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3\": rpc error: code = NotFound desc = could not find container \"d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3\": container with ID starting with d46832b1452e5404f36704b90737f2ba60f772e0ad1c6dce82489e5c684efab3 not found: ID does not exist"
Apr 23 18:01:55.175556 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.175523 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcngl\" (UniqueName: \"kubernetes.io/projected/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-kube-api-access-pcngl\") pod \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") "
Apr 23 18:01:55.175705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.175575 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-serving-cert\") pod \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") "
Apr 23 18:01:55.175705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.175608 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-service-ca\") pod \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") "
Apr 23 18:01:55.175705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.175635 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-oauth-config\") pod \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") "
Apr 23 18:01:55.175705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.175685 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-oauth-serving-cert\") pod \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") "
Apr 23 18:01:55.175920 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.175724 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-config\") pod \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\" (UID: \"b0ad8839-bf4f-402e-9298-46b9b31e1b4c\") "
Apr 23 18:01:55.176151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.176124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b0ad8839-bf4f-402e-9298-46b9b31e1b4c" (UID: "b0ad8839-bf4f-402e-9298-46b9b31e1b4c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:01:55.176241 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.176145 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-service-ca" (OuterVolumeSpecName: "service-ca") pod "b0ad8839-bf4f-402e-9298-46b9b31e1b4c" (UID: "b0ad8839-bf4f-402e-9298-46b9b31e1b4c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:01:55.176241 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.176218 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-config" (OuterVolumeSpecName: "console-config") pod "b0ad8839-bf4f-402e-9298-46b9b31e1b4c" (UID: "b0ad8839-bf4f-402e-9298-46b9b31e1b4c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:01:55.178102 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.178076 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b0ad8839-bf4f-402e-9298-46b9b31e1b4c" (UID: "b0ad8839-bf4f-402e-9298-46b9b31e1b4c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:01:55.178218 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.178133 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-kube-api-access-pcngl" (OuterVolumeSpecName: "kube-api-access-pcngl") pod "b0ad8839-bf4f-402e-9298-46b9b31e1b4c" (UID: "b0ad8839-bf4f-402e-9298-46b9b31e1b4c"). InnerVolumeSpecName "kube-api-access-pcngl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:01:55.178218 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.178167 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b0ad8839-bf4f-402e-9298-46b9b31e1b4c" (UID: "b0ad8839-bf4f-402e-9298-46b9b31e1b4c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:01:55.276594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.276508 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-oauth-serving-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:55.276594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.276539 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:55.276594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.276549 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pcngl\" (UniqueName: \"kubernetes.io/projected/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-kube-api-access-pcngl\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:55.276594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.276559 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-serving-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:55.276594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.276570 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-service-ca\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:55.276594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.276579 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ad8839-bf4f-402e-9298-46b9b31e1b4c-console-oauth-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:55.415473 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.415435 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-799cfbb46b-jtprt"]
Apr 23 18:01:55.415624 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:55.415487 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-799cfbb46b-jtprt"]
Apr 23 18:01:56.921568 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:56.921500 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-699c97fd46-t2w9b" podUID="f59971f7-15e2-4888-8554-87d9fea2a48c" containerName="console" containerID="cri-o://96349c8183c5c06c56d7eec401652a5cd4dbf0f05e2989185a066ac1ea39435a" gracePeriod=15
Apr 23 18:01:57.076200 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.076167 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699c97fd46-t2w9b_f59971f7-15e2-4888-8554-87d9fea2a48c/console/0.log"
Apr 23 18:01:57.076385 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.076211 2576 generic.go:358] "Generic (PLEG): container finished" podID="f59971f7-15e2-4888-8554-87d9fea2a48c" containerID="96349c8183c5c06c56d7eec401652a5cd4dbf0f05e2989185a066ac1ea39435a" exitCode=2
Apr 23 18:01:57.076385 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.076260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699c97fd46-t2w9b" event={"ID":"f59971f7-15e2-4888-8554-87d9fea2a48c","Type":"ContainerDied","Data":"96349c8183c5c06c56d7eec401652a5cd4dbf0f05e2989185a066ac1ea39435a"}
Apr 23 18:01:57.178909 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.178851 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699c97fd46-t2w9b_f59971f7-15e2-4888-8554-87d9fea2a48c/console/0.log"
Apr 23 18:01:57.179036 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.178914 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:57.293649 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.293614 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-serving-cert\") pod \"f59971f7-15e2-4888-8554-87d9fea2a48c\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") "
Apr 23 18:01:57.293827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.293696 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-oauth-serving-cert\") pod \"f59971f7-15e2-4888-8554-87d9fea2a48c\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") "
Apr 23 18:01:57.293827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.293734 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-oauth-config\") pod \"f59971f7-15e2-4888-8554-87d9fea2a48c\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") "
Apr 23 18:01:57.293827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.293783 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-console-config\") pod \"f59971f7-15e2-4888-8554-87d9fea2a48c\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") "
Apr 23 18:01:57.293827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.293820 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lll5\" (UniqueName: \"kubernetes.io/projected/f59971f7-15e2-4888-8554-87d9fea2a48c-kube-api-access-7lll5\") pod \"f59971f7-15e2-4888-8554-87d9fea2a48c\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") "
Apr 23 18:01:57.294061 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.293845 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-service-ca\") pod \"f59971f7-15e2-4888-8554-87d9fea2a48c\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") "
Apr 23 18:01:57.294061 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.293888 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-trusted-ca-bundle\") pod \"f59971f7-15e2-4888-8554-87d9fea2a48c\" (UID: \"f59971f7-15e2-4888-8554-87d9fea2a48c\") "
Apr 23 18:01:57.294168 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.294146 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f59971f7-15e2-4888-8554-87d9fea2a48c" (UID: "f59971f7-15e2-4888-8554-87d9fea2a48c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:01:57.294225 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.294206 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-console-config" (OuterVolumeSpecName: "console-config") pod "f59971f7-15e2-4888-8554-87d9fea2a48c" (UID: "f59971f7-15e2-4888-8554-87d9fea2a48c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:01:57.294606 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.294580 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-service-ca" (OuterVolumeSpecName: "service-ca") pod "f59971f7-15e2-4888-8554-87d9fea2a48c" (UID: "f59971f7-15e2-4888-8554-87d9fea2a48c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:01:57.294749 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.294725 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f59971f7-15e2-4888-8554-87d9fea2a48c" (UID: "f59971f7-15e2-4888-8554-87d9fea2a48c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:01:57.295960 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.295921 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f59971f7-15e2-4888-8554-87d9fea2a48c" (UID: "f59971f7-15e2-4888-8554-87d9fea2a48c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:01:57.296067 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.296019 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f59971f7-15e2-4888-8554-87d9fea2a48c" (UID: "f59971f7-15e2-4888-8554-87d9fea2a48c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:01:57.296126 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.296078 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59971f7-15e2-4888-8554-87d9fea2a48c-kube-api-access-7lll5" (OuterVolumeSpecName: "kube-api-access-7lll5") pod "f59971f7-15e2-4888-8554-87d9fea2a48c" (UID: "f59971f7-15e2-4888-8554-87d9fea2a48c"). InnerVolumeSpecName "kube-api-access-7lll5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:01:57.395262 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.395217 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-oauth-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:57.395262 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.395253 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-console-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:57.395262 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.395267 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7lll5\" (UniqueName: \"kubernetes.io/projected/f59971f7-15e2-4888-8554-87d9fea2a48c-kube-api-access-7lll5\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:57.395531 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.395279 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-service-ca\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:57.395531 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.395292 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-trusted-ca-bundle\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:57.395531 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.395305 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f59971f7-15e2-4888-8554-87d9fea2a48c-console-serving-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:57.395531 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.395317 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f59971f7-15e2-4888-8554-87d9fea2a48c-oauth-serving-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:01:57.415029 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:57.414994 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ad8839-bf4f-402e-9298-46b9b31e1b4c" path="/var/lib/kubelet/pods/b0ad8839-bf4f-402e-9298-46b9b31e1b4c/volumes"
Apr 23 18:01:58.080597 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:58.080572 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699c97fd46-t2w9b_f59971f7-15e2-4888-8554-87d9fea2a48c/console/0.log"
Apr 23 18:01:58.081055 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:58.080623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699c97fd46-t2w9b" event={"ID":"f59971f7-15e2-4888-8554-87d9fea2a48c","Type":"ContainerDied","Data":"142440582f66ac331bab1053fbbf05b2455600a64db7e651e27c20fb40ed639c"}
Apr 23 18:01:58.081055 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:58.080661 2576 scope.go:117] "RemoveContainer" containerID="96349c8183c5c06c56d7eec401652a5cd4dbf0f05e2989185a066ac1ea39435a"
Apr 23 18:01:58.081055 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:58.080700 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699c97fd46-t2w9b"
Apr 23 18:01:58.106654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:58.106626 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-699c97fd46-t2w9b"]
Apr 23 18:01:58.114766 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:58.114739 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-699c97fd46-t2w9b"]
Apr 23 18:01:59.414958 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:01:59.414927 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59971f7-15e2-4888-8554-87d9fea2a48c" path="/var/lib/kubelet/pods/f59971f7-15e2-4888-8554-87d9fea2a48c/volumes"
Apr 23 18:02:01.094306 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:01.094276 2576 generic.go:358] "Generic (PLEG): container finished" podID="69e4a548-d326-40aa-879e-60213d8f6fc5" containerID="c30a67fc4401c3a58e4f6c5a315e6fca2820934a7bf4fe9b466e9f4b57817bff" exitCode=0
Apr 23 18:02:01.094686 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:01.094346 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp" event={"ID":"69e4a548-d326-40aa-879e-60213d8f6fc5","Type":"ContainerDied","Data":"c30a67fc4401c3a58e4f6c5a315e6fca2820934a7bf4fe9b466e9f4b57817bff"}
Apr 23 18:02:01.094770 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:01.094754 2576 scope.go:117] "RemoveContainer" containerID="c30a67fc4401c3a58e4f6c5a315e6fca2820934a7bf4fe9b466e9f4b57817bff"
Apr 23 18:02:01.752787 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:01.752742 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" podUID="79008782-6b97-4390-87b3-4fde5c883645" containerName="registry" containerID="cri-o://85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37" gracePeriod=30
Apr 23 18:02:02.061276 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.061248 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 18:02:02.099879 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.098859 2576 generic.go:358] "Generic (PLEG): container finished" podID="79008782-6b97-4390-87b3-4fde5c883645" containerID="85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37" exitCode=0
Apr 23 18:02:02.099879 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.098958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" event={"ID":"79008782-6b97-4390-87b3-4fde5c883645","Type":"ContainerDied","Data":"85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37"}
Apr 23 18:02:02.099879 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.098991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579c67b56b-drvmr" event={"ID":"79008782-6b97-4390-87b3-4fde5c883645","Type":"ContainerDied","Data":"df77b44483d3a7cc2a0315ddc8626e1de60c7684bef91d6bcb97a18328ba22c8"}
Apr 23 18:02:02.099879 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.099013 2576 scope.go:117] "RemoveContainer" containerID="85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37"
Apr 23 18:02:02.099879 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.099166 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579c67b56b-drvmr"
Apr 23 18:02:02.104044 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.102683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fkmzp" event={"ID":"69e4a548-d326-40aa-879e-60213d8f6fc5","Type":"ContainerStarted","Data":"edf2013f9f0b984c2b021758f36a73278004501058a484a78313d55b46763cd2"}
Apr 23 18:02:02.121204 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.121173 2576 scope.go:117] "RemoveContainer" containerID="85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37"
Apr 23 18:02:02.121562 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:02:02.121533 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37\": container with ID starting with 85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37 not found: ID does not exist" containerID="85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37"
Apr 23 18:02:02.121663 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.121574 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37"} err="failed to get container status \"85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37\": rpc error: code = NotFound desc = could not find container \"85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37\": container with ID starting with 85d1f844919180468b81374f4172dd9abf8348eae5ff4f670bd782f80fbf5a37 not found: ID does not exist"
Apr 23 18:02:02.154293 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.154257 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgqk6\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-kube-api-access-tgqk6\") pod \"79008782-6b97-4390-87b3-4fde5c883645\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") "
Apr 23 18:02:02.154519 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.154320 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-image-registry-private-configuration\") pod \"79008782-6b97-4390-87b3-4fde5c883645\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") "
Apr 23 18:02:02.154519 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.154343 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-bound-sa-token\") pod \"79008782-6b97-4390-87b3-4fde5c883645\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") "
Apr 23 18:02:02.154519 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.154394 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-trusted-ca\") pod \"79008782-6b97-4390-87b3-4fde5c883645\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") "
Apr 23 18:02:02.154694 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.154555 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-installation-pull-secrets\") pod \"79008782-6b97-4390-87b3-4fde5c883645\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") "
Apr 23 18:02:02.154694 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.154614 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-registry-certificates\") pod \"79008782-6b97-4390-87b3-4fde5c883645\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") "
Apr 23 18:02:02.154694 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.154656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79008782-6b97-4390-87b3-4fde5c883645-ca-trust-extracted\") pod \"79008782-6b97-4390-87b3-4fde5c883645\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") "
Apr 23 18:02:02.154694 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.154687 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") pod \"79008782-6b97-4390-87b3-4fde5c883645\" (UID: \"79008782-6b97-4390-87b3-4fde5c883645\") "
Apr 23 18:02:02.154961 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.154932 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "79008782-6b97-4390-87b3-4fde5c883645" (UID: "79008782-6b97-4390-87b3-4fde5c883645"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:02:02.155128 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.155108 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-trusted-ca\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:02:02.155283 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.155119 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "79008782-6b97-4390-87b3-4fde5c883645" (UID: "79008782-6b97-4390-87b3-4fde5c883645"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:02:02.158047 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.158006 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "79008782-6b97-4390-87b3-4fde5c883645" (UID: "79008782-6b97-4390-87b3-4fde5c883645"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:02:02.159598 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.159367 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-kube-api-access-tgqk6" (OuterVolumeSpecName: "kube-api-access-tgqk6") pod "79008782-6b97-4390-87b3-4fde5c883645" (UID: "79008782-6b97-4390-87b3-4fde5c883645"). InnerVolumeSpecName "kube-api-access-tgqk6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:02:02.159598 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.159498 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "79008782-6b97-4390-87b3-4fde5c883645" (UID: "79008782-6b97-4390-87b3-4fde5c883645"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:02:02.159598 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.159557 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "79008782-6b97-4390-87b3-4fde5c883645" (UID: "79008782-6b97-4390-87b3-4fde5c883645"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:02:02.159598 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.159451 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "79008782-6b97-4390-87b3-4fde5c883645" (UID: "79008782-6b97-4390-87b3-4fde5c883645"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:02:02.167977 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.166989 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79008782-6b97-4390-87b3-4fde5c883645-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "79008782-6b97-4390-87b3-4fde5c883645" (UID: "79008782-6b97-4390-87b3-4fde5c883645"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:02:02.256234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.256184 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tgqk6\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-kube-api-access-tgqk6\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:02:02.256234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.256220 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-image-registry-private-configuration\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:02:02.256234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.256231 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-bound-sa-token\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:02:02.256234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.256241 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79008782-6b97-4390-87b3-4fde5c883645-installation-pull-secrets\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:02:02.256527 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.256250 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79008782-6b97-4390-87b3-4fde5c883645-registry-certificates\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:02:02.256527 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.256258 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79008782-6b97-4390-87b3-4fde5c883645-ca-trust-extracted\") on node 
\"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:02:02.256527 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.256268 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79008782-6b97-4390-87b3-4fde5c883645-registry-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:02:02.422306 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.422276 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-579c67b56b-drvmr"] Apr 23 18:02:02.427323 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:02.427303 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-579c67b56b-drvmr"] Apr 23 18:02:03.414758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:03.414726 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79008782-6b97-4390-87b3-4fde5c883645" path="/var/lib/kubelet/pods/79008782-6b97-4390-87b3-4fde5c883645/volumes" Apr 23 18:02:09.052003 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.051938 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74679f47b-rj2lc" podUID="5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" containerName="console" containerID="cri-o://ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135" gracePeriod=15 Apr 23 18:02:09.304321 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.304257 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74679f47b-rj2lc_5f511ed0-d1a5-4dc0-9a05-df2bd20dd201/console/0.log" Apr 23 18:02:09.304321 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.304319 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:02:09.422870 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.422840 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-serving-cert\") pod \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " Apr 23 18:02:09.423049 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.422898 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-trusted-ca-bundle\") pod \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " Apr 23 18:02:09.423049 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.422925 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-config\") pod \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " Apr 23 18:02:09.423049 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.423026 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z98vn\" (UniqueName: \"kubernetes.io/projected/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-kube-api-access-z98vn\") pod \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " Apr 23 18:02:09.423201 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.423081 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-oauth-serving-cert\") pod \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " Apr 23 18:02:09.423201 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.423116 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-service-ca\") pod \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " Apr 23 18:02:09.423201 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.423147 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-oauth-config\") pod \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\" (UID: \"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201\") " Apr 23 18:02:09.423431 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.423399 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-config" (OuterVolumeSpecName: "console-config") pod "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" (UID: "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:02:09.423566 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.423451 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" (UID: "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:02:09.423566 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.423503 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" (UID: "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:02:09.423649 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.423566 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-service-ca" (OuterVolumeSpecName: "service-ca") pod "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" (UID: "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:02:09.425296 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.425274 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" (UID: "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:02:09.425377 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.425337 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-kube-api-access-z98vn" (OuterVolumeSpecName: "kube-api-access-z98vn") pod "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" (UID: "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201"). InnerVolumeSpecName "kube-api-access-z98vn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:02:09.425525 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.425373 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" (UID: "5f511ed0-d1a5-4dc0-9a05-df2bd20dd201"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:02:09.524300 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.524258 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-serving-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:02:09.524300 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.524292 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-trusted-ca-bundle\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:02:09.524300 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.524304 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:02:09.524564 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.524314 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z98vn\" (UniqueName: \"kubernetes.io/projected/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-kube-api-access-z98vn\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:02:09.524564 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.524325 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-oauth-serving-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:02:09.524564 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.524334 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-service-ca\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:02:09.524564 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:09.524342 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201-console-oauth-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:02:10.128086 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:10.128058 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74679f47b-rj2lc_5f511ed0-d1a5-4dc0-9a05-df2bd20dd201/console/0.log" Apr 23 18:02:10.128603 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:10.128099 2576 generic.go:358] "Generic (PLEG): container finished" podID="5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" containerID="ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135" exitCode=2 Apr 23 18:02:10.128603 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:10.128158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74679f47b-rj2lc" event={"ID":"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201","Type":"ContainerDied","Data":"ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135"} Apr 23 18:02:10.128603 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:10.128188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74679f47b-rj2lc" event={"ID":"5f511ed0-d1a5-4dc0-9a05-df2bd20dd201","Type":"ContainerDied","Data":"044ac7be47c4c9aa8f516985e996929f797c378b87b8af8794cab19e1a4f800a"} Apr 23 18:02:10.128603 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:02:10.128189 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74679f47b-rj2lc" Apr 23 18:02:10.128603 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:10.128204 2576 scope.go:117] "RemoveContainer" containerID="ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135" Apr 23 18:02:10.137409 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:10.137390 2576 scope.go:117] "RemoveContainer" containerID="ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135" Apr 23 18:02:10.137678 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:02:10.137662 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135\": container with ID starting with ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135 not found: ID does not exist" containerID="ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135" Apr 23 18:02:10.137740 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:10.137690 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135"} err="failed to get container status \"ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135\": rpc error: code = NotFound desc = could not find container \"ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135\": container with ID starting with ac8f949fa759d43b79582ecd02dd09c9a67f80ac3290645ae799f86e26915135 not found: ID does not exist" Apr 23 18:02:10.149472 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:10.149440 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74679f47b-rj2lc"] Apr 23 18:02:10.150694 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:10.150673 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-74679f47b-rj2lc"] Apr 23 18:02:11.414682 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:11.414646 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" path="/var/lib/kubelet/pods/5f511ed0-d1a5-4dc0-9a05-df2bd20dd201/volumes" Apr 23 18:02:41.531079 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531000 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67cf9fcb56-mznb7"] Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531293 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79008782-6b97-4390-87b3-4fde5c883645" containerName="registry" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531304 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="79008782-6b97-4390-87b3-4fde5c883645" containerName="registry" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531313 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0ad8839-bf4f-402e-9298-46b9b31e1b4c" containerName="console" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531318 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ad8839-bf4f-402e-9298-46b9b31e1b4c" containerName="console" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531325 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f59971f7-15e2-4888-8554-87d9fea2a48c" containerName="console" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531331 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59971f7-15e2-4888-8554-87d9fea2a48c" containerName="console" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531345 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" containerName="console" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531350 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" containerName="console" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531393 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="79008782-6b97-4390-87b3-4fde5c883645" containerName="registry" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531403 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f511ed0-d1a5-4dc0-9a05-df2bd20dd201" containerName="console" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531410 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f59971f7-15e2-4888-8554-87d9fea2a48c" containerName="console" Apr 23 18:02:41.531538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.531415 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0ad8839-bf4f-402e-9298-46b9b31e1b4c" containerName="console" Apr 23 18:02:41.534112 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.534096 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.536364 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.536329 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 18:02:41.536364 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.536350 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 18:02:41.536561 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.536403 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 18:02:41.537156 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.537138 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jm8xd\"" Apr 23 18:02:41.537225 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.537206 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 18:02:41.537310 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.537293 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 18:02:41.541048 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.541026 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 18:02:41.543359 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.543339 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67cf9fcb56-mznb7"] Apr 23 18:02:41.588666 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.588627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-oauth-config\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.588827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.588678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjf8s\" (UniqueName: \"kubernetes.io/projected/fa6b8cee-9a30-43a6-8532-16762f460f8a-kube-api-access-fjf8s\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.588827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.588751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-oauth-serving-cert\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.588827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.588790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-service-ca\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.588827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.588821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-config\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.588991 ip-10-0-137-157 kubenswrapper[2576]: 
I0423 18:02:41.588865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-serving-cert\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.588991 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.588895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-trusted-ca-bundle\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.689691 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.689662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-service-ca\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.689837 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.689703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-config\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.689837 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.689727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-serving-cert\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " 
pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.689955 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.689856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-trusted-ca-bundle\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.689955 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.689909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-oauth-config\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.690058 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.689962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjf8s\" (UniqueName: \"kubernetes.io/projected/fa6b8cee-9a30-43a6-8532-16762f460f8a-kube-api-access-fjf8s\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.690058 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.690011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-oauth-serving-cert\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.690353 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.690320 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-service-ca\") pod 
\"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.690671 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.690417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-config\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.690671 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.690601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-oauth-serving-cert\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.691496 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.691451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-trusted-ca-bundle\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.697171 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.692902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-serving-cert\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.697171 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.696924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-oauth-config\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.698755 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.698731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjf8s\" (UniqueName: \"kubernetes.io/projected/fa6b8cee-9a30-43a6-8532-16762f460f8a-kube-api-access-fjf8s\") pod \"console-67cf9fcb56-mznb7\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.844393 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.844350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:41.979174 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:41.979144 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67cf9fcb56-mznb7"] Apr 23 18:02:41.983354 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:02:41.983326 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6b8cee_9a30_43a6_8532_16762f460f8a.slice/crio-3df1120f35bfff1f2a17379c6b45f8db4d73e0aead6cf697f1ee76c5bda4968c WatchSource:0}: Error finding container 3df1120f35bfff1f2a17379c6b45f8db4d73e0aead6cf697f1ee76c5bda4968c: Status 404 returned error can't find the container with id 3df1120f35bfff1f2a17379c6b45f8db4d73e0aead6cf697f1ee76c5bda4968c Apr 23 18:02:42.221904 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:42.221820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cf9fcb56-mznb7" event={"ID":"fa6b8cee-9a30-43a6-8532-16762f460f8a","Type":"ContainerStarted","Data":"fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a"} Apr 23 18:02:42.221904 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:02:42.221857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cf9fcb56-mznb7" event={"ID":"fa6b8cee-9a30-43a6-8532-16762f460f8a","Type":"ContainerStarted","Data":"3df1120f35bfff1f2a17379c6b45f8db4d73e0aead6cf697f1ee76c5bda4968c"} Apr 23 18:02:42.241917 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:42.241869 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67cf9fcb56-mznb7" podStartSLOduration=1.241856749 podStartE2EDuration="1.241856749s" podCreationTimestamp="2026-04-23 18:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:02:42.240806637 +0000 UTC m=+251.408133955" watchObservedRunningTime="2026-04-23 18:02:42.241856749 +0000 UTC m=+251.409184067" Apr 23 18:02:43.209398 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:43.209357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 18:02:43.211927 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:43.211903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab8ad387-1bfb-42ab-ad18-c8ea20362f8f-metrics-certs\") pod \"network-metrics-daemon-q98mx\" (UID: \"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f\") " pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 18:02:43.313245 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:43.313221 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7tv5t\"" Apr 23 18:02:43.321760 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:43.321740 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q98mx" Apr 23 18:02:43.449707 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:43.449677 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q98mx"] Apr 23 18:02:43.453069 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:02:43.453046 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab8ad387_1bfb_42ab_ad18_c8ea20362f8f.slice/crio-9f53d2c31594e1b09e433b81f928ae3e2d3f6defa84effedfaf614d47614b165 WatchSource:0}: Error finding container 9f53d2c31594e1b09e433b81f928ae3e2d3f6defa84effedfaf614d47614b165: Status 404 returned error can't find the container with id 9f53d2c31594e1b09e433b81f928ae3e2d3f6defa84effedfaf614d47614b165 Apr 23 18:02:44.230779 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:44.230725 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q98mx" event={"ID":"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f","Type":"ContainerStarted","Data":"9f53d2c31594e1b09e433b81f928ae3e2d3f6defa84effedfaf614d47614b165"} Apr 23 18:02:45.235501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:45.235440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q98mx" event={"ID":"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f","Type":"ContainerStarted","Data":"84958d595acc72e11614f787dcfd7cebeab83e99b0928c8f5fb510e0254a8293"} Apr 23 18:02:45.235501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:45.235502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q98mx" event={"ID":"ab8ad387-1bfb-42ab-ad18-c8ea20362f8f","Type":"ContainerStarted","Data":"d4035e20cb8b0a69f0ad848a246ef1439f15fcdf59ac314fc23aad0f575bef46"} Apr 23 18:02:45.250710 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:45.250668 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q98mx" podStartSLOduration=253.326175579 podStartE2EDuration="4m14.250649539s" podCreationTimestamp="2026-04-23 17:58:31 +0000 UTC" firstStartedPulling="2026-04-23 18:02:43.455155327 +0000 UTC m=+252.622482625" lastFinishedPulling="2026-04-23 18:02:44.379629279 +0000 UTC m=+253.546956585" observedRunningTime="2026-04-23 18:02:45.250553332 +0000 UTC m=+254.417880642" watchObservedRunningTime="2026-04-23 18:02:45.250649539 +0000 UTC m=+254.417976857" Apr 23 18:02:51.844738 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:51.844701 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:51.845147 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:51.844838 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:51.849432 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:51.849412 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:02:52.261017 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:02:52.260941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:03:31.287888 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:31.287855 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:03:31.288454 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:31.287958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:03:31.290825 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:31.290806 2576 kubelet.go:1628] 
"Image garbage collection succeeded" Apr 23 18:03:34.287570 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.287535 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6"] Apr 23 18:03:34.290769 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.290753 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.292784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.292762 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rkcpn\"" Apr 23 18:03:34.292784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.292778 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 18:03:34.292916 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.292825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 18:03:34.299986 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.299962 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6"] Apr 23 18:03:34.319377 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.319353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-529bc\" (UniqueName: \"kubernetes.io/projected/ec65c2bb-7ff5-4030-97aa-947c490c5655-kube-api-access-529bc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.319502 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.319425 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.319502 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.319453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.420036 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.419997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-529bc\" (UniqueName: \"kubernetes.io/projected/ec65c2bb-7ff5-4030-97aa-947c490c5655-kube-api-access-529bc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.420217 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.420060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.420217 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.420086 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.420398 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.420382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.420445 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.420414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.428436 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.428407 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-529bc\" (UniqueName: \"kubernetes.io/projected/ec65c2bb-7ff5-4030-97aa-947c490c5655-kube-api-access-529bc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.600412 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.600378 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:34.723177 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.722956 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6"] Apr 23 18:03:34.726042 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:03:34.726011 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec65c2bb_7ff5_4030_97aa_947c490c5655.slice/crio-0b2808d693a0c1ab5e521516ffe9b203d48ce7f551e2ecee370f650a63c7c514 WatchSource:0}: Error finding container 0b2808d693a0c1ab5e521516ffe9b203d48ce7f551e2ecee370f650a63c7c514: Status 404 returned error can't find the container with id 0b2808d693a0c1ab5e521516ffe9b203d48ce7f551e2ecee370f650a63c7c514 Apr 23 18:03:34.727959 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:34.727943 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:03:35.383296 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:35.383258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" event={"ID":"ec65c2bb-7ff5-4030-97aa-947c490c5655","Type":"ContainerStarted","Data":"0b2808d693a0c1ab5e521516ffe9b203d48ce7f551e2ecee370f650a63c7c514"} Apr 23 18:03:40.399173 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:40.399135 2576 generic.go:358] "Generic (PLEG): container finished" podID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerID="3de4d6f71d93a4af7b338bf51333b4b33bca646ffcc66cccab4e83bdef3f1206" exitCode=0 Apr 23 18:03:40.399570 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:40.399214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" 
event={"ID":"ec65c2bb-7ff5-4030-97aa-947c490c5655","Type":"ContainerDied","Data":"3de4d6f71d93a4af7b338bf51333b4b33bca646ffcc66cccab4e83bdef3f1206"} Apr 23 18:03:43.408446 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:43.408411 2576 generic.go:358] "Generic (PLEG): container finished" podID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerID="cd82ee519adebb3675f5a0247580327e6e5422180a29df925d967a5e972af04b" exitCode=0 Apr 23 18:03:43.408844 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:43.408453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" event={"ID":"ec65c2bb-7ff5-4030-97aa-947c490c5655","Type":"ContainerDied","Data":"cd82ee519adebb3675f5a0247580327e6e5422180a29df925d967a5e972af04b"} Apr 23 18:03:50.430493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:50.430447 2576 generic.go:358] "Generic (PLEG): container finished" podID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerID="d7fb96df8f01fa3e891cc89960125d6af7e9a2e9cce4553ea56a4d7c379a0644" exitCode=0 Apr 23 18:03:50.430851 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:50.430491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" event={"ID":"ec65c2bb-7ff5-4030-97aa-947c490c5655","Type":"ContainerDied","Data":"d7fb96df8f01fa3e891cc89960125d6af7e9a2e9cce4553ea56a4d7c379a0644"} Apr 23 18:03:51.556336 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.556311 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:51.667002 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.666970 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-529bc\" (UniqueName: \"kubernetes.io/projected/ec65c2bb-7ff5-4030-97aa-947c490c5655-kube-api-access-529bc\") pod \"ec65c2bb-7ff5-4030-97aa-947c490c5655\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " Apr 23 18:03:51.667164 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.667009 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-util\") pod \"ec65c2bb-7ff5-4030-97aa-947c490c5655\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " Apr 23 18:03:51.667164 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.667034 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-bundle\") pod \"ec65c2bb-7ff5-4030-97aa-947c490c5655\" (UID: \"ec65c2bb-7ff5-4030-97aa-947c490c5655\") " Apr 23 18:03:51.667645 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.667613 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-bundle" (OuterVolumeSpecName: "bundle") pod "ec65c2bb-7ff5-4030-97aa-947c490c5655" (UID: "ec65c2bb-7ff5-4030-97aa-947c490c5655"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:03:51.669412 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.669380 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec65c2bb-7ff5-4030-97aa-947c490c5655-kube-api-access-529bc" (OuterVolumeSpecName: "kube-api-access-529bc") pod "ec65c2bb-7ff5-4030-97aa-947c490c5655" (UID: "ec65c2bb-7ff5-4030-97aa-947c490c5655"). InnerVolumeSpecName "kube-api-access-529bc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:03:51.670932 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.670908 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-util" (OuterVolumeSpecName: "util") pod "ec65c2bb-7ff5-4030-97aa-947c490c5655" (UID: "ec65c2bb-7ff5-4030-97aa-947c490c5655"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:03:51.767878 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.767781 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-529bc\" (UniqueName: \"kubernetes.io/projected/ec65c2bb-7ff5-4030-97aa-947c490c5655-kube-api-access-529bc\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:03:51.767878 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.767816 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-util\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:03:51.767878 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:51.767829 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ec65c2bb-7ff5-4030-97aa-947c490c5655-bundle\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:03:52.437268 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:52.437233 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" event={"ID":"ec65c2bb-7ff5-4030-97aa-947c490c5655","Type":"ContainerDied","Data":"0b2808d693a0c1ab5e521516ffe9b203d48ce7f551e2ecee370f650a63c7c514"} Apr 23 18:03:52.437268 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:52.437264 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b2808d693a0c1ab5e521516ffe9b203d48ce7f551e2ecee370f650a63c7c514" Apr 23 18:03:52.437544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:52.437294 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c64fr6" Apr 23 18:03:56.098100 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.095348 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk"] Apr 23 18:03:56.098100 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.096018 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerName="extract" Apr 23 18:03:56.098100 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.096037 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerName="extract" Apr 23 18:03:56.098100 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.096060 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerName="util" Apr 23 18:03:56.098100 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.096070 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerName="util" Apr 23 18:03:56.098100 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.096124 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerName="pull" Apr 23 18:03:56.098100 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.096132 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerName="pull" Apr 23 18:03:56.098100 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.096255 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec65c2bb-7ff5-4030-97aa-947c490c5655" containerName="extract" Apr 23 18:03:56.144044 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.144001 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk"] Apr 23 18:03:56.144193 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.144142 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" Apr 23 18:03:56.146395 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.146359 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 23 18:03:56.146395 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.146385 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 23 18:03:56.146632 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.146604 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-6wjdc\"" Apr 23 18:03:56.146724 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.146705 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 23 18:03:56.202500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.202441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/f735939e-df64-4387-9015-5816e484cb8f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk\" (UID: \"f735939e-df64-4387-9015-5816e484cb8f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" Apr 23 18:03:56.202660 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.202592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkfxj\" (UniqueName: \"kubernetes.io/projected/f735939e-df64-4387-9015-5816e484cb8f-kube-api-access-dkfxj\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk\" (UID: \"f735939e-df64-4387-9015-5816e484cb8f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" Apr 23 18:03:56.303696 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.303653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkfxj\" (UniqueName: \"kubernetes.io/projected/f735939e-df64-4387-9015-5816e484cb8f-kube-api-access-dkfxj\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk\" (UID: \"f735939e-df64-4387-9015-5816e484cb8f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" Apr 23 18:03:56.303696 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.303697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f735939e-df64-4387-9015-5816e484cb8f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk\" (UID: \"f735939e-df64-4387-9015-5816e484cb8f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" Apr 23 18:03:56.306177 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.306149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f735939e-df64-4387-9015-5816e484cb8f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk\" (UID: 
\"f735939e-df64-4387-9015-5816e484cb8f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" Apr 23 18:03:56.312278 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.312258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkfxj\" (UniqueName: \"kubernetes.io/projected/f735939e-df64-4387-9015-5816e484cb8f-kube-api-access-dkfxj\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk\" (UID: \"f735939e-df64-4387-9015-5816e484cb8f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" Apr 23 18:03:56.464032 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.463947 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" Apr 23 18:03:56.585016 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:56.584979 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk"] Apr 23 18:03:56.589485 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:03:56.589440 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf735939e_df64_4387_9015_5816e484cb8f.slice/crio-decd7c2ef82080e4fad788686e439af36420e6a7b57c06986fc2cbc94d4eaece WatchSource:0}: Error finding container decd7c2ef82080e4fad788686e439af36420e6a7b57c06986fc2cbc94d4eaece: Status 404 returned error can't find the container with id decd7c2ef82080e4fad788686e439af36420e6a7b57c06986fc2cbc94d4eaece Apr 23 18:03:57.453270 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:03:57.453226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" event={"ID":"f735939e-df64-4387-9015-5816e484cb8f","Type":"ContainerStarted","Data":"decd7c2ef82080e4fad788686e439af36420e6a7b57c06986fc2cbc94d4eaece"} Apr 23 18:04:00.464627 ip-10-0-137-157 kubenswrapper[2576]: 
I0423 18:04:00.464543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" event={"ID":"f735939e-df64-4387-9015-5816e484cb8f","Type":"ContainerStarted","Data":"9d92df2384888fdc6a172385bc68ae7e72fc8d9c2c7c4e3c4aa01f985649cb7e"} Apr 23 18:04:00.465019 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.464672 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" Apr 23 18:04:00.483520 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.483447 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk" podStartSLOduration=0.976247935 podStartE2EDuration="4.483433244s" podCreationTimestamp="2026-04-23 18:03:56 +0000 UTC" firstStartedPulling="2026-04-23 18:03:56.591446664 +0000 UTC m=+325.758773975" lastFinishedPulling="2026-04-23 18:04:00.098631983 +0000 UTC m=+329.265959284" observedRunningTime="2026-04-23 18:04:00.481839395 +0000 UTC m=+329.649166713" watchObservedRunningTime="2026-04-23 18:04:00.483433244 +0000 UTC m=+329.650760562" Apr 23 18:04:00.686264 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.686223 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-56kh2"] Apr 23 18:04:00.712069 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.712039 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-56kh2"] Apr 23 18:04:00.712222 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.712163 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:00.714282 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.714248 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-mq2b2\"" Apr 23 18:04:00.714411 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.714282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 23 18:04:00.714411 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.714312 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 23 18:04:00.842794 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.842762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:00.842794 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.842798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtnn\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-kube-api-access-bbtnn\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:00.842985 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.842825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-cabundle0\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " 
pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:00.943951 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.943916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-cabundle0\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:00.944123 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.944011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:00.944123 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.944035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtnn\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-kube-api-access-bbtnn\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:00.944200 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:00.944146 2576 secret.go:281] references non-existent secret key: ca.crt Apr 23 18:04:00.944200 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:00.944164 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 23 18:04:00.944200 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:00.944175 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-56kh2: references non-existent secret key: ca.crt Apr 23 18:04:00.944297 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:00.944233 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates podName:5b2aaa72-bf96-4f17-ac74-cc822a69fe20 nodeName:}" failed. No retries permitted until 2026-04-23 18:04:01.444214602 +0000 UTC m=+330.611541898 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates") pod "keda-operator-ffbb595cb-56kh2" (UID: "5b2aaa72-bf96-4f17-ac74-cc822a69fe20") : references non-existent secret key: ca.crt Apr 23 18:04:00.944573 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.944555 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-cabundle0\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:00.954120 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:00.954097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtnn\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-kube-api-access-bbtnn\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:01.055565 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.055480 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq"] Apr 23 18:04:01.074959 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.074930 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq"] Apr 23 18:04:01.075132 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.075061 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.077202 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.077175 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 23 18:04:01.145819 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.145784 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.145996 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.145860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/84757a68-1fb9-4a27-8f0e-504aee785ab8-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.145996 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.145883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbp9h\" (UniqueName: \"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-kube-api-access-cbp9h\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.246857 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.246824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/84757a68-1fb9-4a27-8f0e-504aee785ab8-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: 
\"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.246857 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.246869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbp9h\" (UniqueName: \"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-kube-api-access-cbp9h\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.247123 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.246920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.247123 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.247074 2576 secret.go:281] references non-existent secret key: tls.crt Apr 23 18:04:01.247123 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.247091 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 18:04:01.247123 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.247112 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq: references non-existent secret key: tls.crt Apr 23 18:04:01.247323 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.247183 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates podName:84757a68-1fb9-4a27-8f0e-504aee785ab8 nodeName:}" failed. 
No retries permitted until 2026-04-23 18:04:01.747163486 +0000 UTC m=+330.914490785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates") pod "keda-metrics-apiserver-7c9f485588-6jbrq" (UID: "84757a68-1fb9-4a27-8f0e-504aee785ab8") : references non-existent secret key: tls.crt Apr 23 18:04:01.247323 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.247213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/84757a68-1fb9-4a27-8f0e-504aee785ab8-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.254950 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.254920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbp9h\" (UniqueName: \"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-kube-api-access-cbp9h\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.420202 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.420166 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-kz8fb"] Apr 23 18:04:01.440959 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.440921 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-kz8fb"] Apr 23 18:04:01.441094 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.441070 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-kz8fb" Apr 23 18:04:01.442881 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.442861 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 23 18:04:01.448580 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.448558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:01.448710 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.448698 2576 secret.go:281] references non-existent secret key: ca.crt Apr 23 18:04:01.448753 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.448716 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 23 18:04:01.448753 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.448728 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-56kh2: references non-existent secret key: ca.crt Apr 23 18:04:01.448815 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.448784 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates podName:5b2aaa72-bf96-4f17-ac74-cc822a69fe20 nodeName:}" failed. No retries permitted until 2026-04-23 18:04:02.448767522 +0000 UTC m=+331.616094822 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates") pod "keda-operator-ffbb595cb-56kh2" (UID: "5b2aaa72-bf96-4f17-ac74-cc822a69fe20") : references non-existent secret key: ca.crt Apr 23 18:04:01.550036 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.549994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfcc7\" (UniqueName: \"kubernetes.io/projected/e4a8135b-0084-4026-bcd2-da276c3f2c14-kube-api-access-qfcc7\") pod \"keda-admission-cf49989db-kz8fb\" (UID: \"e4a8135b-0084-4026-bcd2-da276c3f2c14\") " pod="openshift-keda/keda-admission-cf49989db-kz8fb" Apr 23 18:04:01.550509 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.550144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4a8135b-0084-4026-bcd2-da276c3f2c14-certificates\") pod \"keda-admission-cf49989db-kz8fb\" (UID: \"e4a8135b-0084-4026-bcd2-da276c3f2c14\") " pod="openshift-keda/keda-admission-cf49989db-kz8fb" Apr 23 18:04:01.650630 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.650591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4a8135b-0084-4026-bcd2-da276c3f2c14-certificates\") pod \"keda-admission-cf49989db-kz8fb\" (UID: \"e4a8135b-0084-4026-bcd2-da276c3f2c14\") " pod="openshift-keda/keda-admission-cf49989db-kz8fb" Apr 23 18:04:01.650803 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.650700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfcc7\" (UniqueName: \"kubernetes.io/projected/e4a8135b-0084-4026-bcd2-da276c3f2c14-kube-api-access-qfcc7\") pod \"keda-admission-cf49989db-kz8fb\" (UID: \"e4a8135b-0084-4026-bcd2-da276c3f2c14\") " 
pod="openshift-keda/keda-admission-cf49989db-kz8fb" Apr 23 18:04:01.653169 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.653151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4a8135b-0084-4026-bcd2-da276c3f2c14-certificates\") pod \"keda-admission-cf49989db-kz8fb\" (UID: \"e4a8135b-0084-4026-bcd2-da276c3f2c14\") " pod="openshift-keda/keda-admission-cf49989db-kz8fb" Apr 23 18:04:01.661932 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.661904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfcc7\" (UniqueName: \"kubernetes.io/projected/e4a8135b-0084-4026-bcd2-da276c3f2c14-kube-api-access-qfcc7\") pod \"keda-admission-cf49989db-kz8fb\" (UID: \"e4a8135b-0084-4026-bcd2-da276c3f2c14\") " pod="openshift-keda/keda-admission-cf49989db-kz8fb" Apr 23 18:04:01.751378 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.751280 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-kz8fb" Apr 23 18:04:01.751568 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.751548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:01.751665 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.751652 2576 secret.go:281] references non-existent secret key: tls.crt Apr 23 18:04:01.751724 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.751667 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 18:04:01.751724 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.751683 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq: references non-existent secret key: tls.crt Apr 23 18:04:01.751798 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:01.751735 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates podName:84757a68-1fb9-4a27-8f0e-504aee785ab8 nodeName:}" failed. No retries permitted until 2026-04-23 18:04:02.751717801 +0000 UTC m=+331.919045098 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates") pod "keda-metrics-apiserver-7c9f485588-6jbrq" (UID: "84757a68-1fb9-4a27-8f0e-504aee785ab8") : references non-existent secret key: tls.crt Apr 23 18:04:01.888104 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:01.888079 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-kz8fb"] Apr 23 18:04:01.889919 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:04:01.889881 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a8135b_0084_4026_bcd2_da276c3f2c14.slice/crio-2efe99e09bc46a1746a3482ed300aea76b4c168ce439d9a010f37255f9175167 WatchSource:0}: Error finding container 2efe99e09bc46a1746a3482ed300aea76b4c168ce439d9a010f37255f9175167: Status 404 returned error can't find the container with id 2efe99e09bc46a1746a3482ed300aea76b4c168ce439d9a010f37255f9175167 Apr 23 18:04:02.458721 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:02.458686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:02.458895 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:02.458804 2576 secret.go:281] references non-existent secret key: ca.crt Apr 23 18:04:02.458895 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:02.458816 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 23 18:04:02.458895 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:02.458825 2576 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-operator-ffbb595cb-56kh2: references non-existent secret key: ca.crt Apr 23 18:04:02.458895 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:02.458878 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates podName:5b2aaa72-bf96-4f17-ac74-cc822a69fe20 nodeName:}" failed. No retries permitted until 2026-04-23 18:04:04.458865578 +0000 UTC m=+333.626192875 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates") pod "keda-operator-ffbb595cb-56kh2" (UID: "5b2aaa72-bf96-4f17-ac74-cc822a69fe20") : references non-existent secret key: ca.crt Apr 23 18:04:02.473123 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:02.473086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-kz8fb" event={"ID":"e4a8135b-0084-4026-bcd2-da276c3f2c14","Type":"ContainerStarted","Data":"2efe99e09bc46a1746a3482ed300aea76b4c168ce439d9a010f37255f9175167"} Apr 23 18:04:02.760560 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:02.760480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:02.760904 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:02.760624 2576 secret.go:281] references non-existent secret key: tls.crt Apr 23 18:04:02.760904 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:02.760642 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 18:04:02.760904 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:02.760660 2576 projected.go:194] Error 
preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq: references non-existent secret key: tls.crt Apr 23 18:04:02.760904 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:02.760712 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates podName:84757a68-1fb9-4a27-8f0e-504aee785ab8 nodeName:}" failed. No retries permitted until 2026-04-23 18:04:04.760697015 +0000 UTC m=+333.928024330 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates") pod "keda-metrics-apiserver-7c9f485588-6jbrq" (UID: "84757a68-1fb9-4a27-8f0e-504aee785ab8") : references non-existent secret key: tls.crt Apr 23 18:04:04.476906 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:04.476861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:04.477552 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:04.477013 2576 secret.go:281] references non-existent secret key: ca.crt Apr 23 18:04:04.477552 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:04.477030 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 23 18:04:04.477552 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:04.477039 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-56kh2: references non-existent secret key: ca.crt Apr 23 18:04:04.477552 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:04.477097 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates podName:5b2aaa72-bf96-4f17-ac74-cc822a69fe20 nodeName:}" failed. No retries permitted until 2026-04-23 18:04:08.477082284 +0000 UTC m=+337.644409581 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates") pod "keda-operator-ffbb595cb-56kh2" (UID: "5b2aaa72-bf96-4f17-ac74-cc822a69fe20") : references non-existent secret key: ca.crt Apr 23 18:04:04.481135 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:04.481102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-kz8fb" event={"ID":"e4a8135b-0084-4026-bcd2-da276c3f2c14","Type":"ContainerStarted","Data":"49fb887dcdd790b9aa598f677ad10d7cfff6915b56002acd582387d01a351854"} Apr 23 18:04:04.481272 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:04.481208 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-kz8fb" Apr 23 18:04:04.498902 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:04.498857 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-kz8fb" podStartSLOduration=1.6713612420000001 podStartE2EDuration="3.498843863s" podCreationTimestamp="2026-04-23 18:04:01 +0000 UTC" firstStartedPulling="2026-04-23 18:04:01.891445479 +0000 UTC m=+331.058772786" lastFinishedPulling="2026-04-23 18:04:03.718928107 +0000 UTC m=+332.886255407" observedRunningTime="2026-04-23 18:04:04.497194693 +0000 UTC m=+333.664522014" watchObservedRunningTime="2026-04-23 18:04:04.498843863 +0000 UTC m=+333.666171182" Apr 23 18:04:04.780261 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:04.780158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:04.780401 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:04.780297 2576 secret.go:281] references non-existent secret key: tls.crt Apr 23 18:04:04.780401 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:04.780316 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 18:04:04.780401 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:04.780335 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq: references non-existent secret key: tls.crt Apr 23 18:04:04.780401 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:04:04.780386 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates podName:84757a68-1fb9-4a27-8f0e-504aee785ab8 nodeName:}" failed. No retries permitted until 2026-04-23 18:04:08.78037143 +0000 UTC m=+337.947698730 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates") pod "keda-metrics-apiserver-7c9f485588-6jbrq" (UID: "84757a68-1fb9-4a27-8f0e-504aee785ab8") : references non-existent secret key: tls.crt Apr 23 18:04:08.520094 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:08.520000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:08.522657 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:08.522632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5b2aaa72-bf96-4f17-ac74-cc822a69fe20-certificates\") pod \"keda-operator-ffbb595cb-56kh2\" (UID: \"5b2aaa72-bf96-4f17-ac74-cc822a69fe20\") " pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:08.822742 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:08.822712 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:08.822976 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:08.822951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:08.825584 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:08.825561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/84757a68-1fb9-4a27-8f0e-504aee785ab8-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6jbrq\" (UID: \"84757a68-1fb9-4a27-8f0e-504aee785ab8\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:08.888763 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:08.888731 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:08.953493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:08.953425 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-56kh2"] Apr 23 18:04:08.957032 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:04:08.957006 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2aaa72_bf96_4f17_ac74_cc822a69fe20.slice/crio-823739f2c64d762c29d799d0c3948f6899071cd72a2751469032a47e8f0ce688 WatchSource:0}: Error finding container 823739f2c64d762c29d799d0c3948f6899071cd72a2751469032a47e8f0ce688: Status 404 returned error can't find the container with id 823739f2c64d762c29d799d0c3948f6899071cd72a2751469032a47e8f0ce688 Apr 23 18:04:09.016989 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:09.016955 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq"] Apr 23 18:04:09.020760 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:04:09.020733 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84757a68_1fb9_4a27_8f0e_504aee785ab8.slice/crio-56a6fb70d9f0f8984e62d5e238e025aa9fc0777fa55330da3a2987fae5b2d929 WatchSource:0}: Error finding container 56a6fb70d9f0f8984e62d5e238e025aa9fc0777fa55330da3a2987fae5b2d929: Status 404 returned error can't find the container with id 56a6fb70d9f0f8984e62d5e238e025aa9fc0777fa55330da3a2987fae5b2d929 Apr 23 18:04:09.498109 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:09.498070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-56kh2" event={"ID":"5b2aaa72-bf96-4f17-ac74-cc822a69fe20","Type":"ContainerStarted","Data":"823739f2c64d762c29d799d0c3948f6899071cd72a2751469032a47e8f0ce688"} Apr 23 18:04:09.499071 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:04:09.499041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" event={"ID":"84757a68-1fb9-4a27-8f0e-504aee785ab8","Type":"ContainerStarted","Data":"56a6fb70d9f0f8984e62d5e238e025aa9fc0777fa55330da3a2987fae5b2d929"} Apr 23 18:04:13.514729 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:13.514686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-56kh2" event={"ID":"5b2aaa72-bf96-4f17-ac74-cc822a69fe20","Type":"ContainerStarted","Data":"2fc9d9ed104952e557b7b322cde7ac325e405ec841fa1c0c5f2b78bb706d462f"} Apr 23 18:04:13.515259 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:13.514782 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-56kh2" Apr 23 18:04:13.515898 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:13.515877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" event={"ID":"84757a68-1fb9-4a27-8f0e-504aee785ab8","Type":"ContainerStarted","Data":"6fadfd11c8493959cfff97a509779d12ab1c601ecc57f114246d0cd4e38e9dbe"} Apr 23 18:04:13.516008 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:13.515996 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" Apr 23 18:04:13.532082 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:13.532037 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-56kh2" podStartSLOduration=9.951015959 podStartE2EDuration="13.532024702s" podCreationTimestamp="2026-04-23 18:04:00 +0000 UTC" firstStartedPulling="2026-04-23 18:04:08.958204685 +0000 UTC m=+338.125531982" lastFinishedPulling="2026-04-23 18:04:12.539213429 +0000 UTC m=+341.706540725" observedRunningTime="2026-04-23 18:04:13.529934903 +0000 UTC m=+342.697262221" 
watchObservedRunningTime="2026-04-23 18:04:13.532024702 +0000 UTC m=+342.699352020"
Apr 23 18:04:13.544393 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:13.544358 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq" podStartSLOduration=9.031870734 podStartE2EDuration="12.544342814s" podCreationTimestamp="2026-04-23 18:04:01 +0000 UTC" firstStartedPulling="2026-04-23 18:04:09.022206194 +0000 UTC m=+338.189533494" lastFinishedPulling="2026-04-23 18:04:12.53467826 +0000 UTC m=+341.702005574" observedRunningTime="2026-04-23 18:04:13.543911934 +0000 UTC m=+342.711239252" watchObservedRunningTime="2026-04-23 18:04:13.544342814 +0000 UTC m=+342.711670135"
Apr 23 18:04:21.471121 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:21.471089 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxsbk"
Apr 23 18:04:24.523404 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:24.523370 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6jbrq"
Apr 23 18:04:25.486288 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:25.486254 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-kz8fb"
Apr 23 18:04:34.521577 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:04:34.521539 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-56kh2"
Apr 23 18:05:08.480328 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.480245 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-ghlhs"]
Apr 23 18:05:08.486640 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.486618 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:05:08.488797 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.488771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 23 18:05:08.488921 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.488817 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 23 18:05:08.488921 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.488856 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 23 18:05:08.488921 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.488861 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-fr69r\""
Apr 23 18:05:08.491601 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.491577 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-ghlhs"]
Apr 23 18:05:08.623603 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.623567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wtwq\" (UniqueName: \"kubernetes.io/projected/56921661-c4b9-4336-8078-2ef8305fc3e8-kube-api-access-6wtwq\") pod \"seaweedfs-86cc847c5c-ghlhs\" (UID: \"56921661-c4b9-4336-8078-2ef8305fc3e8\") " pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:05:08.623766 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.623623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/56921661-c4b9-4336-8078-2ef8305fc3e8-data\") pod \"seaweedfs-86cc847c5c-ghlhs\" (UID: \"56921661-c4b9-4336-8078-2ef8305fc3e8\") " pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:05:08.724392 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.724355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wtwq\" (UniqueName: \"kubernetes.io/projected/56921661-c4b9-4336-8078-2ef8305fc3e8-kube-api-access-6wtwq\") pod \"seaweedfs-86cc847c5c-ghlhs\" (UID: \"56921661-c4b9-4336-8078-2ef8305fc3e8\") " pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:05:08.724392 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.724408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/56921661-c4b9-4336-8078-2ef8305fc3e8-data\") pod \"seaweedfs-86cc847c5c-ghlhs\" (UID: \"56921661-c4b9-4336-8078-2ef8305fc3e8\") " pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:05:08.724810 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.724790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/56921661-c4b9-4336-8078-2ef8305fc3e8-data\") pod \"seaweedfs-86cc847c5c-ghlhs\" (UID: \"56921661-c4b9-4336-8078-2ef8305fc3e8\") " pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:05:08.732547 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.732447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wtwq\" (UniqueName: \"kubernetes.io/projected/56921661-c4b9-4336-8078-2ef8305fc3e8-kube-api-access-6wtwq\") pod \"seaweedfs-86cc847c5c-ghlhs\" (UID: \"56921661-c4b9-4336-8078-2ef8305fc3e8\") " pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:05:08.797500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.797447 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:05:08.918889 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:08.918860 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-ghlhs"]
Apr 23 18:05:08.921995 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:05:08.921968 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56921661_c4b9_4336_8078_2ef8305fc3e8.slice/crio-e5f9ea5ae4d17c430dabe1a052324c3238efe34914fb74079eba41f7b4e5267c WatchSource:0}: Error finding container e5f9ea5ae4d17c430dabe1a052324c3238efe34914fb74079eba41f7b4e5267c: Status 404 returned error can't find the container with id e5f9ea5ae4d17c430dabe1a052324c3238efe34914fb74079eba41f7b4e5267c
Apr 23 18:05:09.700500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:09.700444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-ghlhs" event={"ID":"56921661-c4b9-4336-8078-2ef8305fc3e8","Type":"ContainerStarted","Data":"e5f9ea5ae4d17c430dabe1a052324c3238efe34914fb74079eba41f7b4e5267c"}
Apr 23 18:05:11.708093 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:11.708005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-ghlhs" event={"ID":"56921661-c4b9-4336-8078-2ef8305fc3e8","Type":"ContainerStarted","Data":"04613dcf48da76f5dbc21efa432f137ca7afd083e24247ee3a818807f0ce41e2"}
Apr 23 18:05:11.708425 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:11.708138 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:05:11.723454 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:11.723331 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-ghlhs" podStartSLOduration=1.207402094 podStartE2EDuration="3.723314057s" podCreationTimestamp="2026-04-23 18:05:08 +0000 UTC" firstStartedPulling="2026-04-23 18:05:08.923364198 +0000 UTC m=+398.090691509" lastFinishedPulling="2026-04-23 18:05:11.439276169 +0000 UTC m=+400.606603472" observedRunningTime="2026-04-23 18:05:11.72255389 +0000 UTC m=+400.889881209" watchObservedRunningTime="2026-04-23 18:05:11.723314057 +0000 UTC m=+400.890641376"
Apr 23 18:05:17.713980 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:05:17.713950 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-ghlhs"
Apr 23 18:06:20.236995 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.236960 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-x89r9"]
Apr 23 18:06:20.239234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.239216 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:06:20.241819 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.241799 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-fflxl\""
Apr 23 18:06:20.241922 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.241814 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 23 18:06:20.251021 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.250995 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-x89r9"]
Apr 23 18:06:20.298102 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.298069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmz5d\" (UniqueName: \"kubernetes.io/projected/95f618ae-2c6a-441e-9f67-a46e8ff85d44-kube-api-access-rmz5d\") pod \"model-serving-api-86f7b4b499-x89r9\" (UID: \"95f618ae-2c6a-441e-9f67-a46e8ff85d44\") " pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:06:20.298275 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.298126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95f618ae-2c6a-441e-9f67-a46e8ff85d44-tls-certs\") pod \"model-serving-api-86f7b4b499-x89r9\" (UID: \"95f618ae-2c6a-441e-9f67-a46e8ff85d44\") " pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:06:20.399542 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.399509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmz5d\" (UniqueName: \"kubernetes.io/projected/95f618ae-2c6a-441e-9f67-a46e8ff85d44-kube-api-access-rmz5d\") pod \"model-serving-api-86f7b4b499-x89r9\" (UID: \"95f618ae-2c6a-441e-9f67-a46e8ff85d44\") " pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:06:20.399701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.399594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95f618ae-2c6a-441e-9f67-a46e8ff85d44-tls-certs\") pod \"model-serving-api-86f7b4b499-x89r9\" (UID: \"95f618ae-2c6a-441e-9f67-a46e8ff85d44\") " pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:06:20.402339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.402309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95f618ae-2c6a-441e-9f67-a46e8ff85d44-tls-certs\") pod \"model-serving-api-86f7b4b499-x89r9\" (UID: \"95f618ae-2c6a-441e-9f67-a46e8ff85d44\") " pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:06:20.406843 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.406820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmz5d\" (UniqueName: \"kubernetes.io/projected/95f618ae-2c6a-441e-9f67-a46e8ff85d44-kube-api-access-rmz5d\") pod \"model-serving-api-86f7b4b499-x89r9\" (UID: \"95f618ae-2c6a-441e-9f67-a46e8ff85d44\") " pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:06:20.549785 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.549697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:06:20.680862 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.680834 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-x89r9"]
Apr 23 18:06:20.682718 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:06:20.682689 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95f618ae_2c6a_441e_9f67_a46e8ff85d44.slice/crio-217730975cd0e1b1878b331196733b9dbbaed15e2e494b8032bce651089f6bb3 WatchSource:0}: Error finding container 217730975cd0e1b1878b331196733b9dbbaed15e2e494b8032bce651089f6bb3: Status 404 returned error can't find the container with id 217730975cd0e1b1878b331196733b9dbbaed15e2e494b8032bce651089f6bb3
Apr 23 18:06:20.932868 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:20.932830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-x89r9" event={"ID":"95f618ae-2c6a-441e-9f67-a46e8ff85d44","Type":"ContainerStarted","Data":"217730975cd0e1b1878b331196733b9dbbaed15e2e494b8032bce651089f6bb3"}
Apr 23 18:06:22.945405 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:22.945371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-x89r9" event={"ID":"95f618ae-2c6a-441e-9f67-a46e8ff85d44","Type":"ContainerStarted","Data":"12314fa30a1184874cdffc2c5b59261e234a7808c434d52538bcddd64713e995"}
Apr 23 18:06:22.945766 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:22.945604 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:06:22.966725 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:22.966665 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-x89r9" podStartSLOduration=0.778813461 podStartE2EDuration="2.966650716s" podCreationTimestamp="2026-04-23 18:06:20 +0000 UTC" firstStartedPulling="2026-04-23 18:06:20.684562652 +0000 UTC m=+469.851889954" lastFinishedPulling="2026-04-23 18:06:22.87239991 +0000 UTC m=+472.039727209" observedRunningTime="2026-04-23 18:06:22.965046707 +0000 UTC m=+472.132374027" watchObservedRunningTime="2026-04-23 18:06:22.966650716 +0000 UTC m=+472.133978350"
Apr 23 18:06:33.953813 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:06:33.953780 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-x89r9"
Apr 23 18:07:23.013157 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.013059 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"]
Apr 23 18:07:23.016217 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.016197 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.018003 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.017979 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 23 18:07:23.018183 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.018171 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 23 18:07:23.025333 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.025312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"]
Apr 23 18:07:23.125511 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.125480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a91c2e04-906f-4720-b338-fb7867f30f39-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-cpgb7\" (UID: \"a91c2e04-906f-4720-b338-fb7867f30f39\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.125695 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.125558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/a91c2e04-906f-4720-b338-fb7867f30f39-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-cpgb7\" (UID: \"a91c2e04-906f-4720-b338-fb7867f30f39\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.125695 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.125628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9csst\" (UniqueName: \"kubernetes.io/projected/a91c2e04-906f-4720-b338-fb7867f30f39-kube-api-access-9csst\") pod \"seaweedfs-tls-custom-5c88b85bb7-cpgb7\" (UID: \"a91c2e04-906f-4720-b338-fb7867f30f39\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.226469 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.226435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a91c2e04-906f-4720-b338-fb7867f30f39-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-cpgb7\" (UID: \"a91c2e04-906f-4720-b338-fb7867f30f39\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.226675 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.226503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/a91c2e04-906f-4720-b338-fb7867f30f39-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-cpgb7\" (UID: \"a91c2e04-906f-4720-b338-fb7867f30f39\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.226675 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.226573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9csst\" (UniqueName: \"kubernetes.io/projected/a91c2e04-906f-4720-b338-fb7867f30f39-kube-api-access-9csst\") pod \"seaweedfs-tls-custom-5c88b85bb7-cpgb7\" (UID: \"a91c2e04-906f-4720-b338-fb7867f30f39\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.226816 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.226795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a91c2e04-906f-4720-b338-fb7867f30f39-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-cpgb7\" (UID: \"a91c2e04-906f-4720-b338-fb7867f30f39\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.229099 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.229058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/a91c2e04-906f-4720-b338-fb7867f30f39-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-cpgb7\" (UID: \"a91c2e04-906f-4720-b338-fb7867f30f39\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.235289 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.235255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9csst\" (UniqueName: \"kubernetes.io/projected/a91c2e04-906f-4720-b338-fb7867f30f39-kube-api-access-9csst\") pod \"seaweedfs-tls-custom-5c88b85bb7-cpgb7\" (UID: \"a91c2e04-906f-4720-b338-fb7867f30f39\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.325191 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.325155 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"
Apr 23 18:07:23.445725 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:23.445693 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7"]
Apr 23 18:07:23.449307 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:07:23.449272 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda91c2e04_906f_4720_b338_fb7867f30f39.slice/crio-71d1fcece0965e3b2a0fc8a3cbc7218ff8a6ab39630bf23e1a72bdbce56577ee WatchSource:0}: Error finding container 71d1fcece0965e3b2a0fc8a3cbc7218ff8a6ab39630bf23e1a72bdbce56577ee: Status 404 returned error can't find the container with id 71d1fcece0965e3b2a0fc8a3cbc7218ff8a6ab39630bf23e1a72bdbce56577ee
Apr 23 18:07:24.160383 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:24.160349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7" event={"ID":"a91c2e04-906f-4720-b338-fb7867f30f39","Type":"ContainerStarted","Data":"423f8bb849f074fb8580ec91f818ae34f8f54233abcd94c86ebb065ee7cdc630"}
Apr 23 18:07:24.160383 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:24.160382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7" event={"ID":"a91c2e04-906f-4720-b338-fb7867f30f39","Type":"ContainerStarted","Data":"71d1fcece0965e3b2a0fc8a3cbc7218ff8a6ab39630bf23e1a72bdbce56577ee"}
Apr 23 18:07:24.178368 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:24.178316 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-cpgb7" podStartSLOduration=1.9282940549999998 podStartE2EDuration="2.178298506s" podCreationTimestamp="2026-04-23 18:07:22 +0000 UTC" firstStartedPulling="2026-04-23 18:07:23.452515621 +0000 UTC m=+532.619842924" lastFinishedPulling="2026-04-23 18:07:23.702520075 +0000 UTC m=+532.869847375" observedRunningTime="2026-04-23 18:07:24.176732072 +0000 UTC m=+533.344059391" watchObservedRunningTime="2026-04-23 18:07:24.178298506 +0000 UTC m=+533.345625824"
Apr 23 18:07:34.550750 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:34.550709 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-z6n4h"]
Apr 23 18:07:34.555607 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:34.555586 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-z6n4h"
Apr 23 18:07:34.557710 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:34.557690 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 23 18:07:34.560784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:34.560759 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-z6n4h"]
Apr 23 18:07:34.620573 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:34.620537 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d7mq\" (UniqueName: \"kubernetes.io/projected/cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db-kube-api-access-5d7mq\") pod \"s3-tls-init-serving-z6n4h\" (UID: \"cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db\") " pod="kserve/s3-tls-init-serving-z6n4h"
Apr 23 18:07:34.721781 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:34.721742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5d7mq\" (UniqueName: \"kubernetes.io/projected/cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db-kube-api-access-5d7mq\") pod \"s3-tls-init-serving-z6n4h\" (UID: \"cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db\") " pod="kserve/s3-tls-init-serving-z6n4h"
Apr 23 18:07:34.731230 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:34.731206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d7mq\" (UniqueName: \"kubernetes.io/projected/cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db-kube-api-access-5d7mq\") pod \"s3-tls-init-serving-z6n4h\" (UID: \"cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db\") " pod="kserve/s3-tls-init-serving-z6n4h"
Apr 23 18:07:34.888661 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:34.888628 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-z6n4h"
Apr 23 18:07:35.013942 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:35.013919 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-z6n4h"]
Apr 23 18:07:35.015923 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:07:35.015897 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf3159ad_fdc1_42eb_9c5c_d7876a8bc0db.slice/crio-e8d6b14774a267918f230e58bfff80d24b991f1ecb1e6f7a9960e0bccb593753 WatchSource:0}: Error finding container e8d6b14774a267918f230e58bfff80d24b991f1ecb1e6f7a9960e0bccb593753: Status 404 returned error can't find the container with id e8d6b14774a267918f230e58bfff80d24b991f1ecb1e6f7a9960e0bccb593753
Apr 23 18:07:35.199272 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:35.199189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-z6n4h" event={"ID":"cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db","Type":"ContainerStarted","Data":"e8d6b14774a267918f230e58bfff80d24b991f1ecb1e6f7a9960e0bccb593753"}
Apr 23 18:07:40.222370 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:40.222335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-z6n4h" event={"ID":"cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db","Type":"ContainerStarted","Data":"37214153643ad4e05e08a7b1ab2ed7612727a1b094181f907f345b3c0ea83dd0"}
Apr 23 18:07:40.244389 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:40.244342 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-z6n4h" podStartSLOduration=1.781053595 podStartE2EDuration="6.244326715s" podCreationTimestamp="2026-04-23 18:07:34 +0000 UTC" firstStartedPulling="2026-04-23 18:07:35.017804253 +0000 UTC m=+544.185131553" lastFinishedPulling="2026-04-23 18:07:39.481077372 +0000 UTC m=+548.648404673" observedRunningTime="2026-04-23 18:07:40.242622183 +0000 UTC m=+549.409949502" watchObservedRunningTime="2026-04-23 18:07:40.244326715 +0000 UTC m=+549.411654033"
Apr 23 18:07:44.237899 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:44.237865 2576 generic.go:358] "Generic (PLEG): container finished" podID="cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db" containerID="37214153643ad4e05e08a7b1ab2ed7612727a1b094181f907f345b3c0ea83dd0" exitCode=0
Apr 23 18:07:44.238296 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:44.237937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-z6n4h" event={"ID":"cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db","Type":"ContainerDied","Data":"37214153643ad4e05e08a7b1ab2ed7612727a1b094181f907f345b3c0ea83dd0"}
Apr 23 18:07:45.384093 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:45.384069 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-z6n4h"
Apr 23 18:07:45.516683 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:45.516602 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d7mq\" (UniqueName: \"kubernetes.io/projected/cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db-kube-api-access-5d7mq\") pod \"cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db\" (UID: \"cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db\") "
Apr 23 18:07:45.518856 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:45.518826 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db-kube-api-access-5d7mq" (OuterVolumeSpecName: "kube-api-access-5d7mq") pod "cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db" (UID: "cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db"). InnerVolumeSpecName "kube-api-access-5d7mq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:07:45.618005 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:45.617971 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5d7mq\" (UniqueName: \"kubernetes.io/projected/cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db-kube-api-access-5d7mq\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:07:46.246135 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:46.246102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-z6n4h" event={"ID":"cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db","Type":"ContainerDied","Data":"e8d6b14774a267918f230e58bfff80d24b991f1ecb1e6f7a9960e0bccb593753"}
Apr 23 18:07:46.246135 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:46.246139 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8d6b14774a267918f230e58bfff80d24b991f1ecb1e6f7a9960e0bccb593753"
Apr 23 18:07:46.246354 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:46.246110 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-z6n4h"
Apr 23 18:07:55.360452 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.360415 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"]
Apr 23 18:07:55.360898 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.360840 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db" containerName="s3-tls-init-serving"
Apr 23 18:07:55.360898 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.360853 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db" containerName="s3-tls-init-serving"
Apr 23 18:07:55.360968 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.360918 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db" containerName="s3-tls-init-serving"
Apr 23 18:07:55.363572 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.363555 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.365971 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.365948 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 18:07:55.366113 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.365948 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\""
Apr 23 18:07:55.366229 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.366214 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 18:07:55.366612 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.366593 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\""
Apr 23 18:07:55.366706 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.366625 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\""
Apr 23 18:07:55.372962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.372938 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"]
Apr 23 18:07:55.507955 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.507922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33a60d5b-55fa-475e-b06c-918705cb4b90-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.508141 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.507972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33a60d5b-55fa-475e-b06c-918705cb4b90-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.508141 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.508074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33a60d5b-55fa-475e-b06c-918705cb4b90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.508269 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.508190 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9fd2\" (UniqueName: \"kubernetes.io/projected/33a60d5b-55fa-475e-b06c-918705cb4b90-kube-api-access-k9fd2\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.609838 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.609795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33a60d5b-55fa-475e-b06c-918705cb4b90-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.609838 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.609840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33a60d5b-55fa-475e-b06c-918705cb4b90-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.610083 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.609876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33a60d5b-55fa-475e-b06c-918705cb4b90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.610083 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.609908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9fd2\" (UniqueName: \"kubernetes.io/projected/33a60d5b-55fa-475e-b06c-918705cb4b90-kube-api-access-k9fd2\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.610201 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.610177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33a60d5b-55fa-475e-b06c-918705cb4b90-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.610566 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.610512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33a60d5b-55fa-475e-b06c-918705cb4b90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.612519 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.612496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33a60d5b-55fa-475e-b06c-918705cb4b90-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.619396 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.619372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9fd2\" (UniqueName: \"kubernetes.io/projected/33a60d5b-55fa-475e-b06c-918705cb4b90-kube-api-access-k9fd2\") pod \"isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.674244 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.674214 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:07:55.799869 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:55.799840 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"]
Apr 23 18:07:55.801936 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:07:55.801902 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33a60d5b_55fa_475e_b06c_918705cb4b90.slice/crio-b5f04ae08718c3645cc9cd3efac1e0adc48aa0e2a011056d2f02851c7654b2b9 WatchSource:0}: Error finding container b5f04ae08718c3645cc9cd3efac1e0adc48aa0e2a011056d2f02851c7654b2b9: Status 404 returned error can't find the container with id b5f04ae08718c3645cc9cd3efac1e0adc48aa0e2a011056d2f02851c7654b2b9
Apr 23 18:07:56.281931 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:07:56.281895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerStarted","Data":"b5f04ae08718c3645cc9cd3efac1e0adc48aa0e2a011056d2f02851c7654b2b9"}
Apr 23 18:08:00.297876 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:00.297784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerStarted","Data":"f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8"}
Apr 23 18:08:04.311921 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:04.311885 2576 generic.go:358] "Generic (PLEG): container finished" podID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerID="f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8" exitCode=0
Apr 23 18:08:04.312328 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:04.311962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerDied","Data":"f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8"}
Apr 23 18:08:17.365544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:17.365506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerStarted","Data":"946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a"}
Apr 23 18:08:20.380318 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:20.380269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerStarted","Data":"972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3"}
Apr 23 18:08:23.392478 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:23.392428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerStarted","Data":"00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec"}
Apr 23 18:08:23.392884 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:23.392670 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:08:23.413115 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:23.413053 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podStartSLOduration=1.34786067 podStartE2EDuration="28.413039582s" podCreationTimestamp="2026-04-23 18:07:55 +0000 UTC" firstStartedPulling="2026-04-23 18:07:55.803859814 +0000 UTC m=+564.971187126"
lastFinishedPulling="2026-04-23 18:08:22.86903874 +0000 UTC m=+592.036366038" observedRunningTime="2026-04-23 18:08:23.411588791 +0000 UTC m=+592.578916113" watchObservedRunningTime="2026-04-23 18:08:23.413039582 +0000 UTC m=+592.580366900" Apr 23 18:08:24.395735 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:24.395694 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" Apr 23 18:08:24.395735 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:24.395743 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" Apr 23 18:08:24.397233 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:24.397187 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 18:08:24.397915 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:24.397889 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:24.400871 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:24.400852 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" Apr 23 18:08:25.399252 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:25.399206 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.33:8080: connect: connection refused" Apr 23 18:08:25.399733 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:25.399567 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:26.402431 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:26.402386 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 18:08:26.402881 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:26.402627 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:31.316052 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:31.316025 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:08:31.316664 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:31.316648 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:08:36.402606 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:36.402548 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection 
refused" Apr 23 18:08:36.405020 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:36.402988 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:46.402772 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:46.402672 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 18:08:46.403148 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:46.403070 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:08:56.403207 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:56.403146 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 18:08:56.403631 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:08:56.403582 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:09:06.402973 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:06.402921 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 18:09:06.403496 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:06.403340 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:09:16.402636 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:16.402589 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 18:09:16.403170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:16.403122 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:09:26.403450 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:26.403407 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" Apr 23 18:09:26.403940 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:26.403500 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" Apr 23 18:09:40.405331 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.405295 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"] Apr 23 
18:09:40.405907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.405789 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" containerID="cri-o://946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a" gracePeriod=30 Apr 23 18:09:40.405907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.405820 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" containerID="cri-o://00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec" gracePeriod=30 Apr 23 18:09:40.406016 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.405834 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy" containerID="cri-o://972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3" gracePeriod=30 Apr 23 18:09:40.514634 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.514590 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"] Apr 23 18:09:40.517577 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.517557 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.519450 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.519429 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 23 18:09:40.519591 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.519559 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 23 18:09:40.527962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.527939 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"] Apr 23 18:09:40.599184 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.599145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.599368 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.599214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.599368 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.599235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82sjk\" 
(UniqueName: \"kubernetes.io/projected/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kube-api-access-82sjk\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.599368 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.599270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.676069 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.675973 2576 generic.go:358] "Generic (PLEG): container finished" podID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerID="972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3" exitCode=2 Apr 23 18:09:40.676069 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.676034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerDied","Data":"972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3"} Apr 23 18:09:40.700583 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.700545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.700741 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:09:40.700594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82sjk\" (UniqueName: \"kubernetes.io/projected/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kube-api-access-82sjk\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.700741 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.700658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.700741 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.700699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.701161 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.701133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.701340 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:09:40.701321 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.703232 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.703213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.708942 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.708923 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82sjk\" (UniqueName: \"kubernetes.io/projected/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kube-api-access-82sjk\") pod \"isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.830183 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.830146 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:40.955893 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.955871 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"] Apr 23 18:09:40.958512 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:09:40.958482 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda09d0da1_dbd4_4d56_a9bf_a15385dca7f3.slice/crio-5a0bd39a1905c70e6a7d500eb5641b64b06fa3c0e4ff4abb054c09146f632389 WatchSource:0}: Error finding container 5a0bd39a1905c70e6a7d500eb5641b64b06fa3c0e4ff4abb054c09146f632389: Status 404 returned error can't find the container with id 5a0bd39a1905c70e6a7d500eb5641b64b06fa3c0e4ff4abb054c09146f632389 Apr 23 18:09:40.960284 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:40.960270 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:09:41.680729 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:41.680691 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerStarted","Data":"c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1"} Apr 23 18:09:41.680729 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:41.680729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerStarted","Data":"5a0bd39a1905c70e6a7d500eb5641b64b06fa3c0e4ff4abb054c09146f632389"} Apr 23 18:09:44.396937 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:44.396897 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused" Apr 23 18:09:44.692721 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:44.692634 2576 generic.go:358] "Generic (PLEG): container finished" podID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerID="946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a" exitCode=0 Apr 23 18:09:44.692867 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:44.692714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerDied","Data":"946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a"} Apr 23 18:09:45.697208 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:45.697175 2576 generic.go:358] "Generic (PLEG): container finished" podID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerID="c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1" exitCode=0 Apr 23 18:09:45.697681 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:45.697252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerDied","Data":"c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1"} Apr 23 18:09:46.403113 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:46.403065 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 18:09:46.403440 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:09:46.403409 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:09:46.702711 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:46.702631 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerStarted","Data":"52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b"} Apr 23 18:09:46.702711 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:46.702674 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerStarted","Data":"eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8"} Apr 23 18:09:46.702711 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:46.702684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerStarted","Data":"e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9"} Apr 23 18:09:46.703133 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:46.702975 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:46.703133 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:46.703116 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:46.704350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:46.704323 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused" Apr 23 18:09:46.723497 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:46.723438 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podStartSLOduration=6.723425036 podStartE2EDuration="6.723425036s" podCreationTimestamp="2026-04-23 18:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:09:46.721729846 +0000 UTC m=+675.889057166" watchObservedRunningTime="2026-04-23 18:09:46.723425036 +0000 UTC m=+675.890752359" Apr 23 18:09:47.706755 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:47.706725 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:09:47.707241 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:47.706875 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused" Apr 23 18:09:47.707756 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:47.707726 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:09:48.709970 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:48.709926 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused"
Apr 23 18:09:48.710360 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:48.710317 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:09:49.396616 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:49.396573 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused"
Apr 23 18:09:53.714240 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:53.714211 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"
Apr 23 18:09:53.714792 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:53.714762 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused"
Apr 23 18:09:53.715071 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:53.715044 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:09:54.395905 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:54.395865 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused"
Apr 23 18:09:54.396088 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:54.396015 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:09:56.403112 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:56.403061 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 23 18:09:56.403566 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:56.403413 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:09:59.396680 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:09:59.396638 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused"
Apr 23 18:10:03.714735 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:03.714688 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused"
Apr 23 18:10:03.715127 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:03.715106 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:10:04.396245 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:04.396202 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused"
Apr 23 18:10:06.402905 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:06.402855 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 23 18:10:06.403321 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:06.403021 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:10:06.403321 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:06.403217 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:10:06.403420 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:06.403332 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:10:09.396906 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:09.396854 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused"
Apr 23 18:10:10.548815 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.548787 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:10:10.656576 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.656546 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9fd2\" (UniqueName: \"kubernetes.io/projected/33a60d5b-55fa-475e-b06c-918705cb4b90-kube-api-access-k9fd2\") pod \"33a60d5b-55fa-475e-b06c-918705cb4b90\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") "
Apr 23 18:10:10.656723 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.656597 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33a60d5b-55fa-475e-b06c-918705cb4b90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"33a60d5b-55fa-475e-b06c-918705cb4b90\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") "
Apr 23 18:10:10.656723 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.656637 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33a60d5b-55fa-475e-b06c-918705cb4b90-kserve-provision-location\") pod \"33a60d5b-55fa-475e-b06c-918705cb4b90\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") "
Apr 23 18:10:10.656723 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.656679 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33a60d5b-55fa-475e-b06c-918705cb4b90-proxy-tls\") pod \"33a60d5b-55fa-475e-b06c-918705cb4b90\" (UID: \"33a60d5b-55fa-475e-b06c-918705cb4b90\") "
Apr 23 18:10:10.657019 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.656989 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33a60d5b-55fa-475e-b06c-918705cb4b90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "33a60d5b-55fa-475e-b06c-918705cb4b90" (UID: "33a60d5b-55fa-475e-b06c-918705cb4b90"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:10:10.657140 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.657013 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33a60d5b-55fa-475e-b06c-918705cb4b90-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "33a60d5b-55fa-475e-b06c-918705cb4b90" (UID: "33a60d5b-55fa-475e-b06c-918705cb4b90"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:10:10.659129 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.659109 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33a60d5b-55fa-475e-b06c-918705cb4b90-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "33a60d5b-55fa-475e-b06c-918705cb4b90" (UID: "33a60d5b-55fa-475e-b06c-918705cb4b90"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:10:10.659214 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.659143 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a60d5b-55fa-475e-b06c-918705cb4b90-kube-api-access-k9fd2" (OuterVolumeSpecName: "kube-api-access-k9fd2") pod "33a60d5b-55fa-475e-b06c-918705cb4b90" (UID: "33a60d5b-55fa-475e-b06c-918705cb4b90"). InnerVolumeSpecName "kube-api-access-k9fd2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:10:10.758271 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.758222 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33a60d5b-55fa-475e-b06c-918705cb4b90-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:10:10.758271 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.758263 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33a60d5b-55fa-475e-b06c-918705cb4b90-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:10:10.758271 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.758274 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k9fd2\" (UniqueName: \"kubernetes.io/projected/33a60d5b-55fa-475e-b06c-918705cb4b90-kube-api-access-k9fd2\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:10:10.758271 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.758284 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33a60d5b-55fa-475e-b06c-918705cb4b90-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:10:10.789214 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.789180 2576 generic.go:358] "Generic (PLEG): container finished" podID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerID="00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec" exitCode=0
Apr 23 18:10:10.789358 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.789229 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerDied","Data":"00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec"}
Apr 23 18:10:10.789358 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.789256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk" event={"ID":"33a60d5b-55fa-475e-b06c-918705cb4b90","Type":"ContainerDied","Data":"b5f04ae08718c3645cc9cd3efac1e0adc48aa0e2a011056d2f02851c7654b2b9"}
Apr 23 18:10:10.789358 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.789273 2576 scope.go:117] "RemoveContainer" containerID="00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec"
Apr 23 18:10:10.789358 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.789288 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"
Apr 23 18:10:10.798531 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.798516 2576 scope.go:117] "RemoveContainer" containerID="972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3"
Apr 23 18:10:10.805871 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.805852 2576 scope.go:117] "RemoveContainer" containerID="946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a"
Apr 23 18:10:10.811538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.811514 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"]
Apr 23 18:10:10.813708 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.813619 2576 scope.go:117] "RemoveContainer" containerID="f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8"
Apr 23 18:10:10.815317 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.815299 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-75bcd877cc-2tvjk"]
Apr 23 18:10:10.820778 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.820758 2576 scope.go:117] "RemoveContainer" containerID="00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec"
Apr 23 18:10:10.821023 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:10:10.821004 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec\": container with ID starting with 00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec not found: ID does not exist" containerID="00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec"
Apr 23 18:10:10.821096 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.821031 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec"} err="failed to get container status \"00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec\": rpc error: code = NotFound desc = could not find container \"00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec\": container with ID starting with 00845fe66c24ebcda409e257421ff926fb568aaeb0ccd9af590bed92f9cb07ec not found: ID does not exist"
Apr 23 18:10:10.821096 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.821049 2576 scope.go:117] "RemoveContainer" containerID="972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3"
Apr 23 18:10:10.821261 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:10:10.821244 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3\": container with ID starting with 972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3 not found: ID does not exist" containerID="972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3"
Apr 23 18:10:10.821302 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.821268 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3"} err="failed to get container status \"972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3\": rpc error: code = NotFound desc = could not find container \"972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3\": container with ID starting with 972e6b6f19dcbd899d1dc11630a704c3e72d8099cc117468850c5eb6d6b451a3 not found: ID does not exist"
Apr 23 18:10:10.821302 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.821284 2576 scope.go:117] "RemoveContainer" containerID="946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a"
Apr 23 18:10:10.821480 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:10:10.821447 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a\": container with ID starting with 946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a not found: ID does not exist" containerID="946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a"
Apr 23 18:10:10.821545 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.821487 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a"} err="failed to get container status \"946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a\": rpc error: code = NotFound desc = could not find container \"946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a\": container with ID starting with 946a8a271ecc947f03cb8b400545807e2ceadde3785dcfb92c14157a2d36e25a not found: ID does not exist"
Apr 23 18:10:10.821545 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.821504 2576 scope.go:117] "RemoveContainer" containerID="f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8"
Apr 23 18:10:10.821726 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:10:10.821711 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8\": container with ID starting with f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8 not found: ID does not exist" containerID="f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8"
Apr 23 18:10:10.821763 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:10.821731 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8"} err="failed to get container status \"f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8\": rpc error: code = NotFound desc = could not find container \"f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8\": container with ID starting with f84686f858005be3383a7e0d712224e80c9fae7e661bb9c339af61fdb0c420e8 not found: ID does not exist"
Apr 23 18:10:11.415412 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:11.415378 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" path="/var/lib/kubelet/pods/33a60d5b-55fa-475e-b06c-918705cb4b90/volumes"
Apr 23 18:10:13.715610 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:13.715564 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused"
Apr 23 18:10:13.716085 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:13.715919 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:10:23.715121 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:23.715082 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused"
Apr 23 18:10:23.715636 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:23.715612 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:10:33.715172 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:33.715124 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused"
Apr 23 18:10:33.715587 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:33.715554 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:10:43.715411 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:43.715373 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused"
Apr 23 18:10:43.715904 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:43.715871 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:10:53.715187 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:53.715155 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"
Apr 23 18:10:53.715776 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:10:53.715367 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"
Apr 23 18:11:05.588595 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.588563 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"]
Apr 23 18:11:05.589174 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.589017 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" containerID="cri-o://e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9" gracePeriod=30
Apr 23 18:11:05.589174 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.589126 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" containerID="cri-o://52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b" gracePeriod=30
Apr 23 18:11:05.589304 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.589180 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kube-rbac-proxy" containerID="cri-o://eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8" gracePeriod=30
Apr 23 18:11:05.637241 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637208 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"]
Apr 23 18:11:05.637679 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637660 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container"
Apr 23 18:11:05.637764 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637680 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container"
Apr 23 18:11:05.637764 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637698 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="storage-initializer"
Apr 23 18:11:05.637764 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637706 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="storage-initializer"
Apr 23 18:11:05.637764 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637719 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent"
Apr 23 18:11:05.637764 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637727 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent"
Apr 23 18:11:05.638007 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637790 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy"
Apr 23 18:11:05.638007 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637800 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy"
Apr 23 18:11:05.638007 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637893 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kube-rbac-proxy"
Apr 23 18:11:05.638007 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637906 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="agent"
Apr 23 18:11:05.638007 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.637917 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="33a60d5b-55fa-475e-b06c-918705cb4b90" containerName="kserve-container"
Apr 23 18:11:05.641275 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.641256 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.643380 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.643354 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\""
Apr 23 18:11:05.643380 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.643375 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\""
Apr 23 18:11:05.649493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.649471 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"]
Apr 23 18:11:05.702967 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.702936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3951c0b5-f95b-4db0-b1a5-795144405449-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-zcm4t\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.703120 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.702989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3951c0b5-f95b-4db0-b1a5-795144405449-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-zcm4t\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.703120 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.703100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4wdk\" (UniqueName: \"kubernetes.io/projected/3951c0b5-f95b-4db0-b1a5-795144405449-kube-api-access-h4wdk\") pod \"message-dumper-predictor-c7d86bcbd-zcm4t\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.803906 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.803874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wdk\" (UniqueName: \"kubernetes.io/projected/3951c0b5-f95b-4db0-b1a5-795144405449-kube-api-access-h4wdk\") pod \"message-dumper-predictor-c7d86bcbd-zcm4t\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.804102 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.803936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3951c0b5-f95b-4db0-b1a5-795144405449-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-zcm4t\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.804102 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.803979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3951c0b5-f95b-4db0-b1a5-795144405449-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-zcm4t\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.804822 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.804794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3951c0b5-f95b-4db0-b1a5-795144405449-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-zcm4t\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.806759 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.806733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3951c0b5-f95b-4db0-b1a5-795144405449-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-zcm4t\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.812508 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.812484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4wdk\" (UniqueName: \"kubernetes.io/projected/3951c0b5-f95b-4db0-b1a5-795144405449-kube-api-access-h4wdk\") pod \"message-dumper-predictor-c7d86bcbd-zcm4t\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.952162 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.952080 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:05.987726 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.987690 2576 generic.go:358] "Generic (PLEG): container finished" podID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerID="eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8" exitCode=2
Apr 23 18:11:05.987873 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:05.987751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerDied","Data":"eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8"}
Apr 23 18:11:06.075712 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:06.075685 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"]
Apr 23 18:11:06.077912 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:11:06.077881 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3951c0b5_f95b_4db0_b1a5_795144405449.slice/crio-708da9899c26d6f24258bf59909402ddcf3aaf29132dd29f5b0cbbeb45d885b5 WatchSource:0}: Error finding container 708da9899c26d6f24258bf59909402ddcf3aaf29132dd29f5b0cbbeb45d885b5: Status 404 returned error can't find the container with id 708da9899c26d6f24258bf59909402ddcf3aaf29132dd29f5b0cbbeb45d885b5
Apr 23 18:11:06.993841 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:06.993797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" event={"ID":"3951c0b5-f95b-4db0-b1a5-795144405449","Type":"ContainerStarted","Data":"708da9899c26d6f24258bf59909402ddcf3aaf29132dd29f5b0cbbeb45d885b5"}
Apr 23 18:11:07.999256 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:07.999215 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" event={"ID":"3951c0b5-f95b-4db0-b1a5-795144405449","Type":"ContainerStarted","Data":"3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb"}
Apr 23 18:11:07.999256 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:07.999260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" event={"ID":"3951c0b5-f95b-4db0-b1a5-795144405449","Type":"ContainerStarted","Data":"b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c"}
Apr 23 18:11:07.999704 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:07.999355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:08.016446 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:08.016393 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" podStartSLOduration=1.792538152 podStartE2EDuration="3.016377395s" podCreationTimestamp="2026-04-23 18:11:05 +0000 UTC" firstStartedPulling="2026-04-23 18:11:06.079813608 +0000 UTC m=+755.247140918" lastFinishedPulling="2026-04-23 18:11:07.303652851 +0000 UTC m=+756.470980161" observedRunningTime="2026-04-23 18:11:08.01529746 +0000 UTC m=+757.182624779" watchObservedRunningTime="2026-04-23 18:11:08.016377395 +0000 UTC m=+757.183704715"
Apr 23 18:11:08.710572 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:08.710529 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused"
Apr 23 18:11:09.005961 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:09.005881 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:09.007597 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:09.007575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"
Apr 23 18:11:10.011347 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:10.011316 2576 generic.go:358] "Generic (PLEG): container finished" podID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerID="e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9" exitCode=0
Apr 23 18:11:10.011833 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:10.011378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerDied","Data":"e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9"}
Apr 23 18:11:13.710970 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:13.710936 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused"
Apr 23 18:11:13.715574 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:13.715548 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused"
Apr 23 18:11:13.715851 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:13.715826 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"
podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:11:16.019541 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:16.019513 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" Apr 23 18:11:18.710859 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:18.710823 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 23 18:11:18.711287 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:18.710939 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:11:23.710407 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:23.710369 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 23 18:11:23.714781 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:23.714744 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused" Apr 23 18:11:23.715114 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:23.715093 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:11:25.706687 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.706647 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"] Apr 23 18:11:25.710300 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.710283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.712381 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.712358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 23 18:11:25.712527 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.712504 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 23 18:11:25.722894 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.722872 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"] Apr 23 18:11:25.767303 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.767269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c13a039-c44c-48a9-909c-99dd4f81e7d1-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.767501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.767347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q8nq9\" (UniqueName: \"kubernetes.io/projected/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kube-api-access-q8nq9\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.767501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.767394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kserve-provision-location\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.767501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.767428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c13a039-c44c-48a9-909c-99dd4f81e7d1-proxy-tls\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.868694 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.868665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c13a039-c44c-48a9-909c-99dd4f81e7d1-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.868850 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.868723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8nq9\" (UniqueName: \"kubernetes.io/projected/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kube-api-access-q8nq9\") pod 
\"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.868850 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.868747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kserve-provision-location\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.868850 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.868772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c13a039-c44c-48a9-909c-99dd4f81e7d1-proxy-tls\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.869250 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.869211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kserve-provision-location\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.869343 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.869276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c13a039-c44c-48a9-909c-99dd4f81e7d1-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.871224 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.871208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c13a039-c44c-48a9-909c-99dd4f81e7d1-proxy-tls\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:25.878084 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:25.878061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8nq9\" (UniqueName: \"kubernetes.io/projected/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kube-api-access-q8nq9\") pod \"isvc-logger-predictor-6c87ff55b4-bhg6n\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:26.020491 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:26.020384 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:26.150234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:26.150208 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"] Apr 23 18:11:26.151814 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:11:26.151781 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c13a039_c44c_48a9_909c_99dd4f81e7d1.slice/crio-78d660f3624dfb4239ac577cffab7368f7a1b0ee4b727d97c20cab433cbb02a0 WatchSource:0}: Error finding container 78d660f3624dfb4239ac577cffab7368f7a1b0ee4b727d97c20cab433cbb02a0: Status 404 returned error can't find the container with id 78d660f3624dfb4239ac577cffab7368f7a1b0ee4b727d97c20cab433cbb02a0 Apr 23 18:11:27.071667 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:27.071623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerStarted","Data":"b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe"} Apr 23 18:11:27.071667 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:27.071670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerStarted","Data":"78d660f3624dfb4239ac577cffab7368f7a1b0ee4b727d97c20cab433cbb02a0"} Apr 23 18:11:28.710922 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:28.710885 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 23 18:11:30.085453 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:30.085419 2576 generic.go:358] "Generic (PLEG): container finished" podID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerID="b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe" exitCode=0 Apr 23 18:11:30.085847 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:30.085480 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerDied","Data":"b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe"} Apr 23 18:11:31.090980 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:31.090948 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerStarted","Data":"01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c"} Apr 23 18:11:31.090980 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:31.090987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerStarted","Data":"6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78"} Apr 23 18:11:31.091411 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:31.090997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerStarted","Data":"ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551"} Apr 23 18:11:31.091411 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:31.091346 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:31.091513 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:31.091486 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:31.092854 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:31.092827 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 18:11:31.111168 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:31.111128 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podStartSLOduration=6.11111549 podStartE2EDuration="6.11111549s" podCreationTimestamp="2026-04-23 18:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:11:31.109510134 +0000 
UTC m=+780.276837453" watchObservedRunningTime="2026-04-23 18:11:31.11111549 +0000 UTC m=+780.278442809" Apr 23 18:11:32.094476 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:32.094425 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 18:11:32.094874 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:32.094485 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:32.095528 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:32.095502 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:11:33.098472 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:33.098411 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 18:11:33.098886 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:33.098770 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:11:33.710512 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:33.710432 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" 
podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 23 18:11:33.714758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:33.714728 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:5000: connect: connection refused" Apr 23 18:11:33.714884 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:33.714868 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:11:33.715100 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:33.715080 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:11:33.715201 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:33.715187 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:11:35.730263 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.730242 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:11:35.864399 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.864313 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82sjk\" (UniqueName: \"kubernetes.io/projected/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kube-api-access-82sjk\") pod \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " Apr 23 18:11:35.864597 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.864396 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " Apr 23 18:11:35.864597 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.864453 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-proxy-tls\") pod \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " Apr 23 18:11:35.864597 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.864510 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kserve-provision-location\") pod \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\" (UID: \"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3\") " Apr 23 18:11:35.864844 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.864820 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" (UID: "a09d0da1-dbd4-4d56-a9bf-a15385dca7f3"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:11:35.864905 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.864865 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" (UID: "a09d0da1-dbd4-4d56-a9bf-a15385dca7f3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:11:35.866681 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.866655 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" (UID: "a09d0da1-dbd4-4d56-a9bf-a15385dca7f3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:11:35.866781 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.866697 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kube-api-access-82sjk" (OuterVolumeSpecName: "kube-api-access-82sjk") pod "a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" (UID: "a09d0da1-dbd4-4d56-a9bf-a15385dca7f3"). InnerVolumeSpecName "kube-api-access-82sjk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:11:35.965394 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.965361 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82sjk\" (UniqueName: \"kubernetes.io/projected/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kube-api-access-82sjk\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:11:35.965394 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.965390 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:11:35.965599 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.965406 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:11:35.965599 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:35.965419 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:11:36.110135 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.110097 2576 generic.go:358] "Generic (PLEG): container finished" podID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerID="52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b" exitCode=0 Apr 23 18:11:36.110290 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.110188 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" Apr 23 18:11:36.110290 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.110192 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerDied","Data":"52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b"} Apr 23 18:11:36.110290 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.110235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b" event={"ID":"a09d0da1-dbd4-4d56-a9bf-a15385dca7f3","Type":"ContainerDied","Data":"5a0bd39a1905c70e6a7d500eb5641b64b06fa3c0e4ff4abb054c09146f632389"} Apr 23 18:11:36.110290 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.110252 2576 scope.go:117] "RemoveContainer" containerID="52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b" Apr 23 18:11:36.119612 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.119593 2576 scope.go:117] "RemoveContainer" containerID="eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8" Apr 23 18:11:36.127415 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.127394 2576 scope.go:117] "RemoveContainer" containerID="e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9" Apr 23 18:11:36.135095 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.135070 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"] Apr 23 18:11:36.135612 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.135596 2576 scope.go:117] "RemoveContainer" containerID="c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1" Apr 23 18:11:36.138664 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.138642 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-db5fdd56d-j4h5b"] Apr 23 18:11:36.143094 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.143079 2576 scope.go:117] "RemoveContainer" containerID="52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b" Apr 23 18:11:36.143359 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:11:36.143341 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b\": container with ID starting with 52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b not found: ID does not exist" containerID="52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b" Apr 23 18:11:36.143399 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.143368 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b"} err="failed to get container status \"52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b\": rpc error: code = NotFound desc = could not find container \"52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b\": container with ID starting with 52fa39ee2f116537371d603884de638182ea5134c7ce5c0fc5627d4c09505e7b not found: ID does not exist" Apr 23 18:11:36.143399 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.143387 2576 scope.go:117] "RemoveContainer" containerID="eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8" Apr 23 18:11:36.143651 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:11:36.143632 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8\": container with ID starting with eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8 not found: ID does not exist" 
containerID="eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8" Apr 23 18:11:36.143691 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.143656 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8"} err="failed to get container status \"eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8\": rpc error: code = NotFound desc = could not find container \"eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8\": container with ID starting with eed1e655534ef67421381a34e5e590f4937f450c58add0b6ebde40e297939ed8 not found: ID does not exist" Apr 23 18:11:36.143691 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.143671 2576 scope.go:117] "RemoveContainer" containerID="e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9" Apr 23 18:11:36.143890 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:11:36.143859 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9\": container with ID starting with e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9 not found: ID does not exist" containerID="e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9" Apr 23 18:11:36.143933 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.143886 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9"} err="failed to get container status \"e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9\": rpc error: code = NotFound desc = could not find container \"e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9\": container with ID starting with e907607a9c37b11537404837d18388a6b08829f3f5547a5b47067957894dfbb9 not found: ID does not exist" Apr 23 
18:11:36.143933 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.143900 2576 scope.go:117] "RemoveContainer" containerID="c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1" Apr 23 18:11:36.144117 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:11:36.144100 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1\": container with ID starting with c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1 not found: ID does not exist" containerID="c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1" Apr 23 18:11:36.144159 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:36.144128 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1"} err="failed to get container status \"c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1\": rpc error: code = NotFound desc = could not find container \"c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1\": container with ID starting with c381135a274feb3c1cb3c127d8cd7b7cdad88099c253e3297c14d8c9dd6425d1 not found: ID does not exist" Apr 23 18:11:37.416025 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:37.415994 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" path="/var/lib/kubelet/pods/a09d0da1-dbd4-4d56-a9bf-a15385dca7f3/volumes" Apr 23 18:11:38.102685 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:38.102655 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:11:38.103406 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:38.103374 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" 
podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 18:11:38.103609 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:38.103584 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:11:48.103659 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:48.103572 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 18:11:48.104126 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:48.104097 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:11:58.103659 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:58.103615 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 18:11:58.104112 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:11:58.104084 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:12:08.103775 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:12:08.103734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 18:12:08.104210 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:08.104190 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:12:18.103241 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:18.103192 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 18:12:18.103722 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:18.103696 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:12:28.103932 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:28.103886 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 18:12:28.104444 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:28.104354 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:12:38.104684 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:38.104653 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:12:38.105115 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:38.104857 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" Apr 23 18:12:50.727793 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.727764 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-zcm4t_3951c0b5-f95b-4db0-b1a5-795144405449/kserve-container/0.log" Apr 23 18:12:50.905492 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.905444 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"] Apr 23 18:12:50.905800 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.905752 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" containerName="kserve-container" containerID="cri-o://b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c" gracePeriod=30 Apr 23 18:12:50.905871 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.905789 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" containerName="kube-rbac-proxy" containerID="cri-o://3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb" gracePeriod=30 Apr 23 18:12:50.996374 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.996290 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"] Apr 23 
18:12:50.996844 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.996825 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" Apr 23 18:12:50.996962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.996863 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" Apr 23 18:12:50.996962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.996885 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="storage-initializer" Apr 23 18:12:50.996962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.996894 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="storage-initializer" Apr 23 18:12:50.996962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.996934 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kube-rbac-proxy" Apr 23 18:12:50.996962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.996942 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kube-rbac-proxy" Apr 23 18:12:50.996962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.996955 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" Apr 23 18:12:50.996962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.996965 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" Apr 23 18:12:50.997307 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.997071 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" 
containerName="kube-rbac-proxy" Apr 23 18:12:50.997307 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.997085 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="kserve-container" Apr 23 18:12:50.997307 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:50.997097 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a09d0da1-dbd4-4d56-a9bf-a15385dca7f3" containerName="agent" Apr 23 18:12:51.000517 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.000496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.002879 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.002753 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\"" Apr 23 18:12:51.002879 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.002754 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\"" Apr 23 18:12:51.009823 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.009742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/815c3ab5-d87a-43b0-ae94-477e48b22583-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.009823 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.009811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815c3ab5-d87a-43b0-ae94-477e48b22583-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: 
\"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.010005 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.009869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815c3ab5-d87a-43b0-ae94-477e48b22583-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.010189 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.010008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p7t9\" (UniqueName: \"kubernetes.io/projected/815c3ab5-d87a-43b0-ae94-477e48b22583-kube-api-access-6p7t9\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.011287 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.011265 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"] Apr 23 18:12:51.015265 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.015239 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"] Apr 23 18:12:51.015797 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.015635 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" containerID="cri-o://ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551" gracePeriod=30 Apr 23 18:12:51.015797 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.015651 2576 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" containerID="cri-o://01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c" gracePeriod=30 Apr 23 18:12:51.015797 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.015674 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy" containerID="cri-o://6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78" gracePeriod=30 Apr 23 18:12:51.016075 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.016008 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.35:8643/healthz\": dial tcp 10.132.0.35:8643: connect: connection refused" Apr 23 18:12:51.111006 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.110977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p7t9\" (UniqueName: \"kubernetes.io/projected/815c3ab5-d87a-43b0-ae94-477e48b22583-kube-api-access-6p7t9\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.111168 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.111015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/815c3ab5-d87a-43b0-ae94-477e48b22583-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.111168 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.111034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815c3ab5-d87a-43b0-ae94-477e48b22583-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.111168 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:12:51.111154 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-predictor-serving-cert: secret "isvc-lightgbm-predictor-serving-cert" not found Apr 23 18:12:51.111335 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:12:51.111205 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815c3ab5-d87a-43b0-ae94-477e48b22583-proxy-tls podName:815c3ab5-d87a-43b0-ae94-477e48b22583 nodeName:}" failed. No retries permitted until 2026-04-23 18:12:51.611188544 +0000 UTC m=+860.778515840 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/815c3ab5-d87a-43b0-ae94-477e48b22583-proxy-tls") pod "isvc-lightgbm-predictor-bdf964bd-j4tlp" (UID: "815c3ab5-d87a-43b0-ae94-477e48b22583") : secret "isvc-lightgbm-predictor-serving-cert" not found Apr 23 18:12:51.111335 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.111234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815c3ab5-d87a-43b0-ae94-477e48b22583-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.111450 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.111415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/815c3ab5-d87a-43b0-ae94-477e48b22583-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.111828 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.111808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815c3ab5-d87a-43b0-ae94-477e48b22583-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.120644 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.120616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p7t9\" (UniqueName: \"kubernetes.io/projected/815c3ab5-d87a-43b0-ae94-477e48b22583-kube-api-access-6p7t9\") pod 
\"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:12:51.160108 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.160086 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" Apr 23 18:12:51.212075 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.212044 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3951c0b5-f95b-4db0-b1a5-795144405449-proxy-tls\") pod \"3951c0b5-f95b-4db0-b1a5-795144405449\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " Apr 23 18:12:51.212274 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.212092 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4wdk\" (UniqueName: \"kubernetes.io/projected/3951c0b5-f95b-4db0-b1a5-795144405449-kube-api-access-h4wdk\") pod \"3951c0b5-f95b-4db0-b1a5-795144405449\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " Apr 23 18:12:51.212274 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.212137 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3951c0b5-f95b-4db0-b1a5-795144405449-message-dumper-kube-rbac-proxy-sar-config\") pod \"3951c0b5-f95b-4db0-b1a5-795144405449\" (UID: \"3951c0b5-f95b-4db0-b1a5-795144405449\") " Apr 23 18:12:51.212567 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.212539 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3951c0b5-f95b-4db0-b1a5-795144405449-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "3951c0b5-f95b-4db0-b1a5-795144405449" (UID: "3951c0b5-f95b-4db0-b1a5-795144405449"). 
InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:12:51.214374 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.214346 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3951c0b5-f95b-4db0-b1a5-795144405449-kube-api-access-h4wdk" (OuterVolumeSpecName: "kube-api-access-h4wdk") pod "3951c0b5-f95b-4db0-b1a5-795144405449" (UID: "3951c0b5-f95b-4db0-b1a5-795144405449"). InnerVolumeSpecName "kube-api-access-h4wdk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:12:51.214374 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.214359 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3951c0b5-f95b-4db0-b1a5-795144405449-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3951c0b5-f95b-4db0-b1a5-795144405449" (UID: "3951c0b5-f95b-4db0-b1a5-795144405449"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:12:51.313866 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.313792 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3951c0b5-f95b-4db0-b1a5-795144405449-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:12:51.313866 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.313821 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h4wdk\" (UniqueName: \"kubernetes.io/projected/3951c0b5-f95b-4db0-b1a5-795144405449-kube-api-access-h4wdk\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:12:51.313866 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.313833 2576 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3951c0b5-f95b-4db0-b1a5-795144405449-message-dumper-kube-rbac-proxy-sar-config\") on node 
\"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:12:51.372596 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.372563 2576 generic.go:358] "Generic (PLEG): container finished" podID="3951c0b5-f95b-4db0-b1a5-795144405449" containerID="3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb" exitCode=2 Apr 23 18:12:51.372596 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.372588 2576 generic.go:358] "Generic (PLEG): container finished" podID="3951c0b5-f95b-4db0-b1a5-795144405449" containerID="b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c" exitCode=2 Apr 23 18:12:51.372819 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.372639 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" Apr 23 18:12:51.372819 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.372649 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" event={"ID":"3951c0b5-f95b-4db0-b1a5-795144405449","Type":"ContainerDied","Data":"3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb"} Apr 23 18:12:51.372819 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.372689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" event={"ID":"3951c0b5-f95b-4db0-b1a5-795144405449","Type":"ContainerDied","Data":"b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c"} Apr 23 18:12:51.372819 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.372704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t" event={"ID":"3951c0b5-f95b-4db0-b1a5-795144405449","Type":"ContainerDied","Data":"708da9899c26d6f24258bf59909402ddcf3aaf29132dd29f5b0cbbeb45d885b5"} Apr 23 18:12:51.372819 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.372721 2576 scope.go:117] 
"RemoveContainer" containerID="3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb" Apr 23 18:12:51.375207 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.375177 2576 generic.go:358] "Generic (PLEG): container finished" podID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerID="6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78" exitCode=2 Apr 23 18:12:51.375320 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.375246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerDied","Data":"6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78"} Apr 23 18:12:51.381564 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.381544 2576 scope.go:117] "RemoveContainer" containerID="b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c" Apr 23 18:12:51.389671 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.389654 2576 scope.go:117] "RemoveContainer" containerID="3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb" Apr 23 18:12:51.389914 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:12:51.389894 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb\": container with ID starting with 3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb not found: ID does not exist" containerID="3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb" Apr 23 18:12:51.389963 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.389922 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb"} err="failed to get container status \"3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb\": rpc error: code = NotFound desc = could 
not find container \"3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb\": container with ID starting with 3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb not found: ID does not exist" Apr 23 18:12:51.389963 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.389941 2576 scope.go:117] "RemoveContainer" containerID="b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c" Apr 23 18:12:51.390176 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:12:51.390160 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c\": container with ID starting with b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c not found: ID does not exist" containerID="b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c" Apr 23 18:12:51.390218 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.390182 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c"} err="failed to get container status \"b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c\": rpc error: code = NotFound desc = could not find container \"b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c\": container with ID starting with b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c not found: ID does not exist" Apr 23 18:12:51.390218 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.390197 2576 scope.go:117] "RemoveContainer" containerID="3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb" Apr 23 18:12:51.390395 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.390375 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb"} err="failed to get container status 
\"3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb\": rpc error: code = NotFound desc = could not find container \"3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb\": container with ID starting with 3e29da5b30a7f9f5f651de80ae1e9132f55c5e9e66e1a94c4bcfcaf78684e5fb not found: ID does not exist" Apr 23 18:12:51.390440 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.390396 2576 scope.go:117] "RemoveContainer" containerID="b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c" Apr 23 18:12:51.390880 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.390865 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c"} err="failed to get container status \"b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c\": rpc error: code = NotFound desc = could not find container \"b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c\": container with ID starting with b55dfabb6555b35cdb99655d294db109ffa8f60ad6ebf3d1afa3f4ef6ab5ec1c not found: ID does not exist" Apr 23 18:12:51.394388 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.394367 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"] Apr 23 18:12:51.397954 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.397936 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zcm4t"] Apr 23 18:12:51.415327 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.415305 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" path="/var/lib/kubelet/pods/3951c0b5-f95b-4db0-b1a5-795144405449/volumes" Apr 23 18:12:51.616071 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.616038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/815c3ab5-d87a-43b0-ae94-477e48b22583-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"
Apr 23 18:12:51.618634 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.618606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815c3ab5-d87a-43b0-ae94-477e48b22583-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-j4tlp\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"
Apr 23 18:12:51.915673 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:51.915585 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"
Apr 23 18:12:52.042212 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:52.042179 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"]
Apr 23 18:12:52.046049 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:12:52.046012 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815c3ab5_d87a_43b0_ae94_477e48b22583.slice/crio-fe253ac85ab653daebece407a0eb49f1e33e2916c6500c2e6363b617324867cf WatchSource:0}: Error finding container fe253ac85ab653daebece407a0eb49f1e33e2916c6500c2e6363b617324867cf: Status 404 returned error can't find the container with id fe253ac85ab653daebece407a0eb49f1e33e2916c6500c2e6363b617324867cf
Apr 23 18:12:52.381014 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:52.380981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" event={"ID":"815c3ab5-d87a-43b0-ae94-477e48b22583","Type":"ContainerStarted","Data":"428dd0bd663f245d4ba344d72968f4c55ade6608683a6ba2039b360a749041f1"}
Apr 23 18:12:52.381014 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:52.381020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" event={"ID":"815c3ab5-d87a-43b0-ae94-477e48b22583","Type":"ContainerStarted","Data":"fe253ac85ab653daebece407a0eb49f1e33e2916c6500c2e6363b617324867cf"}
Apr 23 18:12:53.099564 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:53.099499 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused"
Apr 23 18:12:55.392162 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:55.392123 2576 generic.go:358] "Generic (PLEG): container finished" podID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerID="ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551" exitCode=0
Apr 23 18:12:55.392537 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:55.392196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerDied","Data":"ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551"}
Apr 23 18:12:56.397128 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:56.397097 2576 generic.go:358] "Generic (PLEG): container finished" podID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerID="428dd0bd663f245d4ba344d72968f4c55ade6608683a6ba2039b360a749041f1" exitCode=0
Apr 23 18:12:56.397509 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:56.397175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" event={"ID":"815c3ab5-d87a-43b0-ae94-477e48b22583","Type":"ContainerDied","Data":"428dd0bd663f245d4ba344d72968f4c55ade6608683a6ba2039b360a749041f1"}
Apr 23 18:12:58.099487 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:58.099426 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused"
Apr 23 18:12:58.103763 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:58.103732 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 23 18:12:58.104280 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:12:58.104250 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:13:03.098605 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:03.098564 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused"
Apr 23 18:13:03.098973 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:03.098697 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"
Apr 23 18:13:03.427681 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:03.427600 2576
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" event={"ID":"815c3ab5-d87a-43b0-ae94-477e48b22583","Type":"ContainerStarted","Data":"4d8109d4c527df70f24db3fbd7dc45ce18a84fc29b4444636395cd7c9026b261"}
Apr 23 18:13:03.427681 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:03.427637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" event={"ID":"815c3ab5-d87a-43b0-ae94-477e48b22583","Type":"ContainerStarted","Data":"0136546367cf81d25c7fca86c8cc18c84b4fd41e59ac4dafafd7be11658d3d77"}
Apr 23 18:13:03.427863 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:03.427828 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"
Apr 23 18:13:03.447349 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:03.447305 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podStartSLOduration=7.279683478 podStartE2EDuration="13.447291944s" podCreationTimestamp="2026-04-23 18:12:50 +0000 UTC" firstStartedPulling="2026-04-23 18:12:56.398510295 +0000 UTC m=+865.565837594" lastFinishedPulling="2026-04-23 18:13:02.566118761 +0000 UTC m=+871.733446060" observedRunningTime="2026-04-23 18:13:03.445839671 +0000 UTC m=+872.613166989" watchObservedRunningTime="2026-04-23 18:13:03.447291944 +0000 UTC m=+872.614619262"
Apr 23 18:13:04.431376 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:04.431334 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"
Apr 23 18:13:04.432720 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:04.432692 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 23 18:13:05.435473 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:05.435416 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 23 18:13:08.099413 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:08.099370 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused"
Apr 23 18:13:08.103771 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:08.103742 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 23 18:13:08.104144 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:08.104118 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 18:13:10.439453 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:10.439375 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"
Apr 23 18:13:10.439907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:10.439882 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 23 18:13:13.099181 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:13.099132 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused"
Apr 23 18:13:18.099005 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:18.098959 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused"
Apr 23 18:13:18.103290 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:18.103254 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 23 18:13:18.103441 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:18.103425 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"
Apr 23 18:13:18.103565 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:18.103542 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23
18:13:18.103672 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:18.103658 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"
Apr 23 18:13:20.440680 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:20.440637 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 23 18:13:21.191052 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.191028 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"
Apr 23 18:13:21.275753 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.275671 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c13a039-c44c-48a9-909c-99dd4f81e7d1-isvc-logger-kube-rbac-proxy-sar-config\") pod \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") "
Apr 23 18:13:21.275753 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.275715 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8nq9\" (UniqueName: \"kubernetes.io/projected/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kube-api-access-q8nq9\") pod \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") "
Apr 23 18:13:21.275947 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.275776 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c13a039-c44c-48a9-909c-99dd4f81e7d1-proxy-tls\") pod \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") "
Apr 23 18:13:21.275947 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.275812 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kserve-provision-location\") pod \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\" (UID: \"4c13a039-c44c-48a9-909c-99dd4f81e7d1\") "
Apr 23 18:13:21.276232 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.276110 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c13a039-c44c-48a9-909c-99dd4f81e7d1-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "4c13a039-c44c-48a9-909c-99dd4f81e7d1" (UID: "4c13a039-c44c-48a9-909c-99dd4f81e7d1"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:13:21.276331 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.276264 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4c13a039-c44c-48a9-909c-99dd4f81e7d1" (UID: "4c13a039-c44c-48a9-909c-99dd4f81e7d1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:13:21.277970 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.277947 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c13a039-c44c-48a9-909c-99dd4f81e7d1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4c13a039-c44c-48a9-909c-99dd4f81e7d1" (UID: "4c13a039-c44c-48a9-909c-99dd4f81e7d1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:13:21.278060 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.277985 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kube-api-access-q8nq9" (OuterVolumeSpecName: "kube-api-access-q8nq9") pod "4c13a039-c44c-48a9-909c-99dd4f81e7d1" (UID: "4c13a039-c44c-48a9-909c-99dd4f81e7d1"). InnerVolumeSpecName "kube-api-access-q8nq9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:13:21.376480 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.376427 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c13a039-c44c-48a9-909c-99dd4f81e7d1-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:13:21.376480 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.376480 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8nq9\" (UniqueName: \"kubernetes.io/projected/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kube-api-access-q8nq9\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:13:21.376674 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.376492 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c13a039-c44c-48a9-909c-99dd4f81e7d1-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:13:21.376674 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.376502 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c13a039-c44c-48a9-909c-99dd4f81e7d1-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:13:21.488894 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.488861 2576 generic.go:358] "Generic (PLEG): container finished"
podID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerID="01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c" exitCode=0
Apr 23 18:13:21.489289 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.488947 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"
Apr 23 18:13:21.489289 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.488950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerDied","Data":"01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c"}
Apr 23 18:13:21.489289 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.488992 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n" event={"ID":"4c13a039-c44c-48a9-909c-99dd4f81e7d1","Type":"ContainerDied","Data":"78d660f3624dfb4239ac577cffab7368f7a1b0ee4b727d97c20cab433cbb02a0"}
Apr 23 18:13:21.489289 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.489010 2576 scope.go:117] "RemoveContainer" containerID="01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c"
Apr 23 18:13:21.498388 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.498369 2576 scope.go:117] "RemoveContainer" containerID="6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78"
Apr 23 18:13:21.506552 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.506537 2576 scope.go:117] "RemoveContainer" containerID="ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551"
Apr 23 18:13:21.509314 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.509289 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"]
Apr 23 18:13:21.515445 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.515419 2576 scope.go:117] "RemoveContainer" containerID="b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe"
Apr 23 18:13:21.515542 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.515502 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6c87ff55b4-bhg6n"]
Apr 23 18:13:21.525361 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.525343 2576 scope.go:117] "RemoveContainer" containerID="01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c"
Apr 23 18:13:21.525665 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:13:21.525646 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c\": container with ID starting with 01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c not found: ID does not exist" containerID="01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c"
Apr 23 18:13:21.525736 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.525673 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c"} err="failed to get container status \"01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c\": rpc error: code = NotFound desc = could not find container \"01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c\": container with ID starting with 01a63245f876a23dd4c86dc3f81764ca109de0c214ad5d854f7521c441731e0c not found: ID does not exist"
Apr 23 18:13:21.525736 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.525693 2576 scope.go:117] "RemoveContainer" containerID="6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78"
Apr 23 18:13:21.525953 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:13:21.525913 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78\": container with ID starting with 6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78 not found: ID does not exist" containerID="6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78"
Apr 23 18:13:21.525953 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.525938 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78"} err="failed to get container status \"6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78\": rpc error: code = NotFound desc = could not find container \"6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78\": container with ID starting with 6cb68adefc026551190bc931a9051abfa0d9b738a85d226206201c121913ab78 not found: ID does not exist"
Apr 23 18:13:21.526039 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.525955 2576 scope.go:117] "RemoveContainer" containerID="ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551"
Apr 23 18:13:21.526187 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:13:21.526170 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551\": container with ID starting with ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551 not found: ID does not exist" containerID="ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551"
Apr 23 18:13:21.526231 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.526189 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551"} err="failed to get container status \"ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551\": rpc error: code = NotFound desc = could not find container \"ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551\": container with ID starting with ad49e5cd196d09f25936c868d65c7403ffc73c5ea90c9cad08f47a965cf5d551 not found: ID does not exist"
Apr 23 18:13:21.526231 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.526199 2576 scope.go:117] "RemoveContainer" containerID="b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe"
Apr 23 18:13:21.526382 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:13:21.526363 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe\": container with ID starting with b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe not found: ID does not exist" containerID="b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe"
Apr 23 18:13:21.526424 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:21.526388 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe"} err="failed to get container status \"b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe\": rpc error: code = NotFound desc = could not find container \"b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe\": container with ID starting with b59bb12be31b82b7d43e64dc351a6ac9ed2b03691966bdc10ffa04729455b0fe not found: ID does not exist"
Apr 23 18:13:23.416420 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:23.416383 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" path="/var/lib/kubelet/pods/4c13a039-c44c-48a9-909c-99dd4f81e7d1/volumes"
Apr 23 18:13:30.440226 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:30.440186 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"
podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 23 18:13:31.349989 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:31.349960 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:13:31.353234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:31.353213 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:13:40.440754 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:40.440714 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 23 18:13:50.439946 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:13:50.439910 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 23 18:14:00.440770 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:00.440731 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 23 18:14:10.440588 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:10.440548 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 23 18:14:20.441056 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:20.441021 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"
Apr 23 18:14:21.149265 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.149235 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"]
Apr 23 18:14:21.149570 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.149543 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" containerID="cri-o://0136546367cf81d25c7fca86c8cc18c84b4fd41e59ac4dafafd7be11658d3d77" gracePeriod=30
Apr 23 18:14:21.149684 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.149577 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kube-rbac-proxy" containerID="cri-o://4d8109d4c527df70f24db3fbd7dc45ce18a84fc29b4444636395cd7c9026b261" gracePeriod=30
Apr 23 18:14:21.270273 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270246 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"]
Apr 23 18:14:21.270635 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270621 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" containerName="kube-rbac-proxy"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270638 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" containerName="kube-rbac-proxy"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270648 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" containerName="kserve-container"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270653 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" containerName="kserve-container"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270663 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="storage-initializer"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270668 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="storage-initializer"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270680 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270686 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270693 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270698 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270711 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container"
Apr 23 18:14:21.270717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270716 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container"
Apr 23 18:14:21.271047 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270771 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kube-rbac-proxy"
Apr 23 18:14:21.271047 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270780 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" containerName="kube-rbac-proxy"
Apr 23 18:14:21.271047 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270786 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3951c0b5-f95b-4db0-b1a5-795144405449" containerName="kserve-container"
Apr 23 18:14:21.271047 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270791 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="agent"
Apr 23 18:14:21.271047 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.270798 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c13a039-c44c-48a9-909c-99dd4f81e7d1" containerName="kserve-container"
Apr 23 18:14:21.274653 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.274635 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.276911 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.276888 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\""
Apr 23 18:14:21.277094 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.277008 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\""
Apr 23 18:14:21.283876 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.283855 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"]
Apr 23 18:14:21.380887 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.380809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.380887 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.380856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.380887 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.380880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpjfl\" (UniqueName: \"kubernetes.io/projected/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kube-api-access-tpjfl\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.381120 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.380928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.481662 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.481621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.482109 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.481694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.482109 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.481726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.482109 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.481758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpjfl\" (UniqueName: \"kubernetes.io/projected/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kube-api-access-tpjfl\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.482235 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.482133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.482413 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.482389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.484216 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.484193 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.497437 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.497413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpjfl\" (UniqueName: \"kubernetes.io/projected/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kube-api-access-tpjfl\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.586434 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.586391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:14:21.697673 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.697642 2576 generic.go:358] "Generic (PLEG): container finished" podID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerID="4d8109d4c527df70f24db3fbd7dc45ce18a84fc29b4444636395cd7c9026b261" exitCode=2
Apr 23 18:14:21.697823 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.697697 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" event={"ID":"815c3ab5-d87a-43b0-ae94-477e48b22583","Type":"ContainerDied","Data":"4d8109d4c527df70f24db3fbd7dc45ce18a84fc29b4444636395cd7c9026b261"}
Apr 23 18:14:21.713924 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:21.713902 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"]
Apr 23 18:14:21.715895 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:14:21.715869 2576 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94c4bdb6_8058_4d5c_ba3f_f3bbc24663b5.slice/crio-e9c9d06583df9c9c09960f3c997ae2f8925b42166008e6d03dceb0d8de0262ff WatchSource:0}: Error finding container e9c9d06583df9c9c09960f3c997ae2f8925b42166008e6d03dceb0d8de0262ff: Status 404 returned error can't find the container with id e9c9d06583df9c9c09960f3c997ae2f8925b42166008e6d03dceb0d8de0262ff Apr 23 18:14:22.702426 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:22.702390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" event={"ID":"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5","Type":"ContainerStarted","Data":"14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1"} Apr 23 18:14:22.702426 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:22.702427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" event={"ID":"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5","Type":"ContainerStarted","Data":"e9c9d06583df9c9c09960f3c997ae2f8925b42166008e6d03dceb0d8de0262ff"} Apr 23 18:14:25.436635 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:25.436600 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 23 18:14:25.713798 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:25.713706 2576 generic.go:358] "Generic (PLEG): container finished" podID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerID="14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1" exitCode=0 Apr 23 18:14:25.713798 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:25.713781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" event={"ID":"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5","Type":"ContainerDied","Data":"14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1"} Apr 23 18:14:26.719914 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.719888 2576 generic.go:358] "Generic (PLEG): container finished" podID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerID="0136546367cf81d25c7fca86c8cc18c84b4fd41e59ac4dafafd7be11658d3d77" exitCode=0 Apr 23 18:14:26.720299 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.719961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" event={"ID":"815c3ab5-d87a-43b0-ae94-477e48b22583","Type":"ContainerDied","Data":"0136546367cf81d25c7fca86c8cc18c84b4fd41e59ac4dafafd7be11658d3d77"} Apr 23 18:14:26.722035 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.722010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" event={"ID":"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5","Type":"ContainerStarted","Data":"97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f"} Apr 23 18:14:26.722139 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.722051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" event={"ID":"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5","Type":"ContainerStarted","Data":"c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d"} Apr 23 18:14:26.722383 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.722360 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" Apr 23 18:14:26.722511 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.722476 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" Apr 23 18:14:26.723562 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.723536 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 18:14:26.742085 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.742040 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podStartSLOduration=5.742025888 podStartE2EDuration="5.742025888s" podCreationTimestamp="2026-04-23 18:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:14:26.740710573 +0000 UTC m=+955.908037893" watchObservedRunningTime="2026-04-23 18:14:26.742025888 +0000 UTC m=+955.909353207" Apr 23 18:14:26.791342 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.791316 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:14:26.930191 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.930090 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815c3ab5-d87a-43b0-ae94-477e48b22583-proxy-tls\") pod \"815c3ab5-d87a-43b0-ae94-477e48b22583\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " Apr 23 18:14:26.930191 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.930161 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/815c3ab5-d87a-43b0-ae94-477e48b22583-kserve-provision-location\") pod \"815c3ab5-d87a-43b0-ae94-477e48b22583\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " Apr 23 18:14:26.930419 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.930194 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815c3ab5-d87a-43b0-ae94-477e48b22583-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"815c3ab5-d87a-43b0-ae94-477e48b22583\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " Apr 23 18:14:26.930419 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.930223 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p7t9\" (UniqueName: \"kubernetes.io/projected/815c3ab5-d87a-43b0-ae94-477e48b22583-kube-api-access-6p7t9\") pod \"815c3ab5-d87a-43b0-ae94-477e48b22583\" (UID: \"815c3ab5-d87a-43b0-ae94-477e48b22583\") " Apr 23 18:14:26.930561 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.930536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815c3ab5-d87a-43b0-ae94-477e48b22583-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "815c3ab5-d87a-43b0-ae94-477e48b22583" 
(UID: "815c3ab5-d87a-43b0-ae94-477e48b22583"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:14:26.930626 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.930571 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/815c3ab5-d87a-43b0-ae94-477e48b22583-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "815c3ab5-d87a-43b0-ae94-477e48b22583" (UID: "815c3ab5-d87a-43b0-ae94-477e48b22583"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:14:26.932533 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.932508 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815c3ab5-d87a-43b0-ae94-477e48b22583-kube-api-access-6p7t9" (OuterVolumeSpecName: "kube-api-access-6p7t9") pod "815c3ab5-d87a-43b0-ae94-477e48b22583" (UID: "815c3ab5-d87a-43b0-ae94-477e48b22583"). InnerVolumeSpecName "kube-api-access-6p7t9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:14:26.932533 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:26.932512 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815c3ab5-d87a-43b0-ae94-477e48b22583-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "815c3ab5-d87a-43b0-ae94-477e48b22583" (UID: "815c3ab5-d87a-43b0-ae94-477e48b22583"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:14:27.031104 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.031055 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/815c3ab5-d87a-43b0-ae94-477e48b22583-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:14:27.031104 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.031100 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815c3ab5-d87a-43b0-ae94-477e48b22583-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:14:27.031104 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.031113 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6p7t9\" (UniqueName: \"kubernetes.io/projected/815c3ab5-d87a-43b0-ae94-477e48b22583-kube-api-access-6p7t9\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:14:27.031104 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.031122 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815c3ab5-d87a-43b0-ae94-477e48b22583-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:14:27.727250 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.727159 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" event={"ID":"815c3ab5-d87a-43b0-ae94-477e48b22583","Type":"ContainerDied","Data":"fe253ac85ab653daebece407a0eb49f1e33e2916c6500c2e6363b617324867cf"} Apr 23 18:14:27.727250 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.727222 2576 scope.go:117] "RemoveContainer" containerID="4d8109d4c527df70f24db3fbd7dc45ce18a84fc29b4444636395cd7c9026b261" Apr 23 18:14:27.727250 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:14:27.727241 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp" Apr 23 18:14:27.727827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.727507 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 18:14:27.735654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.735637 2576 scope.go:117] "RemoveContainer" containerID="0136546367cf81d25c7fca86c8cc18c84b4fd41e59ac4dafafd7be11658d3d77" Apr 23 18:14:27.743036 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.743013 2576 scope.go:117] "RemoveContainer" containerID="428dd0bd663f245d4ba344d72968f4c55ade6608683a6ba2039b360a749041f1" Apr 23 18:14:27.745935 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.745916 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"] Apr 23 18:14:27.753593 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:27.753572 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-j4tlp"] Apr 23 18:14:29.415097 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:29.415065 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" path="/var/lib/kubelet/pods/815c3ab5-d87a-43b0-ae94-477e48b22583/volumes" Apr 23 18:14:32.731743 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:32.731716 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" Apr 23 18:14:32.732329 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:32.732303 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 18:14:42.732573 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:42.732485 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 18:14:52.733188 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:14:52.733143 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 18:15:02.732721 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:02.732680 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 18:15:12.732958 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:12.732913 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 18:15:22.732663 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:22.732622 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" 
podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 18:15:32.733275 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:32.733232 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 23 18:15:42.733212 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:42.733177 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" Apr 23 18:15:51.934117 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:51.934084 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"] Apr 23 18:15:51.934555 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:51.934502 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" containerID="cri-o://c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d" gracePeriod=30 Apr 23 18:15:51.934627 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:51.934550 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kube-rbac-proxy" containerID="cri-o://97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f" gracePeriod=30 Apr 23 18:15:52.065522 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.065490 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"] Apr 23 18:15:52.065899 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.065886 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="storage-initializer" Apr 23 18:15:52.065940 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.065902 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="storage-initializer" Apr 23 18:15:52.065940 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.065927 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kube-rbac-proxy" Apr 23 18:15:52.065940 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.065932 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kube-rbac-proxy" Apr 23 18:15:52.066040 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.065943 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" Apr 23 18:15:52.066040 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.065948 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" Apr 23 18:15:52.066040 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.066001 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kserve-container" Apr 23 18:15:52.066040 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.066015 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="815c3ab5-d87a-43b0-ae94-477e48b22583" containerName="kube-rbac-proxy" Apr 23 18:15:52.069222 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.069200 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.073489 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.073337 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 23 18:15:52.073648 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.073610 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:15:52.083716 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.083686 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"] Apr 23 18:15:52.117408 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.117370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9abfca8-1035-4cc4-8319-47dd66058adf-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.117408 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.117419 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkzpg\" (UniqueName: \"kubernetes.io/projected/c9abfca8-1035-4cc4-8319-47dd66058adf-kube-api-access-bkzpg\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.117754 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.117484 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9abfca8-1035-4cc4-8319-47dd66058adf-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.117754 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.117509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9abfca8-1035-4cc4-8319-47dd66058adf-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.218378 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.218280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9abfca8-1035-4cc4-8319-47dd66058adf-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.218378 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.218323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkzpg\" (UniqueName: \"kubernetes.io/projected/c9abfca8-1035-4cc4-8319-47dd66058adf-kube-api-access-bkzpg\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.218378 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.218368 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9abfca8-1035-4cc4-8319-47dd66058adf-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.218694 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.218409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9abfca8-1035-4cc4-8319-47dd66058adf-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.218694 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:15:52.218605 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-serving-cert: secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 23 18:15:52.218694 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:15:52.218676 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9abfca8-1035-4cc4-8319-47dd66058adf-proxy-tls podName:c9abfca8-1035-4cc4-8319-47dd66058adf nodeName:}" failed. No retries permitted until 2026-04-23 18:15:52.718656434 +0000 UTC m=+1041.885983732 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c9abfca8-1035-4cc4-8319-47dd66058adf-proxy-tls") pod "isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" (UID: "c9abfca8-1035-4cc4-8319-47dd66058adf") : secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 23 18:15:52.218853 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.218806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9abfca8-1035-4cc4-8319-47dd66058adf-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.219029 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.219000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9abfca8-1035-4cc4-8319-47dd66058adf-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.232019 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.231997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkzpg\" (UniqueName: \"kubernetes.io/projected/c9abfca8-1035-4cc4-8319-47dd66058adf-kube-api-access-bkzpg\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" Apr 23 18:15:52.721520 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.721484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c9abfca8-1035-4cc4-8319-47dd66058adf-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"
Apr 23 18:15:52.723983 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.723961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9abfca8-1035-4cc4-8319-47dd66058adf-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"
Apr 23 18:15:52.728176 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.728137 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused"
Apr 23 18:15:52.732522 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.732498 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 23 18:15:52.980166 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:52.980077 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"
Apr 23 18:15:53.031648 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:53.031611 2576 generic.go:358] "Generic (PLEG): container finished" podID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerID="97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f" exitCode=2
Apr 23 18:15:53.031829 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:53.031697 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" event={"ID":"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5","Type":"ContainerDied","Data":"97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f"}
Apr 23 18:15:53.113362 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:53.113328 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"]
Apr 23 18:15:53.116557 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:15:53.116514 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9abfca8_1035_4cc4_8319_47dd66058adf.slice/crio-5b28ccbce183fb8bc20a0c7acb64492aa3b44c613a5cdf5d57539d9d0d21fada WatchSource:0}: Error finding container 5b28ccbce183fb8bc20a0c7acb64492aa3b44c613a5cdf5d57539d9d0d21fada: Status 404 returned error can't find the container with id 5b28ccbce183fb8bc20a0c7acb64492aa3b44c613a5cdf5d57539d9d0d21fada
Apr 23 18:15:53.118339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:53.118322 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:15:54.036529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:54.036492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" event={"ID":"c9abfca8-1035-4cc4-8319-47dd66058adf","Type":"ContainerStarted","Data":"1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d"}
Apr 23 18:15:54.036529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:54.036528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" event={"ID":"c9abfca8-1035-4cc4-8319-47dd66058adf","Type":"ContainerStarted","Data":"5b28ccbce183fb8bc20a0c7acb64492aa3b44c613a5cdf5d57539d9d0d21fada"}
Apr 23 18:15:57.047562 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.047520 2576 generic.go:358] "Generic (PLEG): container finished" podID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerID="1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d" exitCode=0
Apr 23 18:15:57.047962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.047593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" event={"ID":"c9abfca8-1035-4cc4-8319-47dd66058adf","Type":"ContainerDied","Data":"1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d"}
Apr 23 18:15:57.384043 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.384020 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:15:57.468784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.468670 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kserve-provision-location\") pod \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") "
Apr 23 18:15:57.468784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.468722 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-proxy-tls\") pod \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") "
Apr 23 18:15:57.469124 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.468784 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") "
Apr 23 18:15:57.469124 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.468814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpjfl\" (UniqueName: \"kubernetes.io/projected/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kube-api-access-tpjfl\") pod \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\" (UID: \"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5\") "
Apr 23 18:15:57.469843 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.469792 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" (UID: "94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:15:57.470071 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.470043 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" (UID: "94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:15:57.472097 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.472061 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" (UID: "94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:15:57.473646 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.473571 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kube-api-access-tpjfl" (OuterVolumeSpecName: "kube-api-access-tpjfl") pod "94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" (UID: "94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5"). InnerVolumeSpecName "kube-api-access-tpjfl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:15:57.570639 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.570539 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:15:57.570639 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.570572 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:15:57.570639 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.570590 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:15:57.570639 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:57.570605 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tpjfl\" (UniqueName: \"kubernetes.io/projected/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5-kube-api-access-tpjfl\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:15:58.057280 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.057238 2576 generic.go:358] "Generic (PLEG): container finished" podID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerID="c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d" exitCode=0
Apr 23 18:15:58.057770 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.057319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" event={"ID":"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5","Type":"ContainerDied","Data":"c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d"}
Apr 23 18:15:58.057770 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.057351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd" event={"ID":"94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5","Type":"ContainerDied","Data":"e9c9d06583df9c9c09960f3c997ae2f8925b42166008e6d03dceb0d8de0262ff"}
Apr 23 18:15:58.057770 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.057371 2576 scope.go:117] "RemoveContainer" containerID="97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f"
Apr 23 18:15:58.057770 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.057645 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"
Apr 23 18:15:58.075643 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.075621 2576 scope.go:117] "RemoveContainer" containerID="c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d"
Apr 23 18:15:58.089040 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.088995 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"]
Apr 23 18:15:58.094492 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.094413 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-f9ctd"]
Apr 23 18:15:58.097620 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.097229 2576 scope.go:117] "RemoveContainer" containerID="14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1"
Apr 23 18:15:58.109958 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.109936 2576 scope.go:117] "RemoveContainer" containerID="97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f"
Apr 23 18:15:58.110624 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:15:58.110557 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f\": container with ID starting with 97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f not found: ID does not exist" containerID="97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f"
Apr 23 18:15:58.110624 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.110600 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f"} err="failed to get container status \"97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f\": rpc error: code = NotFound desc = could not find container \"97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f\": container with ID starting with 97e7e3e86b8c31156ff996ad5577363cded38e04649c18ea072e092985e1346f not found: ID does not exist"
Apr 23 18:15:58.110775 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.110627 2576 scope.go:117] "RemoveContainer" containerID="c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d"
Apr 23 18:15:58.110961 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:15:58.110938 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d\": container with ID starting with c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d not found: ID does not exist" containerID="c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d"
Apr 23 18:15:58.111028 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.110972 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d"} err="failed to get container status \"c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d\": rpc error: code = NotFound desc = could not find container \"c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d\": container with ID starting with c14c0b4f47ec45efb3b9977943cd908dde536300ada4487bd7868ca1caca301d not found: ID does not exist"
Apr 23 18:15:58.111028 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.110995 2576 scope.go:117] "RemoveContainer" containerID="14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1"
Apr 23 18:15:58.111319 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:15:58.111292 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1\": container with ID starting with 14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1 not found: ID does not exist" containerID="14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1"
Apr 23 18:15:58.111385 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:58.111328 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1"} err="failed to get container status \"14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1\": rpc error: code = NotFound desc = could not find container \"14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1\": container with ID starting with 14a4c49955387c23f189b55189b756fb3018b19ed38e84c3b2449275572c3fe1 not found: ID does not exist"
Apr 23 18:15:59.419756 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:15:59.419718 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" path="/var/lib/kubelet/pods/94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5/volumes"
Apr 23 18:18:00.555149 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:00.555058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" event={"ID":"c9abfca8-1035-4cc4-8319-47dd66058adf","Type":"ContainerStarted","Data":"25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f"}
Apr 23 18:18:00.555149 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:00.555100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" event={"ID":"c9abfca8-1035-4cc4-8319-47dd66058adf","Type":"ContainerStarted","Data":"43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf"}
Apr 23 18:18:00.555594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:00.555168 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"
Apr 23 18:18:00.555594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:00.555291 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"
Apr 23 18:18:00.586308 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:00.586239 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" podStartSLOduration=5.368516021 podStartE2EDuration="2m8.586190265s" podCreationTimestamp="2026-04-23 18:15:52 +0000 UTC" firstStartedPulling="2026-04-23 18:15:57.0488111 +0000 UTC m=+1046.216138400" lastFinishedPulling="2026-04-23 18:18:00.266485344 +0000 UTC m=+1169.433812644" observedRunningTime="2026-04-23 18:18:00.584729838 +0000 UTC m=+1169.752057157" watchObservedRunningTime="2026-04-23 18:18:00.586190265 +0000 UTC m=+1169.753517585"
Apr 23 18:18:06.570184 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:06.570154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"
Apr 23 18:18:31.383689 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:31.383657 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:18:31.384233 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:31.383663 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:18:36.574150 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:36.574118 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"
Apr 23 18:18:42.247669 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.247636 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"]
Apr 23 18:18:42.249242 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.248035 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="kserve-container" containerID="cri-o://43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf" gracePeriod=30
Apr 23 18:18:42.249242 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.248180 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="kube-rbac-proxy" containerID="cri-o://25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f" gracePeriod=30
Apr 23 18:18:42.347975 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.347931 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"]
Apr 23 18:18:42.348412 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.348393 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="storage-initializer"
Apr 23 18:18:42.348529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.348414 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="storage-initializer"
Apr 23 18:18:42.348529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.348433 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container"
Apr 23 18:18:42.348529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.348441 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container"
Apr 23 18:18:42.348529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.348453 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kube-rbac-proxy"
Apr 23 18:18:42.348529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.348480 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kube-rbac-proxy"
Apr 23 18:18:42.348793 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.348578 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kserve-container"
Apr 23 18:18:42.348793 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.348593 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="94c4bdb6-8058-4d5c-ba3f-f3bbc24663b5" containerName="kube-rbac-proxy"
Apr 23 18:18:42.350895 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.350871 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.353046 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.353011 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\""
Apr 23 18:18:42.353234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.353019 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 23 18:18:42.362711 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.362589 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"]
Apr 23 18:18:42.390095 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.390066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.390258 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.390116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4r7p\" (UniqueName: \"kubernetes.io/projected/9207e288-1692-44d0-8c9d-e21ac58a2087-kube-api-access-s4r7p\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.390258 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.390174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9207e288-1692-44d0-8c9d-e21ac58a2087-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.390258 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.390196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9207e288-1692-44d0-8c9d-e21ac58a2087-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.490621 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.490578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.490823 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.490641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4r7p\" (UniqueName: \"kubernetes.io/projected/9207e288-1692-44d0-8c9d-e21ac58a2087-kube-api-access-s4r7p\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.490823 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.490682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9207e288-1692-44d0-8c9d-e21ac58a2087-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.490823 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.490715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9207e288-1692-44d0-8c9d-e21ac58a2087-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.490823 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:18:42.490727 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-serving-cert: secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found
Apr 23 18:18:42.490823 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:18:42.490801 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls podName:9207e288-1692-44d0-8c9d-e21ac58a2087 nodeName:}" failed. No retries permitted until 2026-04-23 18:18:42.990778972 +0000 UTC m=+1212.158106271 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls") pod "isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" (UID: "9207e288-1692-44d0-8c9d-e21ac58a2087") : secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found
Apr 23 18:18:42.491165 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.491145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9207e288-1692-44d0-8c9d-e21ac58a2087-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.491335 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.491318 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9207e288-1692-44d0-8c9d-e21ac58a2087-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.503256 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.503186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4r7p\" (UniqueName: \"kubernetes.io/projected/9207e288-1692-44d0-8c9d-e21ac58a2087-kube-api-access-s4r7p\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.702907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.702871 2576 generic.go:358] "Generic (PLEG): container finished" podID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerID="25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f" exitCode=2
Apr 23 18:18:42.703094 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.702945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" event={"ID":"c9abfca8-1035-4cc4-8319-47dd66058adf","Type":"ContainerDied","Data":"25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f"}
Apr 23 18:18:42.993963 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:42.993919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"
Apr 23 18:18:42.994182 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:18:42.994068 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-serving-cert: secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found
Apr 23 18:18:42.994182 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:18:42.994139 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls podName:9207e288-1692-44d0-8c9d-e21ac58a2087 nodeName:}" failed. No retries permitted until 2026-04-23 18:18:43.994123773 +0000 UTC m=+1213.161451070 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls") pod "isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" (UID: "9207e288-1692-44d0-8c9d-e21ac58a2087") : secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found
Apr 23 18:18:43.299681 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.299659 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"
Apr 23 18:18:43.397352 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.397311 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9abfca8-1035-4cc4-8319-47dd66058adf-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"c9abfca8-1035-4cc4-8319-47dd66058adf\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") "
Apr 23 18:18:43.397547 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.397377 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9abfca8-1035-4cc4-8319-47dd66058adf-kserve-provision-location\") pod \"c9abfca8-1035-4cc4-8319-47dd66058adf\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") "
Apr 23 18:18:43.397547 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.397515 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkzpg\" (UniqueName: \"kubernetes.io/projected/c9abfca8-1035-4cc4-8319-47dd66058adf-kube-api-access-bkzpg\") pod \"c9abfca8-1035-4cc4-8319-47dd66058adf\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") "
Apr 23 18:18:43.397637 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.397567 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9abfca8-1035-4cc4-8319-47dd66058adf-proxy-tls\") pod \"c9abfca8-1035-4cc4-8319-47dd66058adf\" (UID: \"c9abfca8-1035-4cc4-8319-47dd66058adf\") "
Apr 23 18:18:43.397747 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.397724 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9abfca8-1035-4cc4-8319-47dd66058adf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c9abfca8-1035-4cc4-8319-47dd66058adf" (UID: "c9abfca8-1035-4cc4-8319-47dd66058adf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:18:43.397784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.397744 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9abfca8-1035-4cc4-8319-47dd66058adf-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "c9abfca8-1035-4cc4-8319-47dd66058adf" (UID: "c9abfca8-1035-4cc4-8319-47dd66058adf"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:18:43.399775 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.399745 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9abfca8-1035-4cc4-8319-47dd66058adf-kube-api-access-bkzpg" (OuterVolumeSpecName: "kube-api-access-bkzpg") pod "c9abfca8-1035-4cc4-8319-47dd66058adf" (UID: "c9abfca8-1035-4cc4-8319-47dd66058adf"). InnerVolumeSpecName "kube-api-access-bkzpg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:18:43.399884 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.399791 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9abfca8-1035-4cc4-8319-47dd66058adf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c9abfca8-1035-4cc4-8319-47dd66058adf" (UID: "c9abfca8-1035-4cc4-8319-47dd66058adf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:18:43.498976 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.498936 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9abfca8-1035-4cc4-8319-47dd66058adf-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:18:43.498976 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.498969 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9abfca8-1035-4cc4-8319-47dd66058adf-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:18:43.498976 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.498979 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9abfca8-1035-4cc4-8319-47dd66058adf-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:18:43.499212 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.498989 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkzpg\" (UniqueName: \"kubernetes.io/projected/c9abfca8-1035-4cc4-8319-47dd66058adf-kube-api-access-bkzpg\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:18:43.708085 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.708052 2576 generic.go:358] "Generic (PLEG): container finished" podID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerID="43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf" exitCode=0
Apr 23 18:18:43.708276 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.708137 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"
Apr 23 18:18:43.708276 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.708148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" event={"ID":"c9abfca8-1035-4cc4-8319-47dd66058adf","Type":"ContainerDied","Data":"43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf"}
Apr 23 18:18:43.708276 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.708188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6" event={"ID":"c9abfca8-1035-4cc4-8319-47dd66058adf","Type":"ContainerDied","Data":"5b28ccbce183fb8bc20a0c7acb64492aa3b44c613a5cdf5d57539d9d0d21fada"}
Apr 23 18:18:43.708276 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.708205 2576 scope.go:117] "RemoveContainer" containerID="25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f"
Apr 23 18:18:43.716595 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.716575 2576 scope.go:117] "RemoveContainer" containerID="43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf"
Apr 23 18:18:43.723659 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.723643 2576 scope.go:117] "RemoveContainer" containerID="1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d"
Apr 23 18:18:43.726689 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.726667 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"]
Apr 23 18:18:43.731203 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.731184 2576 scope.go:117] "RemoveContainer" containerID="25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f"
Apr 23 18:18:43.731474 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:18:43.731438 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f\": container with ID starting with 25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f not found: ID does not exist" containerID="25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f"
Apr 23 18:18:43.731537 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.731485 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-lslm6"]
Apr 23 18:18:43.731537 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.731488 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f"} err="failed to get container status \"25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f\": rpc error: code = NotFound desc = could not find container \"25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f\": container with ID starting with 25a4f809e9b74f75142441e64262e1a3f762d87a299dbf27dae77d45467e642f not found: ID does not exist"
Apr 23 18:18:43.731537 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.731507 2576 scope.go:117] "RemoveContainer" containerID="43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf"
Apr 23 18:18:43.731789 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:18:43.731772 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf\": container with ID starting with 43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf not found: ID does not exist" containerID="43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf"
Apr 23 18:18:43.731865 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.731794 2576 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf"} err="failed to get container status \"43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf\": rpc error: code = NotFound desc = could not find container \"43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf\": container with ID starting with 43d302a10dda91cafdd43be796a26f60e1da981dabbc815b7ebb11360115bdcf not found: ID does not exist" Apr 23 18:18:43.731865 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.731810 2576 scope.go:117] "RemoveContainer" containerID="1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d" Apr 23 18:18:43.732101 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:18:43.732027 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d\": container with ID starting with 1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d not found: ID does not exist" containerID="1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d" Apr 23 18:18:43.732166 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:43.732112 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d"} err="failed to get container status \"1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d\": rpc error: code = NotFound desc = could not find container \"1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d\": container with ID starting with 1786e900b65d1fb84225529da50f0897644c36f83a5e2f2fa5b5d7695cb31a3d not found: ID does not exist" Apr 23 18:18:44.005125 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:44.005033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" Apr 23 18:18:44.007717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:44.007686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" Apr 23 18:18:44.163821 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:44.163778 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" Apr 23 18:18:44.288325 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:44.287958 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"] Apr 23 18:18:44.291054 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:18:44.291025 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9207e288_1692_44d0_8c9d_e21ac58a2087.slice/crio-dd818cfd7b6327ca59d12a97bf8c2024e6c2fb1f4082c1039d74e3041af84b13 WatchSource:0}: Error finding container dd818cfd7b6327ca59d12a97bf8c2024e6c2fb1f4082c1039d74e3041af84b13: Status 404 returned error can't find the container with id dd818cfd7b6327ca59d12a97bf8c2024e6c2fb1f4082c1039d74e3041af84b13 Apr 23 18:18:44.714500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:44.714435 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" 
event={"ID":"9207e288-1692-44d0-8c9d-e21ac58a2087","Type":"ContainerStarted","Data":"38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba"} Apr 23 18:18:44.714500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:44.714503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" event={"ID":"9207e288-1692-44d0-8c9d-e21ac58a2087","Type":"ContainerStarted","Data":"dd818cfd7b6327ca59d12a97bf8c2024e6c2fb1f4082c1039d74e3041af84b13"} Apr 23 18:18:45.415441 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:45.415408 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" path="/var/lib/kubelet/pods/c9abfca8-1035-4cc4-8319-47dd66058adf/volumes" Apr 23 18:18:48.729687 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:48.729649 2576 generic.go:358] "Generic (PLEG): container finished" podID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerID="38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba" exitCode=0 Apr 23 18:18:48.730098 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:48.729692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" event={"ID":"9207e288-1692-44d0-8c9d-e21ac58a2087","Type":"ContainerDied","Data":"38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba"} Apr 23 18:18:49.735974 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:49.735934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" event={"ID":"9207e288-1692-44d0-8c9d-e21ac58a2087","Type":"ContainerStarted","Data":"3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9"} Apr 23 18:18:49.735974 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:49.735974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" 
event={"ID":"9207e288-1692-44d0-8c9d-e21ac58a2087","Type":"ContainerStarted","Data":"d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65"} Apr 23 18:18:49.736424 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:49.736259 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" Apr 23 18:18:49.736424 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:49.736395 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" Apr 23 18:18:49.737652 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:49.737627 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 18:18:49.755610 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:49.755565 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" podStartSLOduration=7.755552373 podStartE2EDuration="7.755552373s" podCreationTimestamp="2026-04-23 18:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:18:49.753542176 +0000 UTC m=+1218.920869495" watchObservedRunningTime="2026-04-23 18:18:49.755552373 +0000 UTC m=+1218.922879692" Apr 23 18:18:50.740283 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:50.740244 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" 
Apr 23 18:18:55.744968 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:55.744930 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" Apr 23 18:18:55.745620 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:18:55.745590 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 23 18:19:05.746635 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:05.746602 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" Apr 23 18:19:12.403097 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.403012 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"] Apr 23 18:19:12.403555 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.403363 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kserve-container" containerID="cri-o://d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65" gracePeriod=30 Apr 23 18:19:12.403555 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.403397 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kube-rbac-proxy" containerID="cri-o://3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9" gracePeriod=30 Apr 23 18:19:12.505518 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.505455 2576 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4"] Apr 23 18:19:12.505915 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.505900 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="storage-initializer" Apr 23 18:19:12.505915 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.505917 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="storage-initializer" Apr 23 18:19:12.506013 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.505930 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="kube-rbac-proxy" Apr 23 18:19:12.506013 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.505936 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="kube-rbac-proxy" Apr 23 18:19:12.506013 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.505946 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="kserve-container" Apr 23 18:19:12.506013 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.505952 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="kserve-container" Apr 23 18:19:12.506013 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.506013 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="kube-rbac-proxy" Apr 23 18:19:12.506164 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.506020 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9abfca8-1035-4cc4-8319-47dd66058adf" containerName="kserve-container" Apr 23 18:19:12.508156 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.508133 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.510824 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.510803 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 23 18:19:12.511014 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.510994 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:19:12.521346 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.521325 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4"] Apr 23 18:19:12.659572 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.659455 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ea1c030-692c-4cc6-8a41-0925c141a190-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.659572 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.659550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ea1c030-692c-4cc6-8a41-0925c141a190-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.659572 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.659575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hlv5f\" (UniqueName: \"kubernetes.io/projected/0ea1c030-692c-4cc6-8a41-0925c141a190-kube-api-access-hlv5f\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.659855 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.659608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ea1c030-692c-4cc6-8a41-0925c141a190-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.760529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.760490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ea1c030-692c-4cc6-8a41-0925c141a190-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.760720 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.760547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ea1c030-692c-4cc6-8a41-0925c141a190-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.760720 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.760601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/0ea1c030-692c-4cc6-8a41-0925c141a190-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.760720 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.760624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlv5f\" (UniqueName: \"kubernetes.io/projected/0ea1c030-692c-4cc6-8a41-0925c141a190-kube-api-access-hlv5f\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.760931 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.760909 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ea1c030-692c-4cc6-8a41-0925c141a190-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.761276 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.761258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ea1c030-692c-4cc6-8a41-0925c141a190-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.763252 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.763227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0ea1c030-692c-4cc6-8a41-0925c141a190-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.769125 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.769100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlv5f\" (UniqueName: \"kubernetes.io/projected/0ea1c030-692c-4cc6-8a41-0925c141a190-kube-api-access-hlv5f\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.819118 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.819084 2576 generic.go:358] "Generic (PLEG): container finished" podID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerID="3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9" exitCode=2 Apr 23 18:19:12.819302 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.819139 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:12.819302 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.819168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" event={"ID":"9207e288-1692-44d0-8c9d-e21ac58a2087","Type":"ContainerDied","Data":"3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9"} Apr 23 18:19:12.963435 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:12.963405 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4"] Apr 23 18:19:12.965549 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:19:12.965519 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ea1c030_692c_4cc6_8a41_0925c141a190.slice/crio-7a05401b109e7277abf91ba1f336d531279b6a6cfaf854d8f2d71c8bfac5f1b8 WatchSource:0}: Error finding container 7a05401b109e7277abf91ba1f336d531279b6a6cfaf854d8f2d71c8bfac5f1b8: Status 404 returned error can't find the container with id 7a05401b109e7277abf91ba1f336d531279b6a6cfaf854d8f2d71c8bfac5f1b8 Apr 23 18:19:13.044606 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.044576 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" Apr 23 18:19:13.163642 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.163603 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9207e288-1692-44d0-8c9d-e21ac58a2087-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"9207e288-1692-44d0-8c9d-e21ac58a2087\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " Apr 23 18:19:13.163836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.163655 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4r7p\" (UniqueName: \"kubernetes.io/projected/9207e288-1692-44d0-8c9d-e21ac58a2087-kube-api-access-s4r7p\") pod \"9207e288-1692-44d0-8c9d-e21ac58a2087\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " Apr 23 18:19:13.163836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.163687 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls\") pod \"9207e288-1692-44d0-8c9d-e21ac58a2087\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " Apr 23 18:19:13.163836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.163738 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9207e288-1692-44d0-8c9d-e21ac58a2087-kserve-provision-location\") pod \"9207e288-1692-44d0-8c9d-e21ac58a2087\" (UID: \"9207e288-1692-44d0-8c9d-e21ac58a2087\") " Apr 23 18:19:13.164046 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.163969 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9207e288-1692-44d0-8c9d-e21ac58a2087-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "9207e288-1692-44d0-8c9d-e21ac58a2087" (UID: "9207e288-1692-44d0-8c9d-e21ac58a2087"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:19:13.164125 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.164104 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9207e288-1692-44d0-8c9d-e21ac58a2087-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9207e288-1692-44d0-8c9d-e21ac58a2087" (UID: "9207e288-1692-44d0-8c9d-e21ac58a2087"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:19:13.166101 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.166035 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9207e288-1692-44d0-8c9d-e21ac58a2087" (UID: "9207e288-1692-44d0-8c9d-e21ac58a2087"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:19:13.166101 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.166034 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9207e288-1692-44d0-8c9d-e21ac58a2087-kube-api-access-s4r7p" (OuterVolumeSpecName: "kube-api-access-s4r7p") pod "9207e288-1692-44d0-8c9d-e21ac58a2087" (UID: "9207e288-1692-44d0-8c9d-e21ac58a2087"). InnerVolumeSpecName "kube-api-access-s4r7p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:19:13.264938 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.264900 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s4r7p\" (UniqueName: \"kubernetes.io/projected/9207e288-1692-44d0-8c9d-e21ac58a2087-kube-api-access-s4r7p\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:19:13.264938 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.264933 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9207e288-1692-44d0-8c9d-e21ac58a2087-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:19:13.264938 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.264944 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9207e288-1692-44d0-8c9d-e21ac58a2087-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:19:13.265188 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.264953 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9207e288-1692-44d0-8c9d-e21ac58a2087-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:19:13.824016 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.823976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" event={"ID":"0ea1c030-692c-4cc6-8a41-0925c141a190","Type":"ContainerStarted","Data":"32758aaed502724d4fbd4f8ca76e7350f7ae0603b331f85f7ce2603306d13904"} Apr 23 18:19:13.824016 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.824016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" 
event={"ID":"0ea1c030-692c-4cc6-8a41-0925c141a190","Type":"ContainerStarted","Data":"7a05401b109e7277abf91ba1f336d531279b6a6cfaf854d8f2d71c8bfac5f1b8"} Apr 23 18:19:13.825684 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.825658 2576 generic.go:358] "Generic (PLEG): container finished" podID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerID="d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65" exitCode=0 Apr 23 18:19:13.825786 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.825725 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" Apr 23 18:19:13.825786 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.825746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" event={"ID":"9207e288-1692-44d0-8c9d-e21ac58a2087","Type":"ContainerDied","Data":"d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65"} Apr 23 18:19:13.825786 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.825783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl" event={"ID":"9207e288-1692-44d0-8c9d-e21ac58a2087","Type":"ContainerDied","Data":"dd818cfd7b6327ca59d12a97bf8c2024e6c2fb1f4082c1039d74e3041af84b13"} Apr 23 18:19:13.825888 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.825798 2576 scope.go:117] "RemoveContainer" containerID="3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9" Apr 23 18:19:13.836961 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.836939 2576 scope.go:117] "RemoveContainer" containerID="d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65" Apr 23 18:19:13.845000 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.844981 2576 scope.go:117] "RemoveContainer" containerID="38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba" Apr 23 
18:19:13.852076 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.852058 2576 scope.go:117] "RemoveContainer" containerID="3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9" Apr 23 18:19:13.852310 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:19:13.852294 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9\": container with ID starting with 3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9 not found: ID does not exist" containerID="3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9" Apr 23 18:19:13.852353 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.852318 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9"} err="failed to get container status \"3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9\": rpc error: code = NotFound desc = could not find container \"3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9\": container with ID starting with 3762c89b58ac8fea4596c9024b45129fa5d8e4c2e9d577a46628f052285958d9 not found: ID does not exist" Apr 23 18:19:13.852353 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.852335 2576 scope.go:117] "RemoveContainer" containerID="d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65" Apr 23 18:19:13.852574 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:19:13.852556 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65\": container with ID starting with d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65 not found: ID does not exist" containerID="d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65" Apr 23 18:19:13.852625 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.852582 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65"} err="failed to get container status \"d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65\": rpc error: code = NotFound desc = could not find container \"d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65\": container with ID starting with d88251670bb95804aa1eb0e7794696cb44c0c9974c6af476c0308f2787c8ba65 not found: ID does not exist" Apr 23 18:19:13.852625 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.852605 2576 scope.go:117] "RemoveContainer" containerID="38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba" Apr 23 18:19:13.852809 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:19:13.852792 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba\": container with ID starting with 38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba not found: ID does not exist" containerID="38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba" Apr 23 18:19:13.852852 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.852813 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba"} err="failed to get container status \"38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba\": rpc error: code = NotFound desc = could not find container \"38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba\": container with ID starting with 38ab21e7642a988fbeb88ff60631f1b182ff889f1a5aaffbb0831fbcaa2464ba not found: ID does not exist" Apr 23 18:19:13.856318 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.856297 2576 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"] Apr 23 18:19:13.857805 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:13.857788 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-4vxbl"] Apr 23 18:19:15.415221 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:15.415180 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" path="/var/lib/kubelet/pods/9207e288-1692-44d0-8c9d-e21ac58a2087/volumes" Apr 23 18:19:17.840699 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:17.840667 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerID="32758aaed502724d4fbd4f8ca76e7350f7ae0603b331f85f7ce2603306d13904" exitCode=0 Apr 23 18:19:17.841186 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:17.840743 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" event={"ID":"0ea1c030-692c-4cc6-8a41-0925c141a190","Type":"ContainerDied","Data":"32758aaed502724d4fbd4f8ca76e7350f7ae0603b331f85f7ce2603306d13904"} Apr 23 18:19:18.846151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:18.846112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" event={"ID":"0ea1c030-692c-4cc6-8a41-0925c141a190","Type":"ContainerStarted","Data":"f03758f10baa16e8159e209d832ac433fb8d02a6410f9cebe791bce155132bcb"} Apr 23 18:19:18.846151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:18.846154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" event={"ID":"0ea1c030-692c-4cc6-8a41-0925c141a190","Type":"ContainerStarted","Data":"0ad8152585e61fbc4f74ecad920cbca5eef7c4d31b3ea0c458dba83b44de3f2c"} Apr 23 18:19:18.846726 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:19:18.846359 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:18.846726 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:18.846389 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:18.867225 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:18.867166 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" podStartSLOduration=6.867153051 podStartE2EDuration="6.867153051s" podCreationTimestamp="2026-04-23 18:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:19:18.865962812 +0000 UTC m=+1248.033290155" watchObservedRunningTime="2026-04-23 18:19:18.867153051 +0000 UTC m=+1248.034480370" Apr 23 18:19:24.855056 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:24.855029 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:19:54.858544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:19:54.858502 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:20:02.593064 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.593030 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4"] Apr 23 18:20:02.593500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.593336 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" 
podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kserve-container" containerID="cri-o://0ad8152585e61fbc4f74ecad920cbca5eef7c4d31b3ea0c458dba83b44de3f2c" gracePeriod=30 Apr 23 18:20:02.593500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.593405 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kube-rbac-proxy" containerID="cri-o://f03758f10baa16e8159e209d832ac433fb8d02a6410f9cebe791bce155132bcb" gracePeriod=30 Apr 23 18:20:02.687636 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.687598 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"] Apr 23 18:20:02.687982 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.687969 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kserve-container" Apr 23 18:20:02.688035 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.687984 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kserve-container" Apr 23 18:20:02.688035 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.688000 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="storage-initializer" Apr 23 18:20:02.688035 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.688006 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="storage-initializer" Apr 23 18:20:02.688035 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.688015 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kube-rbac-proxy" Apr 23 18:20:02.688035 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:20:02.688020 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kube-rbac-proxy" Apr 23 18:20:02.688216 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.688076 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kserve-container" Apr 23 18:20:02.688216 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.688085 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9207e288-1692-44d0-8c9d-e21ac58a2087" containerName="kube-rbac-proxy" Apr 23 18:20:02.691400 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.691382 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.693340 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.693318 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 23 18:20:02.693518 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.693344 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 23 18:20:02.703844 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.703812 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"] Apr 23 18:20:02.774567 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.774519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27c590c-e8c3-4443-bbe2-10461cacd2f9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.774758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.774685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x26n4\" (UniqueName: \"kubernetes.io/projected/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kube-api-access-x26n4\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.774758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.774734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27c590c-e8c3-4443-bbe2-10461cacd2f9-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.774849 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.774761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.876328 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.876239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x26n4\" (UniqueName: \"kubernetes.io/projected/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kube-api-access-x26n4\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.876328 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:20:02.876286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27c590c-e8c3-4443-bbe2-10461cacd2f9-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.876328 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.876310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.876328 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.876331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27c590c-e8c3-4443-bbe2-10461cacd2f9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.876781 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.876758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.877031 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.877008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27c590c-e8c3-4443-bbe2-10461cacd2f9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.879493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.879435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27c590c-e8c3-4443-bbe2-10461cacd2f9-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:02.884258 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:02.884232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x26n4\" (UniqueName: \"kubernetes.io/projected/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kube-api-access-x26n4\") pod \"isvc-sklearn-mcp-predictor-6b96cd596-dv567\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:03.004795 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:03.004758 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:03.008836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:03.008803 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerID="f03758f10baa16e8159e209d832ac433fb8d02a6410f9cebe791bce155132bcb" exitCode=2 Apr 23 18:20:03.008978 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:03.008875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" event={"ID":"0ea1c030-692c-4cc6-8a41-0925c141a190","Type":"ContainerDied","Data":"f03758f10baa16e8159e209d832ac433fb8d02a6410f9cebe791bce155132bcb"} Apr 23 18:20:03.135106 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:03.135077 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"] Apr 23 18:20:03.137580 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:20:03.137554 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode27c590c_e8c3_4443_bbe2_10461cacd2f9.slice/crio-4dc9e4ed9ae2a8a9fc2c13e48784bab6d0ba3d26afddb715b60d75174b4b7f79 WatchSource:0}: Error finding container 4dc9e4ed9ae2a8a9fc2c13e48784bab6d0ba3d26afddb715b60d75174b4b7f79: Status 404 returned error can't find the container with id 4dc9e4ed9ae2a8a9fc2c13e48784bab6d0ba3d26afddb715b60d75174b4b7f79 Apr 23 18:20:04.015919 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:04.015875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerStarted","Data":"39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba"} Apr 23 18:20:04.015919 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:04.015921 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerStarted","Data":"4dc9e4ed9ae2a8a9fc2c13e48784bab6d0ba3d26afddb715b60d75174b4b7f79"} Apr 23 18:20:04.850745 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:04.850702 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.41:8643/healthz\": dial tcp 10.132.0.41:8643: connect: connection refused" Apr 23 18:20:05.897643 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:05.897598 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.41:8080/v2/models/isvc-mlflow-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 23 18:20:07.028722 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.028688 2576 generic.go:358] "Generic (PLEG): container finished" podID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerID="39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba" exitCode=0 Apr 23 18:20:07.029175 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.028757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerDied","Data":"39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba"} Apr 23 18:20:07.031077 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.031049 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ea1c030-692c-4cc6-8a41-0925c141a190" 
containerID="0ad8152585e61fbc4f74ecad920cbca5eef7c4d31b3ea0c458dba83b44de3f2c" exitCode=0 Apr 23 18:20:07.031193 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.031085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" event={"ID":"0ea1c030-692c-4cc6-8a41-0925c141a190","Type":"ContainerDied","Data":"0ad8152585e61fbc4f74ecad920cbca5eef7c4d31b3ea0c458dba83b44de3f2c"} Apr 23 18:20:07.070238 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.070216 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:20:07.217719 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.217635 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ea1c030-692c-4cc6-8a41-0925c141a190-proxy-tls\") pod \"0ea1c030-692c-4cc6-8a41-0925c141a190\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " Apr 23 18:20:07.217719 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.217671 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ea1c030-692c-4cc6-8a41-0925c141a190-kserve-provision-location\") pod \"0ea1c030-692c-4cc6-8a41-0925c141a190\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " Apr 23 18:20:07.217719 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.217696 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlv5f\" (UniqueName: \"kubernetes.io/projected/0ea1c030-692c-4cc6-8a41-0925c141a190-kube-api-access-hlv5f\") pod \"0ea1c030-692c-4cc6-8a41-0925c141a190\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " Apr 23 18:20:07.218012 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.217808 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ea1c030-692c-4cc6-8a41-0925c141a190-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"0ea1c030-692c-4cc6-8a41-0925c141a190\" (UID: \"0ea1c030-692c-4cc6-8a41-0925c141a190\") " Apr 23 18:20:07.218099 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.218069 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ea1c030-692c-4cc6-8a41-0925c141a190-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0ea1c030-692c-4cc6-8a41-0925c141a190" (UID: "0ea1c030-692c-4cc6-8a41-0925c141a190"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:20:07.218169 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.218147 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea1c030-692c-4cc6-8a41-0925c141a190-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "0ea1c030-692c-4cc6-8a41-0925c141a190" (UID: "0ea1c030-692c-4cc6-8a41-0925c141a190"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:20:07.219900 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.219873 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea1c030-692c-4cc6-8a41-0925c141a190-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0ea1c030-692c-4cc6-8a41-0925c141a190" (UID: "0ea1c030-692c-4cc6-8a41-0925c141a190"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:20:07.220016 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.219997 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea1c030-692c-4cc6-8a41-0925c141a190-kube-api-access-hlv5f" (OuterVolumeSpecName: "kube-api-access-hlv5f") pod "0ea1c030-692c-4cc6-8a41-0925c141a190" (UID: "0ea1c030-692c-4cc6-8a41-0925c141a190"). InnerVolumeSpecName "kube-api-access-hlv5f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:20:07.318788 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.318751 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0ea1c030-692c-4cc6-8a41-0925c141a190-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:20:07.318788 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.318781 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ea1c030-692c-4cc6-8a41-0925c141a190-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:20:07.318788 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.318792 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ea1c030-692c-4cc6-8a41-0925c141a190-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:20:07.319017 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:07.318801 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hlv5f\" (UniqueName: \"kubernetes.io/projected/0ea1c030-692c-4cc6-8a41-0925c141a190-kube-api-access-hlv5f\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:20:08.036842 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:08.036802 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerStarted","Data":"ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f"} Apr 23 18:20:08.039089 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:08.038991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" event={"ID":"0ea1c030-692c-4cc6-8a41-0925c141a190","Type":"ContainerDied","Data":"7a05401b109e7277abf91ba1f336d531279b6a6cfaf854d8f2d71c8bfac5f1b8"} Apr 23 18:20:08.039089 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:08.039039 2576 scope.go:117] "RemoveContainer" containerID="f03758f10baa16e8159e209d832ac433fb8d02a6410f9cebe791bce155132bcb" Apr 23 18:20:08.039089 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:08.039067 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4" Apr 23 18:20:08.057378 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:08.057346 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4"] Apr 23 18:20:08.062176 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:08.062147 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-c28x4"] Apr 23 18:20:08.070629 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:08.070607 2576 scope.go:117] "RemoveContainer" containerID="0ad8152585e61fbc4f74ecad920cbca5eef7c4d31b3ea0c458dba83b44de3f2c" Apr 23 18:20:08.080741 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:08.080719 2576 scope.go:117] "RemoveContainer" containerID="32758aaed502724d4fbd4f8ca76e7350f7ae0603b331f85f7ce2603306d13904" Apr 23 18:20:09.045067 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:09.045026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerStarted","Data":"9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8"} Apr 23 18:20:09.045566 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:09.045073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerStarted","Data":"4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f"} Apr 23 18:20:09.045566 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:09.045211 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:09.045566 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:09.045346 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" Apr 23 18:20:09.067593 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:09.067547 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podStartSLOduration=5.328953655 podStartE2EDuration="7.067533805s" podCreationTimestamp="2026-04-23 18:20:02 +0000 UTC" firstStartedPulling="2026-04-23 18:20:07.107639525 +0000 UTC m=+1296.274966822" lastFinishedPulling="2026-04-23 18:20:08.846219673 +0000 UTC m=+1298.013546972" observedRunningTime="2026-04-23 18:20:09.065556721 +0000 UTC m=+1298.232884039" watchObservedRunningTime="2026-04-23 18:20:09.067533805 +0000 UTC m=+1298.234861125" Apr 23 18:20:09.415705 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:09.415671 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" path="/var/lib/kubelet/pods/0ea1c030-692c-4cc6-8a41-0925c141a190/volumes" Apr 23 18:20:10.051789 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:10.051760 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"
Apr 23 18:20:16.060735 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:16.060705 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"
Apr 23 18:20:46.062559 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:20:46.062476 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"
Apr 23 18:21:16.063367 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:16.063326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"
Apr 23 18:21:22.924208 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:22.924174 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"]
Apr 23 18:21:22.924698 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:22.924541 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-container" containerID="cri-o://ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f" gracePeriod=30
Apr 23 18:21:22.924698 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:22.924580 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy" containerID="cri-o://9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8" gracePeriod=30
Apr 23 18:21:22.924698 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:22.924596 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-agent" containerID="cri-o://4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f" gracePeriod=30
Apr 23 18:21:23.300265 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:23.300176 2576 generic.go:358] "Generic (PLEG): container finished" podID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerID="9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8" exitCode=2
Apr 23 18:21:23.300265 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:23.300246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerDied","Data":"9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8"}
Apr 23 18:21:25.309519 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:25.309404 2576 generic.go:358] "Generic (PLEG): container finished" podID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerID="ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f" exitCode=0
Apr 23 18:21:25.309519 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:25.309491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerDied","Data":"ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f"}
Apr 23 18:21:26.055910 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:26.055872 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused"
Apr 23 18:21:26.061959 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:26.061931 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.42:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 23 18:21:31.056070 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:31.056018 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused"
Apr 23 18:21:36.056022 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:36.055976 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused"
Apr 23 18:21:36.056415 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:36.056110 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"
Apr 23 18:21:36.061533 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:36.061509 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.42:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 23 18:21:41.056428 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:41.056382 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused"
Apr 23 18:21:46.055990 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:46.055950 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused"
Apr 23 18:21:46.061370 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:46.061340 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.42:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 23 18:21:46.061483 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:46.061452 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"
Apr 23 18:21:51.056521 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:51.056450 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused"
Apr 23 18:21:53.070940 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.070908 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"
Apr 23 18:21:53.115572 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.115538 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kserve-provision-location\") pod \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") "
Apr 23 18:21:53.115735 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.115583 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27c590c-e8c3-4443-bbe2-10461cacd2f9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") "
Apr 23 18:21:53.115735 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.115661 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x26n4\" (UniqueName: \"kubernetes.io/projected/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kube-api-access-x26n4\") pod \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") "
Apr 23 18:21:53.115735 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.115701 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27c590c-e8c3-4443-bbe2-10461cacd2f9-proxy-tls\") pod \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\" (UID: \"e27c590c-e8c3-4443-bbe2-10461cacd2f9\") "
Apr 23 18:21:53.115920 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.115893 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e27c590c-e8c3-4443-bbe2-10461cacd2f9" (UID: "e27c590c-e8c3-4443-bbe2-10461cacd2f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:21:53.115999 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.115979 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e27c590c-e8c3-4443-bbe2-10461cacd2f9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "e27c590c-e8c3-4443-bbe2-10461cacd2f9" (UID: "e27c590c-e8c3-4443-bbe2-10461cacd2f9"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:21:53.117876 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.117854 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27c590c-e8c3-4443-bbe2-10461cacd2f9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e27c590c-e8c3-4443-bbe2-10461cacd2f9" (UID: "e27c590c-e8c3-4443-bbe2-10461cacd2f9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:21:53.117978 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.117896 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kube-api-access-x26n4" (OuterVolumeSpecName: "kube-api-access-x26n4") pod "e27c590c-e8c3-4443-bbe2-10461cacd2f9" (UID: "e27c590c-e8c3-4443-bbe2-10461cacd2f9"). InnerVolumeSpecName "kube-api-access-x26n4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:21:53.216556 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.216453 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:21:53.216556 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.216507 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e27c590c-e8c3-4443-bbe2-10461cacd2f9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:21:53.216556 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.216520 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x26n4\" (UniqueName: \"kubernetes.io/projected/e27c590c-e8c3-4443-bbe2-10461cacd2f9-kube-api-access-x26n4\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:21:53.216556 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.216530 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e27c590c-e8c3-4443-bbe2-10461cacd2f9-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:21:53.404619 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.404586 2576 generic.go:358] "Generic (PLEG): container finished" podID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerID="4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f" exitCode=137
Apr 23 18:21:53.404784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.404677 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"
Apr 23 18:21:53.404784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.404669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerDied","Data":"4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f"}
Apr 23 18:21:53.404784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.404781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567" event={"ID":"e27c590c-e8c3-4443-bbe2-10461cacd2f9","Type":"ContainerDied","Data":"4dc9e4ed9ae2a8a9fc2c13e48784bab6d0ba3d26afddb715b60d75174b4b7f79"}
Apr 23 18:21:53.404902 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.404801 2576 scope.go:117] "RemoveContainer" containerID="9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8"
Apr 23 18:21:53.413422 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.413403 2576 scope.go:117] "RemoveContainer" containerID="4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f"
Apr 23 18:21:53.421301 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.421283 2576 scope.go:117] "RemoveContainer" containerID="ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f"
Apr 23 18:21:53.428360 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.428336 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"]
Apr 23 18:21:53.430185 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.430168 2576 scope.go:117] "RemoveContainer" containerID="39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba"
Apr 23 18:21:53.432212 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.432192 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6b96cd596-dv567"]
Apr 23 18:21:53.437683 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.437666 2576 scope.go:117] "RemoveContainer" containerID="9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8"
Apr 23 18:21:53.437945 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:21:53.437925 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8\": container with ID starting with 9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8 not found: ID does not exist" containerID="9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8"
Apr 23 18:21:53.438023 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.437955 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8"} err="failed to get container status \"9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8\": rpc error: code = NotFound desc = could not find container \"9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8\": container with ID starting with 9e35db602efcb26f28a03f95ff0d422ace2b61b906e35828d0fef797b71cccb8 not found: ID does not exist"
Apr 23 18:21:53.438023 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.437973 2576 scope.go:117] "RemoveContainer" containerID="4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f"
Apr 23 18:21:53.438193 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:21:53.438177 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f\": container with ID starting with 4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f not found: ID does not exist" containerID="4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f"
Apr 23 18:21:53.438231 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.438203 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f"} err="failed to get container status \"4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f\": rpc error: code = NotFound desc = could not find container \"4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f\": container with ID starting with 4895e6a3a01e4456c03274650534cec504fb21287c5b24233aa56732a8e7790f not found: ID does not exist"
Apr 23 18:21:53.438231 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.438218 2576 scope.go:117] "RemoveContainer" containerID="ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f"
Apr 23 18:21:53.438449 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:21:53.438435 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f\": container with ID starting with ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f not found: ID does not exist" containerID="ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f"
Apr 23 18:21:53.438519 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.438454 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f"} err="failed to get container status \"ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f\": rpc error: code = NotFound desc = could not find container \"ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f\": container with ID starting with ac10237ae650deb6e864aa53954c2b03f878db4b48c1f0cb6e4bedc902d7329f not found: ID does not exist"
Apr 23 18:21:53.438519 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.438487 2576 scope.go:117] "RemoveContainer" containerID="39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba"
Apr 23 18:21:53.438721 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:21:53.438701 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba\": container with ID starting with 39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba not found: ID does not exist" containerID="39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba"
Apr 23 18:21:53.438829 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:53.438724 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba"} err="failed to get container status \"39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba\": rpc error: code = NotFound desc = could not find container \"39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba\": container with ID starting with 39f72348700e1c273b838cb64bb20fbce1feefcd2c2832bac270252c78ed8aba not found: ID does not exist"
Apr 23 18:21:55.414805 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:21:55.414768 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" path="/var/lib/kubelet/pods/e27c590c-e8c3-4443-bbe2-10461cacd2f9/volumes"
Apr 23 18:23:31.415413 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:23:31.415383 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:23:31.415990 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:23:31.415791 2576 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:26:29.012784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.012749 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"]
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013138 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-container"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013153 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-container"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013167 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="storage-initializer"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013173 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="storage-initializer"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013188 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kserve-container"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013194 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kserve-container"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013200 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="storage-initializer"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013205 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="storage-initializer"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013215 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kube-rbac-proxy"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013220 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kube-rbac-proxy"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013226 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-agent"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013233 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-agent"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013254 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy"
Apr 23 18:26:29.013254 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013262 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy"
Apr 23 18:26:29.013701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013311 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kube-rbac-proxy"
Apr 23 18:26:29.013701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013324 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kube-rbac-proxy"
Apr 23 18:26:29.013701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013336 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-agent"
Apr 23 18:26:29.013701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013349 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ea1c030-692c-4cc6-8a41-0925c141a190" containerName="kserve-container"
Apr 23 18:26:29.013701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.013357 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e27c590c-e8c3-4443-bbe2-10461cacd2f9" containerName="kserve-container"
Apr 23 18:26:29.016491 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.016475 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.018577 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.018551 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\""
Apr 23 18:26:29.018718 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.018625 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 18:26:29.019097 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.019076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\""
Apr 23 18:26:29.019178 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.019081 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\""
Apr 23 18:26:29.019385 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.019370 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 18:26:29.026846 ip-10-0-137-157 kubenswrapper[2576]: I0423
18:26:29.026825 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"]
Apr 23 18:26:29.123648 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.123596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f2a0aee-f558-42dc-99ad-a9a489426746-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.123849 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.123707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f2a0aee-f558-42dc-99ad-a9a489426746-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.123849 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.123742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2a0aee-f558-42dc-99ad-a9a489426746-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.123849 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.123785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fc5l\" (UniqueName: \"kubernetes.io/projected/8f2a0aee-f558-42dc-99ad-a9a489426746-kube-api-access-2fc5l\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.224668 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.224628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f2a0aee-f558-42dc-99ad-a9a489426746-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.224668 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.224667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2a0aee-f558-42dc-99ad-a9a489426746-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.224887 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.224712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fc5l\" (UniqueName: \"kubernetes.io/projected/8f2a0aee-f558-42dc-99ad-a9a489426746-kube-api-access-2fc5l\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.224887 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.224762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f2a0aee-f558-42dc-99ad-a9a489426746-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.225119 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.225094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f2a0aee-f558-42dc-99ad-a9a489426746-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.225343 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.225326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f2a0aee-f558-42dc-99ad-a9a489426746-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.227447 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.227421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2a0aee-f558-42dc-99ad-a9a489426746-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.233358 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.233332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fc5l\" (UniqueName: \"kubernetes.io/projected/8f2a0aee-f558-42dc-99ad-a9a489426746-kube-api-access-2fc5l\") pod \"isvc-pmml-runtime-predictor-67bc544947-dttgg\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.328454 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.328418 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:29.457521 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.457339 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"]
Apr 23 18:26:29.460275 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:26:29.460252 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2a0aee_f558_42dc_99ad_a9a489426746.slice/crio-92be9257bfdabfde6f8aeb28df7b66513b48ae5386a25399d4db6495dbe10ed2 WatchSource:0}: Error finding container 92be9257bfdabfde6f8aeb28df7b66513b48ae5386a25399d4db6495dbe10ed2: Status 404 returned error can't find the container with id 92be9257bfdabfde6f8aeb28df7b66513b48ae5386a25399d4db6495dbe10ed2
Apr 23 18:26:29.462153 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:29.462136 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:26:30.354348 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:30.354307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" event={"ID":"8f2a0aee-f558-42dc-99ad-a9a489426746","Type":"ContainerStarted","Data":"cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da"}
Apr 23 18:26:30.354348 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:30.354346 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" event={"ID":"8f2a0aee-f558-42dc-99ad-a9a489426746","Type":"ContainerStarted","Data":"92be9257bfdabfde6f8aeb28df7b66513b48ae5386a25399d4db6495dbe10ed2"}
Apr 23 18:26:34.369232 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:34.369195 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerID="cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da" exitCode=0
Apr 23 18:26:34.369706 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:34.369271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" event={"ID":"8f2a0aee-f558-42dc-99ad-a9a489426746","Type":"ContainerDied","Data":"cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da"}
Apr 23 18:26:41.400137 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:41.400105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" event={"ID":"8f2a0aee-f558-42dc-99ad-a9a489426746","Type":"ContainerStarted","Data":"6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453"}
Apr 23 18:26:41.400554 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:41.400146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" event={"ID":"8f2a0aee-f558-42dc-99ad-a9a489426746","Type":"ContainerStarted","Data":"84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210"}
Apr 23 18:26:41.400554 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:41.400437 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:41.400651 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:41.400599 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:41.401852 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:41.401826 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 23 18:26:41.431147 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:41.431104 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podStartSLOduration=6.57416314 podStartE2EDuration="13.431092528s" podCreationTimestamp="2026-04-23 18:26:28 +0000 UTC" firstStartedPulling="2026-04-23 18:26:34.370404302 +0000 UTC m=+1683.537731599" lastFinishedPulling="2026-04-23 18:26:41.227333689 +0000 UTC m=+1690.394660987" observedRunningTime="2026-04-23 18:26:41.430159755 +0000 UTC m=+1690.597487074" watchObservedRunningTime="2026-04-23 18:26:41.431092528 +0000 UTC m=+1690.598419847"
Apr 23 18:26:42.403550 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:42.403513 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 23 18:26:47.407985 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:47.407956 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"
Apr 23 18:26:47.408595 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:47.408566 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 23 18:26:57.409123 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:26:57.409080 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 23 18:27:07.408599 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:27:07.408559 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 23 18:27:17.408591 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:27:17.408550 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 23 18:27:27.409228 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:27:27.409186 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 23 18:27:37.408556 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:27:37.408515 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 23 18:27:47.408522 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:27:47.408480 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 23 18:27:57.408834 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:27:57.408795 2576 prober.go:120] "Probe
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 23 18:28:06.411380 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:06.411343 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" Apr 23 18:28:09.956912 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:09.956833 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"] Apr 23 18:28:09.957325 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:09.957105 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" containerID="cri-o://84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210" gracePeriod=30 Apr 23 18:28:09.957325 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:09.957150 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kube-rbac-proxy" containerID="cri-o://6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453" gracePeriod=30 Apr 23 18:28:10.700352 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:10.700309 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerID="6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453" exitCode=2 Apr 23 18:28:10.700546 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:10.700376 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" 
event={"ID":"8f2a0aee-f558-42dc-99ad-a9a489426746","Type":"ContainerDied","Data":"6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453"} Apr 23 18:28:12.403758 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:12.403717 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.43:8643/healthz\": dial tcp 10.132.0.43:8643: connect: connection refused" Apr 23 18:28:13.606729 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.606696 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" Apr 23 18:28:13.707095 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.707016 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f2a0aee-f558-42dc-99ad-a9a489426746-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"8f2a0aee-f558-42dc-99ad-a9a489426746\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " Apr 23 18:28:13.707095 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.707058 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f2a0aee-f558-42dc-99ad-a9a489426746-kserve-provision-location\") pod \"8f2a0aee-f558-42dc-99ad-a9a489426746\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " Apr 23 18:28:13.707282 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.707119 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fc5l\" (UniqueName: \"kubernetes.io/projected/8f2a0aee-f558-42dc-99ad-a9a489426746-kube-api-access-2fc5l\") pod \"8f2a0aee-f558-42dc-99ad-a9a489426746\" (UID: 
\"8f2a0aee-f558-42dc-99ad-a9a489426746\") " Apr 23 18:28:13.707282 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.707168 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2a0aee-f558-42dc-99ad-a9a489426746-proxy-tls\") pod \"8f2a0aee-f558-42dc-99ad-a9a489426746\" (UID: \"8f2a0aee-f558-42dc-99ad-a9a489426746\") " Apr 23 18:28:13.707501 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.707447 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f2a0aee-f558-42dc-99ad-a9a489426746-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "8f2a0aee-f558-42dc-99ad-a9a489426746" (UID: "8f2a0aee-f558-42dc-99ad-a9a489426746"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:28:13.707596 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.707535 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2a0aee-f558-42dc-99ad-a9a489426746-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f2a0aee-f558-42dc-99ad-a9a489426746" (UID: "8f2a0aee-f558-42dc-99ad-a9a489426746"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:28:13.709607 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.709578 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f2a0aee-f558-42dc-99ad-a9a489426746-kube-api-access-2fc5l" (OuterVolumeSpecName: "kube-api-access-2fc5l") pod "8f2a0aee-f558-42dc-99ad-a9a489426746" (UID: "8f2a0aee-f558-42dc-99ad-a9a489426746"). InnerVolumeSpecName "kube-api-access-2fc5l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:28:13.709821 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.709778 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f2a0aee-f558-42dc-99ad-a9a489426746-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8f2a0aee-f558-42dc-99ad-a9a489426746" (UID: "8f2a0aee-f558-42dc-99ad-a9a489426746"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:28:13.712258 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.712235 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerID="84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210" exitCode=0 Apr 23 18:28:13.712350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.712331 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" Apr 23 18:28:13.712350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.712326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" event={"ID":"8f2a0aee-f558-42dc-99ad-a9a489426746","Type":"ContainerDied","Data":"84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210"} Apr 23 18:28:13.712421 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.712376 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg" event={"ID":"8f2a0aee-f558-42dc-99ad-a9a489426746","Type":"ContainerDied","Data":"92be9257bfdabfde6f8aeb28df7b66513b48ae5386a25399d4db6495dbe10ed2"} Apr 23 18:28:13.712421 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.712399 2576 scope.go:117] "RemoveContainer" containerID="6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453" Apr 23 18:28:13.726016 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.725999 
2576 scope.go:117] "RemoveContainer" containerID="84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210" Apr 23 18:28:13.733659 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.733639 2576 scope.go:117] "RemoveContainer" containerID="cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da" Apr 23 18:28:13.736795 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.736775 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"] Apr 23 18:28:13.740307 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.740283 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-dttgg"] Apr 23 18:28:13.741524 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.741513 2576 scope.go:117] "RemoveContainer" containerID="6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453" Apr 23 18:28:13.741788 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:28:13.741767 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453\": container with ID starting with 6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453 not found: ID does not exist" containerID="6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453" Apr 23 18:28:13.741870 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.741798 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453"} err="failed to get container status \"6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453\": rpc error: code = NotFound desc = could not find container \"6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453\": container with ID starting with 6999d9f64ec2442834dfa113095579e9c7eb934e524612570f942e035f7dd453 not 
found: ID does not exist" Apr 23 18:28:13.741870 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.741822 2576 scope.go:117] "RemoveContainer" containerID="84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210" Apr 23 18:28:13.742099 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:28:13.742083 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210\": container with ID starting with 84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210 not found: ID does not exist" containerID="84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210" Apr 23 18:28:13.742143 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.742104 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210"} err="failed to get container status \"84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210\": rpc error: code = NotFound desc = could not find container \"84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210\": container with ID starting with 84dff9fcd1b56beab0a32a1dae42e5db8989a6852a2f7fe28a36196de7f82210 not found: ID does not exist" Apr 23 18:28:13.742143 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.742120 2576 scope.go:117] "RemoveContainer" containerID="cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da" Apr 23 18:28:13.742330 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:28:13.742313 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da\": container with ID starting with cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da not found: ID does not exist" 
containerID="cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da" Apr 23 18:28:13.742371 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.742336 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da"} err="failed to get container status \"cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da\": rpc error: code = NotFound desc = could not find container \"cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da\": container with ID starting with cbc05df17dec03988d7bb4ee79a0f58808c96c931c003b34d35f217ca1b929da not found: ID does not exist" Apr 23 18:28:13.808057 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.808028 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2fc5l\" (UniqueName: \"kubernetes.io/projected/8f2a0aee-f558-42dc-99ad-a9a489426746-kube-api-access-2fc5l\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:28:13.808057 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.808054 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f2a0aee-f558-42dc-99ad-a9a489426746-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:28:13.808218 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.808066 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f2a0aee-f558-42dc-99ad-a9a489426746-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:28:13.808218 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:13.808075 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f2a0aee-f558-42dc-99ad-a9a489426746-kserve-provision-location\") on node 
\"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:28:15.415414 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:15.415379 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" path="/var/lib/kubelet/pods/8f2a0aee-f558-42dc-99ad-a9a489426746/volumes" Apr 23 18:28:31.440639 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:31.440609 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:28:31.442362 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:28:31.442343 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:29:41.194319 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.194231 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk"] Apr 23 18:29:41.196739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.194604 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="storage-initializer" Apr 23 18:29:41.196739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.194615 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="storage-initializer" Apr 23 18:29:41.196739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.194626 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" Apr 23 18:29:41.196739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.194631 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" Apr 23 18:29:41.196739 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:29:41.194639 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kube-rbac-proxy" Apr 23 18:29:41.196739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.194644 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kube-rbac-proxy" Apr 23 18:29:41.196739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.194693 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kserve-container" Apr 23 18:29:41.196739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.194704 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f2a0aee-f558-42dc-99ad-a9a489426746" containerName="kube-rbac-proxy" Apr 23 18:29:41.197675 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.197657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.199664 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.199642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-a7ec27-predictor-serving-cert\"" Apr 23 18:29:41.199766 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.199712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\"" Apr 23 18:29:41.200104 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.200083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:29:41.200165 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.200152 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-a7ec27-kube-rbac-proxy-sar-config\"" Apr 23 18:29:41.200223 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:29:41.200185 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:29:41.207539 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.207516 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk"] Apr 23 18:29:41.354062 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.354025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b95792bc-8615-4390-a0a9-cfbc6696207b-proxy-tls\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.354062 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.354076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b95792bc-8615-4390-a0a9-cfbc6696207b-kserve-provision-location\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.354284 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.354179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6rcc\" (UniqueName: \"kubernetes.io/projected/b95792bc-8615-4390-a0a9-cfbc6696207b-kube-api-access-r6rcc\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.354284 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.354250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-primary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b95792bc-8615-4390-a0a9-cfbc6696207b-isvc-primary-a7ec27-kube-rbac-proxy-sar-config\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.455032 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.454926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b95792bc-8615-4390-a0a9-cfbc6696207b-kserve-provision-location\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.455032 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.455005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6rcc\" (UniqueName: \"kubernetes.io/projected/b95792bc-8615-4390-a0a9-cfbc6696207b-kube-api-access-r6rcc\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.455272 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.455076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b95792bc-8615-4390-a0a9-cfbc6696207b-isvc-primary-a7ec27-kube-rbac-proxy-sar-config\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.455272 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.455110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/b95792bc-8615-4390-a0a9-cfbc6696207b-proxy-tls\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.455385 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.455358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b95792bc-8615-4390-a0a9-cfbc6696207b-kserve-provision-location\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.455722 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.455700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b95792bc-8615-4390-a0a9-cfbc6696207b-isvc-primary-a7ec27-kube-rbac-proxy-sar-config\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.457807 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.457782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b95792bc-8615-4390-a0a9-cfbc6696207b-proxy-tls\") pod \"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.463285 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.463265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6rcc\" (UniqueName: \"kubernetes.io/projected/b95792bc-8615-4390-a0a9-cfbc6696207b-kube-api-access-r6rcc\") pod 
\"isvc-primary-a7ec27-predictor-5c556c75f4-4spnk\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.509636 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.509603 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:41.633618 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:41.633577 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk"] Apr 23 18:29:41.636000 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:29:41.635970 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb95792bc_8615_4390_a0a9_cfbc6696207b.slice/crio-0c87adc4c13b9cdb002802f72bc407a26019d7c8a90ee79bbdf1f0f57923b046 WatchSource:0}: Error finding container 0c87adc4c13b9cdb002802f72bc407a26019d7c8a90ee79bbdf1f0f57923b046: Status 404 returned error can't find the container with id 0c87adc4c13b9cdb002802f72bc407a26019d7c8a90ee79bbdf1f0f57923b046 Apr 23 18:29:42.011297 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:42.011256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" event={"ID":"b95792bc-8615-4390-a0a9-cfbc6696207b","Type":"ContainerStarted","Data":"5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587"} Apr 23 18:29:42.011297 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:42.011294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" event={"ID":"b95792bc-8615-4390-a0a9-cfbc6696207b","Type":"ContainerStarted","Data":"0c87adc4c13b9cdb002802f72bc407a26019d7c8a90ee79bbdf1f0f57923b046"} Apr 23 18:29:46.025162 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:46.025127 2576 generic.go:358] 
"Generic (PLEG): container finished" podID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerID="5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587" exitCode=0 Apr 23 18:29:46.025560 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:46.025209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" event={"ID":"b95792bc-8615-4390-a0a9-cfbc6696207b","Type":"ContainerDied","Data":"5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587"} Apr 23 18:29:47.030337 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:47.030301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" event={"ID":"b95792bc-8615-4390-a0a9-cfbc6696207b","Type":"ContainerStarted","Data":"584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499"} Apr 23 18:29:47.030337 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:47.030339 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" event={"ID":"b95792bc-8615-4390-a0a9-cfbc6696207b","Type":"ContainerStarted","Data":"93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4"} Apr 23 18:29:47.030788 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:47.030566 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:47.060769 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:47.060722 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podStartSLOduration=6.0607075009999996 podStartE2EDuration="6.060707501s" podCreationTimestamp="2026-04-23 18:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:29:47.057300326 +0000 
UTC m=+1876.224627646" watchObservedRunningTime="2026-04-23 18:29:47.060707501 +0000 UTC m=+1876.228034820" Apr 23 18:29:48.033953 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:48.033916 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:48.035220 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:48.035192 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 18:29:49.037114 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:49.037077 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 18:29:54.041414 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:54.041379 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:29:54.042030 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:29:54.041999 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 18:30:04.042183 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:30:04.042143 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 18:30:14.042710 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:30:14.042667 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 18:30:24.042223 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:30:24.042182 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 18:30:34.042772 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:30:34.042726 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 18:30:44.042149 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:30:44.042112 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 23 18:30:54.043414 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:30:54.043386 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:31:01.364016 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.363964 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"] Apr 23 18:31:01.367455 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.367439 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.369959 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.369933 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-a7ec27-predictor-serving-cert\"" Apr 23 18:31:01.370093 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.369958 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 18:31:01.370093 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.369988 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-a7ec27-dockercfg-9bcfw\"" Apr 23 18:31:01.370309 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.370292 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-a7ec27\"" Apr 23 18:31:01.370359 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.370334 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\"" Apr 23 18:31:01.377695 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.377672 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"] Apr 23 18:31:01.532084 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.532049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kserve-provision-location\") pod 
\"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.532286 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.532094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-proxy-tls\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.532286 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.532149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.532286 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.532165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfjtf\" (UniqueName: \"kubernetes.io/projected/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kube-api-access-mfjtf\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.532286 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.532256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-cabundle-cert\") pod 
\"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.632961 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.632868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.632961 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.632905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfjtf\" (UniqueName: \"kubernetes.io/projected/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kube-api-access-mfjtf\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.632961 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.632946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-cabundle-cert\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.633234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.632980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kserve-provision-location\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: 
\"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.633234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.633027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-proxy-tls\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.633515 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.633490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kserve-provision-location\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.633600 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.633572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:01.633681 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.633665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-cabundle-cert\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" 
Apr 23 18:31:01.635685 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.635665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-proxy-tls\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"
Apr 23 18:31:01.640478 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.640440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfjtf\" (UniqueName: \"kubernetes.io/projected/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kube-api-access-mfjtf\") pod \"isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") " pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"
Apr 23 18:31:01.679067 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.679026 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"
Apr 23 18:31:01.809513 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:01.809475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"]
Apr 23 18:31:01.812757 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:31:01.812729 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc7dffb_e6b1_40b1_af37_9823a13f8c94.slice/crio-d68a5aae43699bed979250675297a4d2c6ebc1ce9214817d2b158ea781619c91 WatchSource:0}: Error finding container d68a5aae43699bed979250675297a4d2c6ebc1ce9214817d2b158ea781619c91: Status 404 returned error can't find the container with id d68a5aae43699bed979250675297a4d2c6ebc1ce9214817d2b158ea781619c91
Apr 23 18:31:02.290945 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:02.290901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" event={"ID":"3dc7dffb-e6b1-40b1-af37-9823a13f8c94","Type":"ContainerStarted","Data":"f79c25d7f262bfb9b8d0f30302c0efe8840ea8131f918711cfdd780826101e27"}
Apr 23 18:31:02.290945 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:02.290941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" event={"ID":"3dc7dffb-e6b1-40b1-af37-9823a13f8c94","Type":"ContainerStarted","Data":"d68a5aae43699bed979250675297a4d2c6ebc1ce9214817d2b158ea781619c91"}
Apr 23 18:31:07.310520 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:07.310495 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_3dc7dffb-e6b1-40b1-af37-9823a13f8c94/storage-initializer/0.log"
Apr 23 18:31:07.310882 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:07.310534 2576 generic.go:358] "Generic (PLEG): container finished" podID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94" containerID="f79c25d7f262bfb9b8d0f30302c0efe8840ea8131f918711cfdd780826101e27" exitCode=1
Apr 23 18:31:07.310882 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:07.310577 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" event={"ID":"3dc7dffb-e6b1-40b1-af37-9823a13f8c94","Type":"ContainerDied","Data":"f79c25d7f262bfb9b8d0f30302c0efe8840ea8131f918711cfdd780826101e27"}
Apr 23 18:31:08.315887 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:08.315860 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_3dc7dffb-e6b1-40b1-af37-9823a13f8c94/storage-initializer/0.log"
Apr 23 18:31:08.316271 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:08.315928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" event={"ID":"3dc7dffb-e6b1-40b1-af37-9823a13f8c94","Type":"ContainerStarted","Data":"39faba0cf4af1fe88138606e6fb394e33558d567eff73f08886ab64ac1daf85e"}
Apr 23 18:31:12.331747 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:12.331717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_3dc7dffb-e6b1-40b1-af37-9823a13f8c94/storage-initializer/1.log"
Apr 23 18:31:12.332195 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:12.332100 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_3dc7dffb-e6b1-40b1-af37-9823a13f8c94/storage-initializer/0.log"
Apr 23 18:31:12.332195 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:12.332132 2576 generic.go:358] "Generic (PLEG): container finished" podID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94" containerID="39faba0cf4af1fe88138606e6fb394e33558d567eff73f08886ab64ac1daf85e" exitCode=1
Apr 23 18:31:12.332195 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:12.332169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" event={"ID":"3dc7dffb-e6b1-40b1-af37-9823a13f8c94","Type":"ContainerDied","Data":"39faba0cf4af1fe88138606e6fb394e33558d567eff73f08886ab64ac1daf85e"}
Apr 23 18:31:12.332347 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:12.332199 2576 scope.go:117] "RemoveContainer" containerID="f79c25d7f262bfb9b8d0f30302c0efe8840ea8131f918711cfdd780826101e27"
Apr 23 18:31:12.332709 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:12.332678 2576 scope.go:117] "RemoveContainer" containerID="f79c25d7f262bfb9b8d0f30302c0efe8840ea8131f918711cfdd780826101e27"
Apr 23 18:31:12.344823 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:12.344790 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_kserve-ci-e2e-test_3dc7dffb-e6b1-40b1-af37-9823a13f8c94_0 in pod sandbox d68a5aae43699bed979250675297a4d2c6ebc1ce9214817d2b158ea781619c91 from index: no such id: 'f79c25d7f262bfb9b8d0f30302c0efe8840ea8131f918711cfdd780826101e27'" containerID="f79c25d7f262bfb9b8d0f30302c0efe8840ea8131f918711cfdd780826101e27"
Apr 23 18:31:12.344903 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:12.344839 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79c25d7f262bfb9b8d0f30302c0efe8840ea8131f918711cfdd780826101e27"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_kserve-ci-e2e-test_3dc7dffb-e6b1-40b1-af37-9823a13f8c94_0 in pod sandbox d68a5aae43699bed979250675297a4d2c6ebc1ce9214817d2b158ea781619c91 from index: no such id: 'f79c25d7f262bfb9b8d0f30302c0efe8840ea8131f918711cfdd780826101e27'"
Apr 23 18:31:12.345053 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:12.345032 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_kserve-ci-e2e-test(3dc7dffb-e6b1-40b1-af37-9823a13f8c94)\"" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" podUID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94"
Apr 23 18:31:13.337299 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:13.337270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_3dc7dffb-e6b1-40b1-af37-9823a13f8c94/storage-initializer/1.log"
Apr 23 18:31:19.426868 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.426831 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"]
Apr 23 18:31:19.489443 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.489409 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk"]
Apr 23 18:31:19.489890 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.489837 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container" containerID="cri-o://93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4" gracePeriod=30
Apr 23 18:31:19.489890 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.489868 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kube-rbac-proxy" containerID="cri-o://584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499" gracePeriod=30
Apr 23 18:31:19.551092 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.551048 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"]
Apr 23 18:31:19.556799 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.556777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.558933 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.558905 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-760fb5\""
Apr 23 18:31:19.559071 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.558943 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\""
Apr 23 18:31:19.559071 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.558905 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-760fb5-predictor-serving-cert\""
Apr 23 18:31:19.559215 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.559201 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-760fb5-dockercfg-95kmx\""
Apr 23 18:31:19.565193 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.565165 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"]
Apr 23 18:31:19.612836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.612814 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_3dc7dffb-e6b1-40b1-af37-9823a13f8c94/storage-initializer/1.log"
Apr 23 18:31:19.612923 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.612881 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"
Apr 23 18:31:19.680316 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.680222 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-proxy-tls\") pod \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") "
Apr 23 18:31:19.680316 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.680308 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfjtf\" (UniqueName: \"kubernetes.io/projected/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kube-api-access-mfjtf\") pod \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") "
Apr 23 18:31:19.680605 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.680385 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-cabundle-cert\") pod \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") "
Apr 23 18:31:19.680605 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.680514 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\") pod \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") "
Apr 23 18:31:19.680605 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.680563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kserve-provision-location\") pod \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\" (UID: \"3dc7dffb-e6b1-40b1-af37-9823a13f8c94\") "
Apr 23 18:31:19.680793 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.680767 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "3dc7dffb-e6b1-40b1-af37-9823a13f8c94" (UID: "3dc7dffb-e6b1-40b1-af37-9823a13f8c94"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:31:19.681041 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.680860 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-isvc-secondary-a7ec27-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-a7ec27-kube-rbac-proxy-sar-config") pod "3dc7dffb-e6b1-40b1-af37-9823a13f8c94" (UID: "3dc7dffb-e6b1-40b1-af37-9823a13f8c94"). InnerVolumeSpecName "isvc-secondary-a7ec27-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:31:19.681154 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.680956 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3dc7dffb-e6b1-40b1-af37-9823a13f8c94" (UID: "3dc7dffb-e6b1-40b1-af37-9823a13f8c94"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:31:19.681154 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.680823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kserve-provision-location\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.681394 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.681276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.681479 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.681438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-cabundle-cert\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.681529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.681510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkwg\" (UniqueName: \"kubernetes.io/projected/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kube-api-access-hwkwg\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.681583 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.681548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e9d9f63-f6c9-4dec-81f2-822b78a62332-proxy-tls\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.681676 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.681659 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-cabundle-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:31:19.681734 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.681684 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-isvc-secondary-a7ec27-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:31:19.681734 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.681700 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:31:19.682850 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.682824 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kube-api-access-mfjtf" (OuterVolumeSpecName: "kube-api-access-mfjtf") pod "3dc7dffb-e6b1-40b1-af37-9823a13f8c94" (UID: "3dc7dffb-e6b1-40b1-af37-9823a13f8c94"). InnerVolumeSpecName "kube-api-access-mfjtf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:31:19.682927 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.682899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3dc7dffb-e6b1-40b1-af37-9823a13f8c94" (UID: "3dc7dffb-e6b1-40b1-af37-9823a13f8c94"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:31:19.782492 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.782445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kserve-provision-location\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.782666 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.782508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.782666 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.782549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-cabundle-cert\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.782666 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.782574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkwg\" (UniqueName: \"kubernetes.io/projected/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kube-api-access-hwkwg\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.782666 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.782604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e9d9f63-f6c9-4dec-81f2-822b78a62332-proxy-tls\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.782878 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.782673 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:31:19.782878 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.782690 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfjtf\" (UniqueName: \"kubernetes.io/projected/3dc7dffb-e6b1-40b1-af37-9823a13f8c94-kube-api-access-mfjtf\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:31:19.782878 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:19.782808 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-serving-cert: secret "isvc-init-fail-760fb5-predictor-serving-cert" not found
Apr 23 18:31:19.783045 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:19.782877 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e9d9f63-f6c9-4dec-81f2-822b78a62332-proxy-tls podName:9e9d9f63-f6c9-4dec-81f2-822b78a62332 nodeName:}" failed. No retries permitted until 2026-04-23 18:31:20.282856427 +0000 UTC m=+1969.450183740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9e9d9f63-f6c9-4dec-81f2-822b78a62332-proxy-tls") pod "isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" (UID: "9e9d9f63-f6c9-4dec-81f2-822b78a62332") : secret "isvc-init-fail-760fb5-predictor-serving-cert" not found
Apr 23 18:31:19.783045 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.782898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kserve-provision-location\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.783252 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.783228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-cabundle-cert\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.783292 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.783234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:19.792719 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:19.792692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkwg\" (UniqueName: \"kubernetes.io/projected/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kube-api-access-hwkwg\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:20.287134 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.287092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e9d9f63-f6c9-4dec-81f2-822b78a62332-proxy-tls\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:20.289762 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.289741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e9d9f63-f6c9-4dec-81f2-822b78a62332-proxy-tls\") pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:20.364551 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.364515 2576 generic.go:358] "Generic (PLEG): container finished" podID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerID="584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499" exitCode=2
Apr 23 18:31:20.364719 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.364587 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" event={"ID":"b95792bc-8615-4390-a0a9-cfbc6696207b","Type":"ContainerDied","Data":"584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499"}
Apr 23 18:31:20.365691 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.365673 2576
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c_3dc7dffb-e6b1-40b1-af37-9823a13f8c94/storage-initializer/1.log" Apr 23 18:31:20.365812 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.365727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" event={"ID":"3dc7dffb-e6b1-40b1-af37-9823a13f8c94","Type":"ContainerDied","Data":"d68a5aae43699bed979250675297a4d2c6ebc1ce9214817d2b158ea781619c91"} Apr 23 18:31:20.365812 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.365751 2576 scope.go:117] "RemoveContainer" containerID="39faba0cf4af1fe88138606e6fb394e33558d567eff73f08886ab64ac1daf85e" Apr 23 18:31:20.365812 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.365784 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c" Apr 23 18:31:20.400849 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.400818 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"] Apr 23 18:31:20.406667 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.406640 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a7ec27-predictor-64bc5f5c49-4ht5c"] Apr 23 18:31:20.477267 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.477235 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" Apr 23 18:31:20.602847 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:20.602816 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"] Apr 23 18:31:20.605612 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:31:20.605578 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e9d9f63_f6c9_4dec_81f2_822b78a62332.slice/crio-2c5a02d741dc27c6ef41664f01d1bace1f5b32f50027ea3eb356e9086775eeb3 WatchSource:0}: Error finding container 2c5a02d741dc27c6ef41664f01d1bace1f5b32f50027ea3eb356e9086775eeb3: Status 404 returned error can't find the container with id 2c5a02d741dc27c6ef41664f01d1bace1f5b32f50027ea3eb356e9086775eeb3 Apr 23 18:31:21.373622 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:21.373576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" event={"ID":"9e9d9f63-f6c9-4dec-81f2-822b78a62332","Type":"ContainerStarted","Data":"ac9e0ec0488a0bdbe4fdcee645d984d01ed35df30ee6f2312c6d4a666622f846"} Apr 23 18:31:21.373622 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:21.373620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" event={"ID":"9e9d9f63-f6c9-4dec-81f2-822b78a62332","Type":"ContainerStarted","Data":"2c5a02d741dc27c6ef41664f01d1bace1f5b32f50027ea3eb356e9086775eeb3"} Apr 23 18:31:21.414960 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:21.414927 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94" path="/var/lib/kubelet/pods/3dc7dffb-e6b1-40b1-af37-9823a13f8c94/volumes" Apr 23 18:31:23.834058 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:23.834037 2576 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:31:23.923533 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:23.923503 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b95792bc-8615-4390-a0a9-cfbc6696207b-isvc-primary-a7ec27-kube-rbac-proxy-sar-config\") pod \"b95792bc-8615-4390-a0a9-cfbc6696207b\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " Apr 23 18:31:23.923700 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:23.923563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6rcc\" (UniqueName: \"kubernetes.io/projected/b95792bc-8615-4390-a0a9-cfbc6696207b-kube-api-access-r6rcc\") pod \"b95792bc-8615-4390-a0a9-cfbc6696207b\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " Apr 23 18:31:23.923700 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:23.923597 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b95792bc-8615-4390-a0a9-cfbc6696207b-kserve-provision-location\") pod \"b95792bc-8615-4390-a0a9-cfbc6696207b\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " Apr 23 18:31:23.923700 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:23.923628 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b95792bc-8615-4390-a0a9-cfbc6696207b-proxy-tls\") pod \"b95792bc-8615-4390-a0a9-cfbc6696207b\" (UID: \"b95792bc-8615-4390-a0a9-cfbc6696207b\") " Apr 23 18:31:23.923921 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:23.923898 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b95792bc-8615-4390-a0a9-cfbc6696207b-isvc-primary-a7ec27-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-primary-a7ec27-kube-rbac-proxy-sar-config") pod "b95792bc-8615-4390-a0a9-cfbc6696207b" (UID: "b95792bc-8615-4390-a0a9-cfbc6696207b"). InnerVolumeSpecName "isvc-primary-a7ec27-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:31:23.924000 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:23.923964 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b95792bc-8615-4390-a0a9-cfbc6696207b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b95792bc-8615-4390-a0a9-cfbc6696207b" (UID: "b95792bc-8615-4390-a0a9-cfbc6696207b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:31:23.925836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:23.925813 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b95792bc-8615-4390-a0a9-cfbc6696207b-kube-api-access-r6rcc" (OuterVolumeSpecName: "kube-api-access-r6rcc") pod "b95792bc-8615-4390-a0a9-cfbc6696207b" (UID: "b95792bc-8615-4390-a0a9-cfbc6696207b"). InnerVolumeSpecName "kube-api-access-r6rcc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:31:23.925926 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:23.925871 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b95792bc-8615-4390-a0a9-cfbc6696207b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b95792bc-8615-4390-a0a9-cfbc6696207b" (UID: "b95792bc-8615-4390-a0a9-cfbc6696207b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:31:24.025156 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.025118 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r6rcc\" (UniqueName: \"kubernetes.io/projected/b95792bc-8615-4390-a0a9-cfbc6696207b-kube-api-access-r6rcc\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:31:24.025156 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.025148 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b95792bc-8615-4390-a0a9-cfbc6696207b-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:31:24.025156 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.025159 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b95792bc-8615-4390-a0a9-cfbc6696207b-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:31:24.025380 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.025168 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-a7ec27-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b95792bc-8615-4390-a0a9-cfbc6696207b-isvc-primary-a7ec27-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:31:24.386798 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.386762 2576 generic.go:358] "Generic (PLEG): container finished" podID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerID="93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4" exitCode=0 Apr 23 18:31:24.387006 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.386841 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" Apr 23 18:31:24.387006 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.386849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" event={"ID":"b95792bc-8615-4390-a0a9-cfbc6696207b","Type":"ContainerDied","Data":"93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4"} Apr 23 18:31:24.387006 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.386894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk" event={"ID":"b95792bc-8615-4390-a0a9-cfbc6696207b","Type":"ContainerDied","Data":"0c87adc4c13b9cdb002802f72bc407a26019d7c8a90ee79bbdf1f0f57923b046"} Apr 23 18:31:24.387006 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.386915 2576 scope.go:117] "RemoveContainer" containerID="584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499" Apr 23 18:31:24.395686 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.395669 2576 scope.go:117] "RemoveContainer" containerID="93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4" Apr 23 18:31:24.402995 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.402979 2576 scope.go:117] "RemoveContainer" containerID="5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587" Apr 23 18:31:24.408556 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.408533 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk"] Apr 23 18:31:24.412033 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.412016 2576 scope.go:117] "RemoveContainer" containerID="584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499" Apr 23 18:31:24.412298 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:24.412280 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499\": container with ID starting with 584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499 not found: ID does not exist" containerID="584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499" Apr 23 18:31:24.412369 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.412335 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a7ec27-predictor-5c556c75f4-4spnk"] Apr 23 18:31:24.412369 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.412312 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499"} err="failed to get container status \"584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499\": rpc error: code = NotFound desc = could not find container \"584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499\": container with ID starting with 584394899c5b7a5074a236f2c51778641bf8f8a76e6aec284416073c23802499 not found: ID does not exist" Apr 23 18:31:24.412455 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.412377 2576 scope.go:117] "RemoveContainer" containerID="93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4" Apr 23 18:31:24.412686 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:24.412667 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4\": container with ID starting with 93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4 not found: ID does not exist" containerID="93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4" Apr 23 18:31:24.412730 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.412692 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4"} err="failed to get container status \"93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4\": rpc error: code = NotFound desc = could not find container \"93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4\": container with ID starting with 93c0588a5564dfeae88d03292a47caf2f5953e470d26beb9324c593ae143e5e4 not found: ID does not exist" Apr 23 18:31:24.412730 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.412707 2576 scope.go:117] "RemoveContainer" containerID="5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587" Apr 23 18:31:24.412959 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:24.412936 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587\": container with ID starting with 5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587 not found: ID does not exist" containerID="5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587" Apr 23 18:31:24.413000 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:24.412966 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587"} err="failed to get container status \"5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587\": rpc error: code = NotFound desc = could not find container \"5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587\": container with ID starting with 5a59c7bf6d8e8a98155a6d9514d86b09bc1c2c5176337f398dae19491f4ac587 not found: ID does not exist" Apr 23 18:31:25.391783 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:25.391756 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_9e9d9f63-f6c9-4dec-81f2-822b78a62332/storage-initializer/0.log" Apr 23 18:31:25.392257 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:25.391792 2576 generic.go:358] "Generic (PLEG): container finished" podID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" containerID="ac9e0ec0488a0bdbe4fdcee645d984d01ed35df30ee6f2312c6d4a666622f846" exitCode=1 Apr 23 18:31:25.392257 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:25.391872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" event={"ID":"9e9d9f63-f6c9-4dec-81f2-822b78a62332","Type":"ContainerDied","Data":"ac9e0ec0488a0bdbe4fdcee645d984d01ed35df30ee6f2312c6d4a666622f846"} Apr 23 18:31:25.417249 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:25.417223 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" path="/var/lib/kubelet/pods/b95792bc-8615-4390-a0a9-cfbc6696207b/volumes" Apr 23 18:31:26.398283 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:26.398257 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_9e9d9f63-f6c9-4dec-81f2-822b78a62332/storage-initializer/0.log" Apr 23 18:31:26.398680 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:26.398379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" event={"ID":"9e9d9f63-f6c9-4dec-81f2-822b78a62332","Type":"ContainerStarted","Data":"e4ec344ace161bf0409c734210e3d1dbf635b9e4ad719cfec54449e49311e984"} Apr 23 18:31:29.409953 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:29.409923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_9e9d9f63-f6c9-4dec-81f2-822b78a62332/storage-initializer/1.log" Apr 23 18:31:29.410377 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:29.410248 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_9e9d9f63-f6c9-4dec-81f2-822b78a62332/storage-initializer/0.log" Apr 23 18:31:29.410377 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:29.410274 2576 generic.go:358] "Generic (PLEG): container finished" podID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" containerID="e4ec344ace161bf0409c734210e3d1dbf635b9e4ad719cfec54449e49311e984" exitCode=1 Apr 23 18:31:29.414203 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:29.414172 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" event={"ID":"9e9d9f63-f6c9-4dec-81f2-822b78a62332","Type":"ContainerDied","Data":"e4ec344ace161bf0409c734210e3d1dbf635b9e4ad719cfec54449e49311e984"} Apr 23 18:31:29.414330 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:29.414215 2576 scope.go:117] "RemoveContainer" containerID="ac9e0ec0488a0bdbe4fdcee645d984d01ed35df30ee6f2312c6d4a666622f846" Apr 23 18:31:29.414683 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:29.414664 2576 scope.go:117] "RemoveContainer" containerID="ac9e0ec0488a0bdbe4fdcee645d984d01ed35df30ee6f2312c6d4a666622f846" Apr 23 18:31:29.425168 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:29.425133 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_kserve-ci-e2e-test_9e9d9f63-f6c9-4dec-81f2-822b78a62332_0 in pod sandbox 2c5a02d741dc27c6ef41664f01d1bace1f5b32f50027ea3eb356e9086775eeb3 from index: no such id: 'ac9e0ec0488a0bdbe4fdcee645d984d01ed35df30ee6f2312c6d4a666622f846'" containerID="ac9e0ec0488a0bdbe4fdcee645d984d01ed35df30ee6f2312c6d4a666622f846" Apr 23 18:31:29.425278 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:29.425185 2576 kuberuntime_container.go:951] 
"Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_kserve-ci-e2e-test_9e9d9f63-f6c9-4dec-81f2-822b78a62332_0 in pod sandbox 2c5a02d741dc27c6ef41664f01d1bace1f5b32f50027ea3eb356e9086775eeb3 from index: no such id: 'ac9e0ec0488a0bdbe4fdcee645d984d01ed35df30ee6f2312c6d4a666622f846'; Skipping pod \"isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_kserve-ci-e2e-test(9e9d9f63-f6c9-4dec-81f2-822b78a62332)\"" logger="UnhandledError" Apr 23 18:31:29.426534 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:31:29.426513 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_kserve-ci-e2e-test(9e9d9f63-f6c9-4dec-81f2-822b78a62332)\"" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" podUID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" Apr 23 18:31:29.560214 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:29.560184 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"] Apr 23 18:31:30.414835 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.414808 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_9e9d9f63-f6c9-4dec-81f2-822b78a62332/storage-initializer/1.log" Apr 23 18:31:30.542750 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.542730 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_9e9d9f63-f6c9-4dec-81f2-822b78a62332/storage-initializer/1.log" Apr 23 18:31:30.542860 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.542797 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" Apr 23 18:31:30.680718 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.680637 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwkwg\" (UniqueName: \"kubernetes.io/projected/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kube-api-access-hwkwg\") pod \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " Apr 23 18:31:30.680718 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.680673 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kserve-provision-location\") pod \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " Apr 23 18:31:30.680718 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.680700 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-cabundle-cert\") pod \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " Apr 23 18:31:30.680718 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.680721 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\") pod \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " Apr 23 18:31:30.680985 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.680756 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e9d9f63-f6c9-4dec-81f2-822b78a62332-proxy-tls\") pod 
\"9e9d9f63-f6c9-4dec-81f2-822b78a62332\" (UID: \"9e9d9f63-f6c9-4dec-81f2-822b78a62332\") " Apr 23 18:31:30.681038 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.681016 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9e9d9f63-f6c9-4dec-81f2-822b78a62332" (UID: "9e9d9f63-f6c9-4dec-81f2-822b78a62332"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:31:30.681114 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.681097 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9e9d9f63-f6c9-4dec-81f2-822b78a62332" (UID: "9e9d9f63-f6c9-4dec-81f2-822b78a62332"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:31:30.681197 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.681174 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-isvc-init-fail-760fb5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-760fb5-kube-rbac-proxy-sar-config") pod "9e9d9f63-f6c9-4dec-81f2-822b78a62332" (UID: "9e9d9f63-f6c9-4dec-81f2-822b78a62332"). InnerVolumeSpecName "isvc-init-fail-760fb5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:31:30.683098 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.683074 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kube-api-access-hwkwg" (OuterVolumeSpecName: "kube-api-access-hwkwg") pod "9e9d9f63-f6c9-4dec-81f2-822b78a62332" (UID: "9e9d9f63-f6c9-4dec-81f2-822b78a62332"). 
InnerVolumeSpecName "kube-api-access-hwkwg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:31:30.683220 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.683110 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9d9f63-f6c9-4dec-81f2-822b78a62332-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9e9d9f63-f6c9-4dec-81f2-822b78a62332" (UID: "9e9d9f63-f6c9-4dec-81f2-822b78a62332"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:31:30.781955 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.781917 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-cabundle-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:31:30.781955 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.781949 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e9d9f63-f6c9-4dec-81f2-822b78a62332-isvc-init-fail-760fb5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:31:30.781955 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.781961 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e9d9f63-f6c9-4dec-81f2-822b78a62332-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:31:30.782184 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.781972 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwkwg\" (UniqueName: \"kubernetes.io/projected/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kube-api-access-hwkwg\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:31:30.782184 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:30.781981 2576 reconciler_common.go:299] "Volume detached for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e9d9f63-f6c9-4dec-81f2-822b78a62332-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:31:31.418610 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:31.418583 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-760fb5-predictor-844fcfcd87-t866s_9e9d9f63-f6c9-4dec-81f2-822b78a62332/storage-initializer/1.log" Apr 23 18:31:31.419083 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:31.418664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s" event={"ID":"9e9d9f63-f6c9-4dec-81f2-822b78a62332","Type":"ContainerDied","Data":"2c5a02d741dc27c6ef41664f01d1bace1f5b32f50027ea3eb356e9086775eeb3"} Apr 23 18:31:31.419083 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:31.418692 2576 scope.go:117] "RemoveContainer" containerID="e4ec344ace161bf0409c734210e3d1dbf635b9e4ad719cfec54449e49311e984" Apr 23 18:31:31.419083 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:31.418703 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"
Apr 23 18:31:31.453108 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:31.453077 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"]
Apr 23 18:31:31.456632 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:31.456605 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-760fb5-predictor-844fcfcd87-t866s"]
Apr 23 18:31:33.415490 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:31:33.415438 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" path="/var/lib/kubelet/pods/9e9d9f63-f6c9-4dec-81f2-822b78a62332/volumes"
Apr 23 18:33:31.464621 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:33:31.464590 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:33:31.468230 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:33:31.468210 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:38:31.488208 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:38:31.488181 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:38:31.493961 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:38:31.493940 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:40:53.163205 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163168 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"]
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163594 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163608 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163616 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kube-rbac-proxy"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163622 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kube-rbac-proxy"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163629 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163635 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163646 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163655 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163671 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163679 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163699 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163705 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163713 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163718 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163782 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163790 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163797 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kserve-container"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163807 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b95792bc-8615-4390-a0a9-cfbc6696207b" containerName="kube-rbac-proxy"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163918 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e9d9f63-f6c9-4dec-81f2-822b78a62332" containerName="storage-initializer"
Apr 23 18:40:53.164529 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.163926 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dc7dffb-e6b1-40b1-af37-9823a13f8c94" containerName="storage-initializer"
Apr 23 18:40:53.167114 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.167096 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.169399 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.169376 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\""
Apr 23 18:40:53.169532 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.169380 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\""
Apr 23 18:40:53.169532 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.169418 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 18:40:53.169532 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.169388 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\""
Apr 23 18:40:53.170269 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.170248 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 18:40:53.178257 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.178236 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"]
Apr 23 18:40:53.261677 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.261640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec770698-9d17-4526-b07d-f8a324c46007-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.261860 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.261709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrmb4\" (UniqueName: \"kubernetes.io/projected/ec770698-9d17-4526-b07d-f8a324c46007-kube-api-access-wrmb4\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.261860 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.261755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec770698-9d17-4526-b07d-f8a324c46007-proxy-tls\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.261860 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.261801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec770698-9d17-4526-b07d-f8a324c46007-kserve-provision-location\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.363178 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.363138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrmb4\" (UniqueName: \"kubernetes.io/projected/ec770698-9d17-4526-b07d-f8a324c46007-kube-api-access-wrmb4\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.363350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.363205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec770698-9d17-4526-b07d-f8a324c46007-proxy-tls\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.363350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.363242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec770698-9d17-4526-b07d-f8a324c46007-kserve-provision-location\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.363350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.363273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec770698-9d17-4526-b07d-f8a324c46007-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.363350 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:40:53.363319 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-predictor-serving-cert: secret "isvc-sklearn-predictor-serving-cert" not found
Apr 23 18:40:53.363571 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:40:53.363389 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec770698-9d17-4526-b07d-f8a324c46007-proxy-tls podName:ec770698-9d17-4526-b07d-f8a324c46007 nodeName:}" failed. No retries permitted until 2026-04-23 18:40:53.863366813 +0000 UTC m=+2543.030694114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ec770698-9d17-4526-b07d-f8a324c46007-proxy-tls") pod "isvc-sklearn-predictor-67d5fd9dd5-ht89b" (UID: "ec770698-9d17-4526-b07d-f8a324c46007") : secret "isvc-sklearn-predictor-serving-cert" not found
Apr 23 18:40:53.363743 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.363723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec770698-9d17-4526-b07d-f8a324c46007-kserve-provision-location\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.363974 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.363957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec770698-9d17-4526-b07d-f8a324c46007-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.371843 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.371822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrmb4\" (UniqueName: \"kubernetes.io/projected/ec770698-9d17-4526-b07d-f8a324c46007-kube-api-access-wrmb4\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.868512 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.868451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec770698-9d17-4526-b07d-f8a324c46007-proxy-tls\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:53.871007 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:53.870986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec770698-9d17-4526-b07d-f8a324c46007-proxy-tls\") pod \"isvc-sklearn-predictor-67d5fd9dd5-ht89b\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:54.078221 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:54.078187 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:54.203775 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:54.203744 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"]
Apr 23 18:40:54.206686 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:40:54.206656 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec770698_9d17_4526_b07d_f8a324c46007.slice/crio-22fac79cf2a13b6aa587fc80b99a436264dc5f2b373bd63c5ab85934a13b384e WatchSource:0}: Error finding container 22fac79cf2a13b6aa587fc80b99a436264dc5f2b373bd63c5ab85934a13b384e: Status 404 returned error can't find the container with id 22fac79cf2a13b6aa587fc80b99a436264dc5f2b373bd63c5ab85934a13b384e
Apr 23 18:40:54.208587 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:54.208566 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:40:54.333757 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:54.333719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" event={"ID":"ec770698-9d17-4526-b07d-f8a324c46007","Type":"ContainerStarted","Data":"4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c"}
Apr 23 18:40:54.333757 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:54.333756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" event={"ID":"ec770698-9d17-4526-b07d-f8a324c46007","Type":"ContainerStarted","Data":"22fac79cf2a13b6aa587fc80b99a436264dc5f2b373bd63c5ab85934a13b384e"}
Apr 23 18:40:58.349095 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:58.349064 2576 generic.go:358] "Generic (PLEG): container finished" podID="ec770698-9d17-4526-b07d-f8a324c46007" containerID="4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c" exitCode=0
Apr 23 18:40:58.349441 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:58.349148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" event={"ID":"ec770698-9d17-4526-b07d-f8a324c46007","Type":"ContainerDied","Data":"4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c"}
Apr 23 18:40:59.355221 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:59.355182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" event={"ID":"ec770698-9d17-4526-b07d-f8a324c46007","Type":"ContainerStarted","Data":"7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600"}
Apr 23 18:40:59.355221 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:59.355224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" event={"ID":"ec770698-9d17-4526-b07d-f8a324c46007","Type":"ContainerStarted","Data":"b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518"}
Apr 23 18:40:59.355699 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:59.355508 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:59.355699 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:59.355616 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:40:59.356669 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:59.356640 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 18:40:59.372598 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:40:59.372557 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podStartSLOduration=6.372544639 podStartE2EDuration="6.372544639s" podCreationTimestamp="2026-04-23 18:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:40:59.372398861 +0000 UTC m=+2548.539726179" watchObservedRunningTime="2026-04-23 18:40:59.372544639 +0000 UTC m=+2548.539871957"
Apr 23 18:41:00.358783 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:41:00.358744 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 18:41:05.364556 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:41:05.363739 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:41:05.369089 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:41:05.369053 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 18:41:15.365387 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:41:15.365346 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 18:41:25.366030 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:41:25.365982 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 18:41:35.365538 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:41:35.365495 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 18:41:45.365131 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:41:45.365042 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 18:41:55.365497 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:41:55.365441 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 18:42:05.365644 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:05.365614 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:42:10.003161 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:10.003122 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67cf9fcb56-mznb7"]
Apr 23 18:42:13.312042 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.312009 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"]
Apr 23 18:42:13.312540 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.312450 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" containerID="cri-o://b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518" gracePeriod=30
Apr 23 18:42:13.312619 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.312549 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kube-rbac-proxy" containerID="cri-o://7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600" gracePeriod=30
Apr 23 18:42:13.447044 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.447010 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"]
Apr 23 18:42:13.450549 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.450531 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.452337 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.452315 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\""
Apr 23 18:42:13.452337 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.452331 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 23 18:42:13.460583 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.460557 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"]
Apr 23 18:42:13.538831 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.538799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed9390f7-ada6-4059-8643-57b6bff2d505-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.539003 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.538842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed9390f7-ada6-4059-8643-57b6bff2d505-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.539003 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.538870 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed9390f7-ada6-4059-8643-57b6bff2d505-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.539003 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.538923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4klv\" (UniqueName: \"kubernetes.io/projected/ed9390f7-ada6-4059-8643-57b6bff2d505-kube-api-access-p4klv\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.601509 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.601447 2576 generic.go:358] "Generic (PLEG): container finished" podID="ec770698-9d17-4526-b07d-f8a324c46007" containerID="7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600" exitCode=2
Apr 23 18:42:13.601675 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.601491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" event={"ID":"ec770698-9d17-4526-b07d-f8a324c46007","Type":"ContainerDied","Data":"7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600"}
Apr 23 18:42:13.639515 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.639483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed9390f7-ada6-4059-8643-57b6bff2d505-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.639609 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.639532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed9390f7-ada6-4059-8643-57b6bff2d505-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.639609 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.639564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed9390f7-ada6-4059-8643-57b6bff2d505-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.639609 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.639596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4klv\" (UniqueName: \"kubernetes.io/projected/ed9390f7-ada6-4059-8643-57b6bff2d505-kube-api-access-p4klv\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.639759 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:42:13.639681 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-serving-cert: secret "sklearn-v2-mlserver-predictor-serving-cert" not found
Apr 23 18:42:13.639805 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:42:13.639762 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed9390f7-ada6-4059-8643-57b6bff2d505-proxy-tls podName:ed9390f7-ada6-4059-8643-57b6bff2d505 nodeName:}" failed. No retries permitted until 2026-04-23 18:42:14.13973995 +0000 UTC m=+2623.307067255 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ed9390f7-ada6-4059-8643-57b6bff2d505-proxy-tls") pod "sklearn-v2-mlserver-predictor-65d8664766-jfgq7" (UID: "ed9390f7-ada6-4059-8643-57b6bff2d505") : secret "sklearn-v2-mlserver-predictor-serving-cert" not found
Apr 23 18:42:13.640039 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.640017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed9390f7-ada6-4059-8643-57b6bff2d505-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.640190 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.640170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed9390f7-ada6-4059-8643-57b6bff2d505-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:13.649737 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:13.649713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4klv\" (UniqueName: \"kubernetes.io/projected/ed9390f7-ada6-4059-8643-57b6bff2d505-kube-api-access-p4klv\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:14.145727 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:14.145684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed9390f7-ada6-4059-8643-57b6bff2d505-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:14.148309 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:14.148288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed9390f7-ada6-4059-8643-57b6bff2d505-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-jfgq7\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:14.363173 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:14.363117 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"
Apr 23 18:42:14.490717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:14.490690 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"]
Apr 23 18:42:14.493529 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:42:14.493499 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded9390f7_ada6_4059_8643_57b6bff2d505.slice/crio-1cc64862de60c729ba93b4efce499117c910d04e1cfd5486db2e75921dae9d14 WatchSource:0}: Error finding container 1cc64862de60c729ba93b4efce499117c910d04e1cfd5486db2e75921dae9d14: Status 404 returned error can't find the container with id 1cc64862de60c729ba93b4efce499117c910d04e1cfd5486db2e75921dae9d14
Apr 23 18:42:14.607228 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:14.607183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" event={"ID":"ed9390f7-ada6-4059-8643-57b6bff2d505","Type":"ContainerStarted","Data":"4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64"}
Apr 23 18:42:14.607228 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:14.607220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" event={"ID":"ed9390f7-ada6-4059-8643-57b6bff2d505","Type":"ContainerStarted","Data":"1cc64862de60c729ba93b4efce499117c910d04e1cfd5486db2e75921dae9d14"}
Apr 23 18:42:15.359823 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:15.359782 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.47:8643/healthz\": dial tcp 10.132.0.47:8643: connect: connection refused"
Apr 23 18:42:15.365524 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:15.365497 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 23 18:42:17.550299 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.550276 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"
Apr 23 18:42:17.618410 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.618376 2576 generic.go:358] "Generic (PLEG): container finished" podID="ec770698-9d17-4526-b07d-f8a324c46007" containerID="b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518" exitCode=0
Apr 23 18:42:17.618595 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.618409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" event={"ID":"ec770698-9d17-4526-b07d-f8a324c46007","Type":"ContainerDied","Data":"b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518"}
Apr 23 18:42:17.618595 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.618446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" event={"ID":"ec770698-9d17-4526-b07d-f8a324c46007","Type":"ContainerDied","Data":"22fac79cf2a13b6aa587fc80b99a436264dc5f2b373bd63c5ab85934a13b384e"}
Apr 23 18:42:17.618595 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.618485 2576 scope.go:117] "RemoveContainer" containerID="7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600"
Apr 23 18:42:17.618595 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.618505 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b" Apr 23 18:42:17.626584 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.626567 2576 scope.go:117] "RemoveContainer" containerID="b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518" Apr 23 18:42:17.634109 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.634093 2576 scope.go:117] "RemoveContainer" containerID="4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c" Apr 23 18:42:17.641098 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.641084 2576 scope.go:117] "RemoveContainer" containerID="7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600" Apr 23 18:42:17.641344 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:42:17.641325 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600\": container with ID starting with 7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600 not found: ID does not exist" containerID="7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600" Apr 23 18:42:17.641380 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.641355 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600"} err="failed to get container status \"7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600\": rpc error: code = NotFound desc = could not find container \"7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600\": container with ID starting with 7fc74bf74004d5af1f1018f22e0ff8cc12334dfd9286f41c4c326af99a5d4600 not found: ID does not exist" Apr 23 18:42:17.641380 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.641373 2576 scope.go:117] "RemoveContainer" containerID="b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518" Apr 23 
18:42:17.641649 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:42:17.641621 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518\": container with ID starting with b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518 not found: ID does not exist" containerID="b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518" Apr 23 18:42:17.641649 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.641644 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518"} err="failed to get container status \"b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518\": rpc error: code = NotFound desc = could not find container \"b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518\": container with ID starting with b11534a183988bae8942a1472c9e33bc4d3425c5f9bb2b91a511ef4e77dd6518 not found: ID does not exist" Apr 23 18:42:17.641792 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.641660 2576 scope.go:117] "RemoveContainer" containerID="4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c" Apr 23 18:42:17.641873 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:42:17.641858 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c\": container with ID starting with 4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c not found: ID does not exist" containerID="4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c" Apr 23 18:42:17.641910 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.641877 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c"} err="failed to get container status \"4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c\": rpc error: code = NotFound desc = could not find container \"4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c\": container with ID starting with 4873f328e235a721d7d80a82635cd1019f5ed15b68549eab18f6fc2672d9982c not found: ID does not exist" Apr 23 18:42:17.678691 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.678671 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrmb4\" (UniqueName: \"kubernetes.io/projected/ec770698-9d17-4526-b07d-f8a324c46007-kube-api-access-wrmb4\") pod \"ec770698-9d17-4526-b07d-f8a324c46007\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " Apr 23 18:42:17.678788 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.678762 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec770698-9d17-4526-b07d-f8a324c46007-proxy-tls\") pod \"ec770698-9d17-4526-b07d-f8a324c46007\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " Apr 23 18:42:17.678788 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.678784 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec770698-9d17-4526-b07d-f8a324c46007-kserve-provision-location\") pod \"ec770698-9d17-4526-b07d-f8a324c46007\" (UID: \"ec770698-9d17-4526-b07d-f8a324c46007\") " Apr 23 18:42:17.678861 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.678824 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec770698-9d17-4526-b07d-f8a324c46007-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"ec770698-9d17-4526-b07d-f8a324c46007\" (UID: 
\"ec770698-9d17-4526-b07d-f8a324c46007\") " Apr 23 18:42:17.679109 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.679084 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec770698-9d17-4526-b07d-f8a324c46007-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ec770698-9d17-4526-b07d-f8a324c46007" (UID: "ec770698-9d17-4526-b07d-f8a324c46007"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:42:17.679195 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.679171 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec770698-9d17-4526-b07d-f8a324c46007-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "ec770698-9d17-4526-b07d-f8a324c46007" (UID: "ec770698-9d17-4526-b07d-f8a324c46007"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:42:17.680836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.680818 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec770698-9d17-4526-b07d-f8a324c46007-kube-api-access-wrmb4" (OuterVolumeSpecName: "kube-api-access-wrmb4") pod "ec770698-9d17-4526-b07d-f8a324c46007" (UID: "ec770698-9d17-4526-b07d-f8a324c46007"). InnerVolumeSpecName "kube-api-access-wrmb4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:42:17.680957 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.680940 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec770698-9d17-4526-b07d-f8a324c46007-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ec770698-9d17-4526-b07d-f8a324c46007" (UID: "ec770698-9d17-4526-b07d-f8a324c46007"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:42:17.779795 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.779763 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ec770698-9d17-4526-b07d-f8a324c46007-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:17.779795 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.779791 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wrmb4\" (UniqueName: \"kubernetes.io/projected/ec770698-9d17-4526-b07d-f8a324c46007-kube-api-access-wrmb4\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:17.779980 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.779802 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec770698-9d17-4526-b07d-f8a324c46007-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:17.779980 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.779812 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec770698-9d17-4526-b07d-f8a324c46007-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:17.940546 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.940516 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"] Apr 23 18:42:17.944188 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:17.944163 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-67d5fd9dd5-ht89b"] Apr 23 18:42:18.623667 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:18.623631 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed9390f7-ada6-4059-8643-57b6bff2d505" 
containerID="4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64" exitCode=0 Apr 23 18:42:18.623667 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:18.623668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" event={"ID":"ed9390f7-ada6-4059-8643-57b6bff2d505","Type":"ContainerDied","Data":"4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64"} Apr 23 18:42:19.415399 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:19.415350 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec770698-9d17-4526-b07d-f8a324c46007" path="/var/lib/kubelet/pods/ec770698-9d17-4526-b07d-f8a324c46007/volumes" Apr 23 18:42:19.628511 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:19.628452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" event={"ID":"ed9390f7-ada6-4059-8643-57b6bff2d505","Type":"ContainerStarted","Data":"9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7"} Apr 23 18:42:19.628908 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:19.628521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" event={"ID":"ed9390f7-ada6-4059-8643-57b6bff2d505","Type":"ContainerStarted","Data":"8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2"} Apr 23 18:42:19.628908 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:19.628746 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" Apr 23 18:42:19.628908 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:19.628884 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" Apr 23 18:42:19.648202 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:19.648158 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" podStartSLOduration=6.648144275 podStartE2EDuration="6.648144275s" podCreationTimestamp="2026-04-23 18:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:42:19.645816561 +0000 UTC m=+2628.813143880" watchObservedRunningTime="2026-04-23 18:42:19.648144275 +0000 UTC m=+2628.815471648" Apr 23 18:42:25.637282 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:25.637251 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" Apr 23 18:42:35.027810 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.027771 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67cf9fcb56-mznb7" podUID="fa6b8cee-9a30-43a6-8532-16762f460f8a" containerName="console" containerID="cri-o://fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a" gracePeriod=15 Apr 23 18:42:35.271165 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.271144 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67cf9fcb56-mznb7_fa6b8cee-9a30-43a6-8532-16762f460f8a/console/0.log" Apr 23 18:42:35.271269 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.271204 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:42:35.320523 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.320490 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-oauth-serving-cert\") pod \"fa6b8cee-9a30-43a6-8532-16762f460f8a\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " Apr 23 18:42:35.320701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.320533 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-serving-cert\") pod \"fa6b8cee-9a30-43a6-8532-16762f460f8a\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " Apr 23 18:42:35.320701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.320586 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-trusted-ca-bundle\") pod \"fa6b8cee-9a30-43a6-8532-16762f460f8a\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " Apr 23 18:42:35.320701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.320609 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-config\") pod \"fa6b8cee-9a30-43a6-8532-16762f460f8a\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " Apr 23 18:42:35.320701 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.320629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-service-ca\") pod \"fa6b8cee-9a30-43a6-8532-16762f460f8a\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " Apr 23 18:42:35.320701 ip-10-0-137-157 
kubenswrapper[2576]: I0423 18:42:35.320678 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-oauth-config\") pod \"fa6b8cee-9a30-43a6-8532-16762f460f8a\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " Apr 23 18:42:35.320968 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.320814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjf8s\" (UniqueName: \"kubernetes.io/projected/fa6b8cee-9a30-43a6-8532-16762f460f8a-kube-api-access-fjf8s\") pod \"fa6b8cee-9a30-43a6-8532-16762f460f8a\" (UID: \"fa6b8cee-9a30-43a6-8532-16762f460f8a\") " Apr 23 18:42:35.321099 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.321005 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fa6b8cee-9a30-43a6-8532-16762f460f8a" (UID: "fa6b8cee-9a30-43a6-8532-16762f460f8a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:42:35.321099 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.321064 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-config" (OuterVolumeSpecName: "console-config") pod "fa6b8cee-9a30-43a6-8532-16762f460f8a" (UID: "fa6b8cee-9a30-43a6-8532-16762f460f8a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:42:35.321189 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.321097 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-service-ca" (OuterVolumeSpecName: "service-ca") pod "fa6b8cee-9a30-43a6-8532-16762f460f8a" (UID: "fa6b8cee-9a30-43a6-8532-16762f460f8a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:42:35.321189 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.321109 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fa6b8cee-9a30-43a6-8532-16762f460f8a" (UID: "fa6b8cee-9a30-43a6-8532-16762f460f8a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:42:35.321189 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.321171 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-oauth-serving-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:35.321299 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.321188 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:35.321299 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.321203 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-service-ca\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:35.322976 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.322958 2576 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fa6b8cee-9a30-43a6-8532-16762f460f8a" (UID: "fa6b8cee-9a30-43a6-8532-16762f460f8a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:42:35.323095 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.323080 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fa6b8cee-9a30-43a6-8532-16762f460f8a" (UID: "fa6b8cee-9a30-43a6-8532-16762f460f8a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:42:35.323222 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.323201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b8cee-9a30-43a6-8532-16762f460f8a-kube-api-access-fjf8s" (OuterVolumeSpecName: "kube-api-access-fjf8s") pod "fa6b8cee-9a30-43a6-8532-16762f460f8a" (UID: "fa6b8cee-9a30-43a6-8532-16762f460f8a"). InnerVolumeSpecName "kube-api-access-fjf8s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:42:35.421739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.421573 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-oauth-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:35.421739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.421600 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjf8s\" (UniqueName: \"kubernetes.io/projected/fa6b8cee-9a30-43a6-8532-16762f460f8a-kube-api-access-fjf8s\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:35.421739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.421615 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa6b8cee-9a30-43a6-8532-16762f460f8a-console-serving-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:35.421739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.421631 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa6b8cee-9a30-43a6-8532-16762f460f8a-trusted-ca-bundle\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:42:35.682875 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.682796 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67cf9fcb56-mznb7_fa6b8cee-9a30-43a6-8532-16762f460f8a/console/0.log" Apr 23 18:42:35.682875 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.682834 2576 generic.go:358] "Generic (PLEG): container finished" podID="fa6b8cee-9a30-43a6-8532-16762f460f8a" containerID="fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a" exitCode=2 Apr 23 18:42:35.683096 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.682888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-67cf9fcb56-mznb7" event={"ID":"fa6b8cee-9a30-43a6-8532-16762f460f8a","Type":"ContainerDied","Data":"fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a"} Apr 23 18:42:35.683096 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.682910 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cf9fcb56-mznb7" event={"ID":"fa6b8cee-9a30-43a6-8532-16762f460f8a","Type":"ContainerDied","Data":"3df1120f35bfff1f2a17379c6b45f8db4d73e0aead6cf697f1ee76c5bda4968c"} Apr 23 18:42:35.683096 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.682909 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67cf9fcb56-mznb7" Apr 23 18:42:35.683096 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.682990 2576 scope.go:117] "RemoveContainer" containerID="fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a" Apr 23 18:42:35.694171 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.694147 2576 scope.go:117] "RemoveContainer" containerID="fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a" Apr 23 18:42:35.694492 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:42:35.694473 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a\": container with ID starting with fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a not found: ID does not exist" containerID="fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a" Apr 23 18:42:35.694552 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.694501 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a"} err="failed to get container status \"fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a\": rpc error: code = 
NotFound desc = could not find container \"fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a\": container with ID starting with fee78e802b6b43e0bfbaf3436b72eefe7b49fbdce97bb2e4bcf63607f6cac07a not found: ID does not exist" Apr 23 18:42:35.700887 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.700861 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67cf9fcb56-mznb7"] Apr 23 18:42:35.702951 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:35.702930 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67cf9fcb56-mznb7"] Apr 23 18:42:37.415157 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:37.415123 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6b8cee-9a30-43a6-8532-16762f460f8a" path="/var/lib/kubelet/pods/fa6b8cee-9a30-43a6-8532-16762f460f8a/volumes" Apr 23 18:42:55.674339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:42:55.674291 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 18:43:05.640349 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:05.640320 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" Apr 23 18:43:13.469167 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.469081 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"] Apr 23 18:43:13.469723 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.469504 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kserve-container" 
containerID="cri-o://8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2" gracePeriod=30 Apr 23 18:43:13.469829 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.469803 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kube-rbac-proxy" containerID="cri-o://9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7" gracePeriod=30 Apr 23 18:43:13.540477 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540437 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z"] Apr 23 18:43:13.540825 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540813 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" Apr 23 18:43:13.540895 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540828 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" Apr 23 18:43:13.540895 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540842 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kube-rbac-proxy" Apr 23 18:43:13.540895 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540848 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kube-rbac-proxy" Apr 23 18:43:13.540895 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540864 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa6b8cee-9a30-43a6-8532-16762f460f8a" containerName="console" Apr 23 18:43:13.540895 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540870 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fa6b8cee-9a30-43a6-8532-16762f460f8a" containerName="console" Apr 23 18:43:13.540895 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540879 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="storage-initializer" Apr 23 18:43:13.540895 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540884 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="storage-initializer" Apr 23 18:43:13.541121 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540936 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa6b8cee-9a30-43a6-8532-16762f460f8a" containerName="console" Apr 23 18:43:13.541121 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540949 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kserve-container" Apr 23 18:43:13.541121 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.540957 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec770698-9d17-4526-b07d-f8a324c46007" containerName="kube-rbac-proxy" Apr 23 18:43:13.544223 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.544206 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.547840 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.547814 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 23 18:43:13.547960 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.547858 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:43:13.562119 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.562097 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z"] Apr 23 18:43:13.654018 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.653988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.654171 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.654029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqrb4\" (UniqueName: \"kubernetes.io/projected/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kube-api-access-jqrb4\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.654171 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.654109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.654171 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.654157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.754739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.754663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.754739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.754702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.754739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.754731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqrb4\" (UniqueName: \"kubernetes.io/projected/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kube-api-access-jqrb4\") pod 
\"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.754936 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.754767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.755113 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.755091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.755386 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.755365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.757412 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.757377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: 
\"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.762908 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.762882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqrb4\" (UniqueName: \"kubernetes.io/projected/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kube-api-access-jqrb4\") pod \"isvc-sklearn-runtime-predictor-67895ccb75-5v47z\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.814120 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.814090 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerID="9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7" exitCode=2 Apr 23 18:43:13.814264 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.814173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" event={"ID":"ed9390f7-ada6-4059-8643-57b6bff2d505","Type":"ContainerDied","Data":"9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7"} Apr 23 18:43:13.853974 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.853936 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:13.984196 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:13.984164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z"] Apr 23 18:43:13.986758 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:43:13.986729 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad2f8f2d_4998_49ea_90b1_e2b8d6be87a3.slice/crio-19c4c574a34ad586d88fc748fff26bbef344e424c327bf03230108b027b48dbe WatchSource:0}: Error finding container 19c4c574a34ad586d88fc748fff26bbef344e424c327bf03230108b027b48dbe: Status 404 returned error can't find the container with id 19c4c574a34ad586d88fc748fff26bbef344e424c327bf03230108b027b48dbe Apr 23 18:43:14.818899 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:14.818866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" event={"ID":"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3","Type":"ContainerStarted","Data":"9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217"} Apr 23 18:43:14.818899 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:14.818901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" event={"ID":"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3","Type":"ContainerStarted","Data":"19c4c574a34ad586d88fc748fff26bbef344e424c327bf03230108b027b48dbe"} Apr 23 18:43:15.633380 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:15.633332 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.48:8643/healthz\": dial tcp 10.132.0.48:8643: connect: connection 
refused" Apr 23 18:43:16.678668 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:16.678622 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.48:8080/v2/models/sklearn-v2-mlserver/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 23 18:43:19.838283 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:19.838246 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerID="9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217" exitCode=0 Apr 23 18:43:19.838756 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:19.838295 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" event={"ID":"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3","Type":"ContainerDied","Data":"9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217"} Apr 23 18:43:20.633124 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:20.633081 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.48:8643/healthz\": dial tcp 10.132.0.48:8643: connect: connection refused" Apr 23 18:43:20.851497 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:20.851443 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" event={"ID":"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3","Type":"ContainerStarted","Data":"626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac"} Apr 23 18:43:20.851497 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:20.851499 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" event={"ID":"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3","Type":"ContainerStarted","Data":"8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f"} Apr 23 18:43:20.851950 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:20.851820 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:20.851950 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:20.851853 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:20.853278 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:20.853251 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 18:43:20.869983 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:20.869936 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" podStartSLOduration=7.86992253 podStartE2EDuration="7.86992253s" podCreationTimestamp="2026-04-23 18:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:43:20.868197868 +0000 UTC m=+2690.035525189" watchObservedRunningTime="2026-04-23 18:43:20.86992253 +0000 UTC m=+2690.037249848" Apr 23 18:43:21.158597 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.158573 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" Apr 23 18:43:21.222245 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.222209 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4klv\" (UniqueName: \"kubernetes.io/projected/ed9390f7-ada6-4059-8643-57b6bff2d505-kube-api-access-p4klv\") pod \"ed9390f7-ada6-4059-8643-57b6bff2d505\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " Apr 23 18:43:21.222436 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.222289 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed9390f7-ada6-4059-8643-57b6bff2d505-proxy-tls\") pod \"ed9390f7-ada6-4059-8643-57b6bff2d505\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " Apr 23 18:43:21.222436 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.222313 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed9390f7-ada6-4059-8643-57b6bff2d505-kserve-provision-location\") pod \"ed9390f7-ada6-4059-8643-57b6bff2d505\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " Apr 23 18:43:21.222436 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.222332 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed9390f7-ada6-4059-8643-57b6bff2d505-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"ed9390f7-ada6-4059-8643-57b6bff2d505\" (UID: \"ed9390f7-ada6-4059-8643-57b6bff2d505\") " Apr 23 18:43:21.222653 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.222627 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed9390f7-ada6-4059-8643-57b6bff2d505-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"ed9390f7-ada6-4059-8643-57b6bff2d505" (UID: "ed9390f7-ada6-4059-8643-57b6bff2d505"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:43:21.222768 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.222738 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed9390f7-ada6-4059-8643-57b6bff2d505-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "ed9390f7-ada6-4059-8643-57b6bff2d505" (UID: "ed9390f7-ada6-4059-8643-57b6bff2d505"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:43:21.224602 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.224570 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed9390f7-ada6-4059-8643-57b6bff2d505-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ed9390f7-ada6-4059-8643-57b6bff2d505" (UID: "ed9390f7-ada6-4059-8643-57b6bff2d505"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:43:21.224707 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.224596 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed9390f7-ada6-4059-8643-57b6bff2d505-kube-api-access-p4klv" (OuterVolumeSpecName: "kube-api-access-p4klv") pod "ed9390f7-ada6-4059-8643-57b6bff2d505" (UID: "ed9390f7-ada6-4059-8643-57b6bff2d505"). InnerVolumeSpecName "kube-api-access-p4klv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:43:21.323991 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.323947 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4klv\" (UniqueName: \"kubernetes.io/projected/ed9390f7-ada6-4059-8643-57b6bff2d505-kube-api-access-p4klv\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:43:21.323991 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.323992 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed9390f7-ada6-4059-8643-57b6bff2d505-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:43:21.324191 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.324009 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed9390f7-ada6-4059-8643-57b6bff2d505-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:43:21.324191 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.324026 2576 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed9390f7-ada6-4059-8643-57b6bff2d505-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:43:21.856229 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.856191 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerID="8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2" exitCode=0 Apr 23 18:43:21.856745 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.856278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" 
event={"ID":"ed9390f7-ada6-4059-8643-57b6bff2d505","Type":"ContainerDied","Data":"8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2"} Apr 23 18:43:21.856745 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.856322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" event={"ID":"ed9390f7-ada6-4059-8643-57b6bff2d505","Type":"ContainerDied","Data":"1cc64862de60c729ba93b4efce499117c910d04e1cfd5486db2e75921dae9d14"} Apr 23 18:43:21.856745 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.856288 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7" Apr 23 18:43:21.856745 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.856343 2576 scope.go:117] "RemoveContainer" containerID="9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7" Apr 23 18:43:21.857062 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.857032 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 18:43:21.865561 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.865545 2576 scope.go:117] "RemoveContainer" containerID="8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2" Apr 23 18:43:21.874020 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.873982 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"] Apr 23 18:43:21.878071 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.878035 2576 scope.go:117] "RemoveContainer" containerID="4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64" Apr 23 18:43:21.878673 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:43:21.878648 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-jfgq7"] Apr 23 18:43:21.888189 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.888167 2576 scope.go:117] "RemoveContainer" containerID="9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7" Apr 23 18:43:21.888492 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:43:21.888451 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7\": container with ID starting with 9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7 not found: ID does not exist" containerID="9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7" Apr 23 18:43:21.888617 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.888501 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7"} err="failed to get container status \"9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7\": rpc error: code = NotFound desc = could not find container \"9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7\": container with ID starting with 9112ff56b5e7055f6992de1a3f44030f4ced662e67c76636520a20d3243813a7 not found: ID does not exist" Apr 23 18:43:21.888617 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.888523 2576 scope.go:117] "RemoveContainer" containerID="8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2" Apr 23 18:43:21.888743 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:43:21.888731 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2\": container with ID starting with 
8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2 not found: ID does not exist" containerID="8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2" Apr 23 18:43:21.888797 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.888749 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2"} err="failed to get container status \"8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2\": rpc error: code = NotFound desc = could not find container \"8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2\": container with ID starting with 8e2c6ec9383c0157b9e3bf3876c3df7397006d3df14b507575cfb386cf81cad2 not found: ID does not exist" Apr 23 18:43:21.888797 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.888764 2576 scope.go:117] "RemoveContainer" containerID="4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64" Apr 23 18:43:21.888977 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:43:21.888960 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64\": container with ID starting with 4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64 not found: ID does not exist" containerID="4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64" Apr 23 18:43:21.889050 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:21.888978 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64"} err="failed to get container status \"4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64\": rpc error: code = NotFound desc = could not find container \"4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64\": container with ID starting with 
4f815868a6f8e032c186ac965290346c6fadf1c7626317589940123f868c6b64 not found: ID does not exist" Apr 23 18:43:23.415615 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:23.415570 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" path="/var/lib/kubelet/pods/ed9390f7-ada6-4059-8643-57b6bff2d505/volumes" Apr 23 18:43:26.861891 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:26.861855 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:26.862492 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:26.862439 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 23 18:43:31.515040 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:31.515006 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:43:31.523354 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:31.523328 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:43:36.863600 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:36.863571 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:50.477793 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.477763 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-67895ccb75-5v47z_ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3/kserve-container/0.log" Apr 23 18:43:50.614345 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.614311 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z"] Apr 23 18:43:50.614669 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.614634 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kserve-container" containerID="cri-o://8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f" gracePeriod=30 Apr 23 18:43:50.614818 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.614675 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kube-rbac-proxy" containerID="cri-o://626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac" gracePeriod=30 Apr 23 18:43:50.703053 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.703018 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd"] Apr 23 18:43:50.703416 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.703399 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="storage-initializer" Apr 23 18:43:50.703493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.703419 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="storage-initializer" Apr 23 18:43:50.703493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.703434 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kserve-container" Apr 23 18:43:50.703493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.703440 
2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kserve-container" Apr 23 18:43:50.703493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.703446 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kube-rbac-proxy" Apr 23 18:43:50.703493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.703452 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kube-rbac-proxy" Apr 23 18:43:50.703654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.703546 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kube-rbac-proxy" Apr 23 18:43:50.703654 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.703556 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed9390f7-ada6-4059-8643-57b6bff2d505" containerName="kserve-container" Apr 23 18:43:50.708170 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.708149 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.710380 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.710356 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:43:50.710522 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.710362 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\"" Apr 23 18:43:50.715160 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.715132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd"] Apr 23 18:43:50.772929 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.772896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.773085 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.772942 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwqwp\" (UniqueName: \"kubernetes.io/projected/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kube-api-access-wwqwp\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.773085 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.772981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1612d0f-dabf-4e1e-bd70-42f2d287e956-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.773085 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.773037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1612d0f-dabf-4e1e-bd70-42f2d287e956-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.873780 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.873732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwqwp\" (UniqueName: \"kubernetes.io/projected/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kube-api-access-wwqwp\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.873986 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.873800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1612d0f-dabf-4e1e-bd70-42f2d287e956-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.873986 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.873854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1612d0f-dabf-4e1e-bd70-42f2d287e956-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.873986 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.873901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.874149 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:43:50.874053 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-serving-cert: secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 23 18:43:50.874149 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:43:50.874129 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1612d0f-dabf-4e1e-bd70-42f2d287e956-proxy-tls podName:f1612d0f-dabf-4e1e-bd70-42f2d287e956 nodeName:}" failed. No retries permitted until 2026-04-23 18:43:51.374108016 +0000 UTC m=+2720.541435316 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f1612d0f-dabf-4e1e-bd70-42f2d287e956-proxy-tls") pod "isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" (UID: "f1612d0f-dabf-4e1e-bd70-42f2d287e956") : secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 23 18:43:50.874350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.874329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.874645 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.874624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1612d0f-dabf-4e1e-bd70-42f2d287e956-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.884442 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.884418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwqwp\" (UniqueName: \"kubernetes.io/projected/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kube-api-access-wwqwp\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:50.958892 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.958854 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" 
containerID="626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac" exitCode=2 Apr 23 18:43:50.959051 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:50.958929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" event={"ID":"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3","Type":"ContainerDied","Data":"626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac"} Apr 23 18:43:51.379802 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.379525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1612d0f-dabf-4e1e-bd70-42f2d287e956-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:51.382606 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.382581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1612d0f-dabf-4e1e-bd70-42f2d287e956-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:51.456808 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.456783 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:51.581344 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.581311 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-proxy-tls\") pod \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " Apr 23 18:43:51.581808 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.581356 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " Apr 23 18:43:51.581808 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.581388 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqrb4\" (UniqueName: \"kubernetes.io/projected/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kube-api-access-jqrb4\") pod \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " Apr 23 18:43:51.581808 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.581503 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kserve-provision-location\") pod \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\" (UID: \"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3\") " Apr 23 18:43:51.581988 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.581831 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" (UID: "ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:43:51.583700 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.583677 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" (UID: "ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:43:51.583814 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.583680 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kube-api-access-jqrb4" (OuterVolumeSpecName: "kube-api-access-jqrb4") pod "ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" (UID: "ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3"). InnerVolumeSpecName "kube-api-access-jqrb4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:43:51.605584 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.605547 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" (UID: "ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:43:51.619988 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.619963 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:51.683253 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.683145 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:43:51.683253 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.683190 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:43:51.683253 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.683201 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:43:51.683253 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.683210 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jqrb4\" (UniqueName: \"kubernetes.io/projected/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3-kube-api-access-jqrb4\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:43:51.744215 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.744193 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd"] Apr 23 18:43:51.745988 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:43:51.745959 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1612d0f_dabf_4e1e_bd70_42f2d287e956.slice/crio-dbe278bd4faa539196bfd2691aba05c1c51a12825b709f63af77493f9c31d5b7 WatchSource:0}: 
Error finding container dbe278bd4faa539196bfd2691aba05c1c51a12825b709f63af77493f9c31d5b7: Status 404 returned error can't find the container with id dbe278bd4faa539196bfd2691aba05c1c51a12825b709f63af77493f9c31d5b7 Apr 23 18:43:51.963682 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.963590 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerID="8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f" exitCode=0 Apr 23 18:43:51.963682 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.963674 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" Apr 23 18:43:51.963907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.963675 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" event={"ID":"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3","Type":"ContainerDied","Data":"8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f"} Apr 23 18:43:51.963907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.963720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z" event={"ID":"ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3","Type":"ContainerDied","Data":"19c4c574a34ad586d88fc748fff26bbef344e424c327bf03230108b027b48dbe"} Apr 23 18:43:51.963907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.963734 2576 scope.go:117] "RemoveContainer" containerID="626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac" Apr 23 18:43:51.965244 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.965219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" 
event={"ID":"f1612d0f-dabf-4e1e-bd70-42f2d287e956","Type":"ContainerStarted","Data":"fa41ea0880d71dae77891b26a6bf48fe6ac0c2a91411fb94aedf9e3aaf6bcce2"} Apr 23 18:43:51.965361 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.965253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" event={"ID":"f1612d0f-dabf-4e1e-bd70-42f2d287e956","Type":"ContainerStarted","Data":"dbe278bd4faa539196bfd2691aba05c1c51a12825b709f63af77493f9c31d5b7"} Apr 23 18:43:51.973316 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.973268 2576 scope.go:117] "RemoveContainer" containerID="8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f" Apr 23 18:43:51.981157 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.981136 2576 scope.go:117] "RemoveContainer" containerID="9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217" Apr 23 18:43:51.990805 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.990788 2576 scope.go:117] "RemoveContainer" containerID="626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac" Apr 23 18:43:51.991088 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:43:51.991067 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac\": container with ID starting with 626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac not found: ID does not exist" containerID="626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac" Apr 23 18:43:51.991159 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.991099 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac"} err="failed to get container status \"626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac\": rpc error: code = NotFound desc = could not find 
container \"626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac\": container with ID starting with 626e64737f585c18bcef8b8dd552d3bd29a38f7afe96140a9f9c572c2867d7ac not found: ID does not exist" Apr 23 18:43:51.991159 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.991121 2576 scope.go:117] "RemoveContainer" containerID="8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f" Apr 23 18:43:51.991425 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:43:51.991400 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f\": container with ID starting with 8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f not found: ID does not exist" containerID="8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f" Apr 23 18:43:51.991692 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.991434 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f"} err="failed to get container status \"8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f\": rpc error: code = NotFound desc = could not find container \"8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f\": container with ID starting with 8a2d18bb039ae9d23b97cc62fd8c8f3fb25dadd71af6db840713745b137ca42f not found: ID does not exist" Apr 23 18:43:51.991692 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.991455 2576 scope.go:117] "RemoveContainer" containerID="9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217" Apr 23 18:43:51.991822 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:43:51.991746 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217\": container with ID 
starting with 9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217 not found: ID does not exist" containerID="9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217" Apr 23 18:43:51.991822 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.991776 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217"} err="failed to get container status \"9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217\": rpc error: code = NotFound desc = could not find container \"9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217\": container with ID starting with 9b68cdb4ff971a6f5b554ebe71ad82a521b1c4178e02f94b2fb27390156d4217 not found: ID does not exist" Apr 23 18:43:51.998874 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:51.998853 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z"] Apr 23 18:43:52.002055 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:52.002037 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-67895ccb75-5v47z"] Apr 23 18:43:53.414919 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:53.414879 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" path="/var/lib/kubelet/pods/ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3/volumes" Apr 23 18:43:55.982215 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:55.982181 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerID="fa41ea0880d71dae77891b26a6bf48fe6ac0c2a91411fb94aedf9e3aaf6bcce2" exitCode=0 Apr 23 18:43:55.982633 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:55.982257 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" 
event={"ID":"f1612d0f-dabf-4e1e-bd70-42f2d287e956","Type":"ContainerDied","Data":"fa41ea0880d71dae77891b26a6bf48fe6ac0c2a91411fb94aedf9e3aaf6bcce2"} Apr 23 18:43:56.988000 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:56.987962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" event={"ID":"f1612d0f-dabf-4e1e-bd70-42f2d287e956","Type":"ContainerStarted","Data":"f510d2719a936ae81c0a802d517b0b3be1733bc33957aee81c9cbfcaeb14b461"} Apr 23 18:43:56.988000 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:56.988001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" event={"ID":"f1612d0f-dabf-4e1e-bd70-42f2d287e956","Type":"ContainerStarted","Data":"cb3e182315e879287ee27d9b0a205722c43ac9ca20731243a1b7dedb8bed4888"} Apr 23 18:43:56.988411 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:56.988200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:43:57.018900 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:57.018848 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" podStartSLOduration=7.018831871 podStartE2EDuration="7.018831871s" podCreationTimestamp="2026-04-23 18:43:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:43:57.018122309 +0000 UTC m=+2726.185449629" watchObservedRunningTime="2026-04-23 18:43:57.018831871 +0000 UTC m=+2726.186159246" Apr 23 18:43:57.992383 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:43:57.992352 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:44:04.001179 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:04.001147 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:44:34.074422 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:34.074367 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 18:44:44.004164 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:44.004082 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" Apr 23 18:44:50.799400 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.799363 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd"] Apr 23 18:44:50.799905 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.799760 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kserve-container" containerID="cri-o://cb3e182315e879287ee27d9b0a205722c43ac9ca20731243a1b7dedb8bed4888" gracePeriod=30 Apr 23 18:44:50.799905 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.799829 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kube-rbac-proxy" containerID="cri-o://f510d2719a936ae81c0a802d517b0b3be1733bc33957aee81c9cbfcaeb14b461" gracePeriod=30 Apr 23 18:44:50.873515 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.873480 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"] Apr 23 18:44:50.873877 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.873864 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kserve-container" Apr 23 18:44:50.873924 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.873879 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kserve-container" Apr 23 18:44:50.873924 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.873894 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="storage-initializer" Apr 23 18:44:50.873924 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.873900 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="storage-initializer" Apr 23 18:44:50.873924 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.873919 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kube-rbac-proxy" Apr 23 18:44:50.873924 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.873924 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kube-rbac-proxy" Apr 23 18:44:50.874080 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.873974 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kube-rbac-proxy" Apr 23 18:44:50.874080 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.873984 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad2f8f2d-4998-49ea-90b1-e2b8d6be87a3" containerName="kserve-container" Apr 23 18:44:50.878278 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.878261 2576 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:50.880686 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.880658 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 23 18:44:50.880811 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.880687 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 23 18:44:50.889042 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.889019 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"] Apr 23 18:44:50.983980 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.983937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv4cn\" (UniqueName: \"kubernetes.io/projected/31e95bf7-a82a-4935-84ed-9791d464d79e-kube-api-access-tv4cn\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:50.984149 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.983991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31e95bf7-a82a-4935-84ed-9791d464d79e-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:50.984149 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.984114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/31e95bf7-a82a-4935-84ed-9791d464d79e-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:50.984225 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:50.984166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31e95bf7-a82a-4935-84ed-9791d464d79e-proxy-tls\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:51.085389 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.085355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv4cn\" (UniqueName: \"kubernetes.io/projected/31e95bf7-a82a-4935-84ed-9791d464d79e-kube-api-access-tv4cn\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:51.085389 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.085392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31e95bf7-a82a-4935-84ed-9791d464d79e-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:51.085655 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.085438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31e95bf7-a82a-4935-84ed-9791d464d79e-kserve-provision-location\") pod 
\"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:51.085655 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.085492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31e95bf7-a82a-4935-84ed-9791d464d79e-proxy-tls\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:51.086007 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.085983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31e95bf7-a82a-4935-84ed-9791d464d79e-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:51.086092 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.086073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31e95bf7-a82a-4935-84ed-9791d464d79e-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:51.088194 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.088170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31e95bf7-a82a-4935-84ed-9791d464d79e-proxy-tls\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:51.093758 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.093738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv4cn\" (UniqueName: \"kubernetes.io/projected/31e95bf7-a82a-4935-84ed-9791d464d79e-kube-api-access-tv4cn\") pod \"isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" Apr 23 18:44:51.174784 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.174752 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerID="f510d2719a936ae81c0a802d517b0b3be1733bc33957aee81c9cbfcaeb14b461" exitCode=2 Apr 23 18:44:51.174943 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.174839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" event={"ID":"f1612d0f-dabf-4e1e-bd70-42f2d287e956","Type":"ContainerDied","Data":"f510d2719a936ae81c0a802d517b0b3be1733bc33957aee81c9cbfcaeb14b461"} Apr 23 18:44:51.190125 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.190080 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"
Apr 23 18:44:51.310809 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:51.310786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"]
Apr 23 18:44:51.313051 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:44:51.313018 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31e95bf7_a82a_4935_84ed_9791d464d79e.slice/crio-aea24687e606dcb7adfcd3d272dc1dbf56d560334d061d318c8efead87204dc0 WatchSource:0}: Error finding container aea24687e606dcb7adfcd3d272dc1dbf56d560334d061d318c8efead87204dc0: Status 404 returned error can't find the container with id aea24687e606dcb7adfcd3d272dc1dbf56d560334d061d318c8efead87204dc0
Apr 23 18:44:52.179915 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:52.179877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" event={"ID":"31e95bf7-a82a-4935-84ed-9791d464d79e","Type":"ContainerStarted","Data":"5741379e77226e9d2a110a11739a7ff43ab64a39316a030bab6abbd162bbff77"}
Apr 23 18:44:52.179915 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:52.179920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" event={"ID":"31e95bf7-a82a-4935-84ed-9791d464d79e","Type":"ContainerStarted","Data":"aea24687e606dcb7adfcd3d272dc1dbf56d560334d061d318c8efead87204dc0"}
Apr 23 18:44:53.996507 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:53.996446 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.50:8643/healthz\": dial tcp 10.132.0.50:8643: connect: connection refused"
Apr 23 18:44:55.190252 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:55.190207 2576 generic.go:358] "Generic (PLEG): container finished" podID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerID="5741379e77226e9d2a110a11739a7ff43ab64a39316a030bab6abbd162bbff77" exitCode=0
Apr 23 18:44:55.190644 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:55.190285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" event={"ID":"31e95bf7-a82a-4935-84ed-9791d464d79e","Type":"ContainerDied","Data":"5741379e77226e9d2a110a11739a7ff43ab64a39316a030bab6abbd162bbff77"}
Apr 23 18:44:56.202355 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:56.202314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" event={"ID":"31e95bf7-a82a-4935-84ed-9791d464d79e","Type":"ContainerStarted","Data":"1132c18a66d5c0283cc91f23e7b1e9277acef7d5b8132cc951e6fceaf6ee80ac"}
Apr 23 18:44:56.202355 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:56.202360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" event={"ID":"31e95bf7-a82a-4935-84ed-9791d464d79e","Type":"ContainerStarted","Data":"721eaddc446ff5edb2c99b9df647a92e27ba6280ca050a8c58e22e2d83103c3d"}
Apr 23 18:44:56.202836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:56.202698 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"
Apr 23 18:44:56.202836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:56.202728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"
Apr 23 18:44:56.204240 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:56.204210 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 23 18:44:56.220446 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:56.220409 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podStartSLOduration=6.220398471 podStartE2EDuration="6.220398471s" podCreationTimestamp="2026-04-23 18:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:44:56.220287753 +0000 UTC m=+2785.387615073" watchObservedRunningTime="2026-04-23 18:44:56.220398471 +0000 UTC m=+2785.387725782"
Apr 23 18:44:57.205617 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:57.205576 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 23 18:44:58.211924 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.211891 2576 generic.go:358] "Generic (PLEG): container finished" podID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerID="cb3e182315e879287ee27d9b0a205722c43ac9ca20731243a1b7dedb8bed4888" exitCode=0
Apr 23 18:44:58.212298 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.211963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" event={"ID":"f1612d0f-dabf-4e1e-bd70-42f2d287e956","Type":"ContainerDied","Data":"cb3e182315e879287ee27d9b0a205722c43ac9ca20731243a1b7dedb8bed4888"}
Apr 23 18:44:58.240802 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.240780 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd"
Apr 23 18:44:58.342325 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.342230 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1612d0f-dabf-4e1e-bd70-42f2d287e956-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") "
Apr 23 18:44:58.342533 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.342334 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1612d0f-dabf-4e1e-bd70-42f2d287e956-proxy-tls\") pod \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") "
Apr 23 18:44:58.342533 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.342374 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kserve-provision-location\") pod \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") "
Apr 23 18:44:58.342533 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.342404 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwqwp\" (UniqueName: \"kubernetes.io/projected/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kube-api-access-wwqwp\") pod \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\" (UID: \"f1612d0f-dabf-4e1e-bd70-42f2d287e956\") "
Apr 23 18:44:58.342686 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.342630 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1612d0f-dabf-4e1e-bd70-42f2d287e956-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "f1612d0f-dabf-4e1e-bd70-42f2d287e956" (UID: "f1612d0f-dabf-4e1e-bd70-42f2d287e956"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:44:58.342748 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.342722 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f1612d0f-dabf-4e1e-bd70-42f2d287e956" (UID: "f1612d0f-dabf-4e1e-bd70-42f2d287e956"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:44:58.344653 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.344630 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1612d0f-dabf-4e1e-bd70-42f2d287e956-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f1612d0f-dabf-4e1e-bd70-42f2d287e956" (UID: "f1612d0f-dabf-4e1e-bd70-42f2d287e956"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:44:58.344728 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.344682 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kube-api-access-wwqwp" (OuterVolumeSpecName: "kube-api-access-wwqwp") pod "f1612d0f-dabf-4e1e-bd70-42f2d287e956" (UID: "f1612d0f-dabf-4e1e-bd70-42f2d287e956"). InnerVolumeSpecName "kube-api-access-wwqwp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:44:58.443780 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.443743 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1612d0f-dabf-4e1e-bd70-42f2d287e956-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:44:58.443780 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.443779 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:44:58.443990 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.443795 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwqwp\" (UniqueName: \"kubernetes.io/projected/f1612d0f-dabf-4e1e-bd70-42f2d287e956-kube-api-access-wwqwp\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:44:58.443990 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:58.443807 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1612d0f-dabf-4e1e-bd70-42f2d287e956-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:44:59.218875 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:59.218838 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd" event={"ID":"f1612d0f-dabf-4e1e-bd70-42f2d287e956","Type":"ContainerDied","Data":"dbe278bd4faa539196bfd2691aba05c1c51a12825b709f63af77493f9c31d5b7"}
Apr 23 18:44:59.218875 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:59.218884 2576 scope.go:117] "RemoveContainer" containerID="f510d2719a936ae81c0a802d517b0b3be1733bc33957aee81c9cbfcaeb14b461"
Apr 23 18:44:59.219551 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:59.218964 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd"
Apr 23 18:44:59.227585 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:59.227567 2576 scope.go:117] "RemoveContainer" containerID="cb3e182315e879287ee27d9b0a205722c43ac9ca20731243a1b7dedb8bed4888"
Apr 23 18:44:59.235399 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:59.235375 2576 scope.go:117] "RemoveContainer" containerID="fa41ea0880d71dae77891b26a6bf48fe6ac0c2a91411fb94aedf9e3aaf6bcce2"
Apr 23 18:44:59.240439 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:59.240416 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd"]
Apr 23 18:44:59.244014 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:59.243994 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-cbtfd"]
Apr 23 18:44:59.416444 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:44:59.416406 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" path="/var/lib/kubelet/pods/f1612d0f-dabf-4e1e-bd70-42f2d287e956/volumes"
Apr 23 18:45:02.209749 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:45:02.209721 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"
Apr 23 18:45:02.210237 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:45:02.210210 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 23 18:45:12.210175 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:45:12.210134 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 23 18:45:22.210838 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:45:22.210797 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 23 18:45:32.210524 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:45:32.210482 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 23 18:45:42.210221 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:45:42.210179 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 23 18:45:52.210656 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:45:52.210617 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 23 18:46:02.210807 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:02.210778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"
Apr 23 18:46:11.054362 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.054287 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"]
Apr 23 18:46:11.054770 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.054612 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" containerID="cri-o://721eaddc446ff5edb2c99b9df647a92e27ba6280ca050a8c58e22e2d83103c3d" gracePeriod=30
Apr 23 18:46:11.054770 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.054657 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kube-rbac-proxy" containerID="cri-o://1132c18a66d5c0283cc91f23e7b1e9277acef7d5b8132cc951e6fceaf6ee80ac" gracePeriod=30
Apr 23 18:46:11.177887 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.177851 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"]
Apr 23 18:46:11.178347 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.178328 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="storage-initializer"
Apr 23 18:46:11.178347 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.178349 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="storage-initializer"
Apr 23 18:46:11.178541 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.178368 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kserve-container"
Apr 23 18:46:11.178541 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.178377 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kserve-container"
Apr 23 18:46:11.178541 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.178404 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kube-rbac-proxy"
Apr 23 18:46:11.178541 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.178414 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kube-rbac-proxy"
Apr 23 18:46:11.178541 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.178524 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kserve-container"
Apr 23 18:46:11.178541 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.178543 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1612d0f-dabf-4e1e-bd70-42f2d287e956" containerName="kube-rbac-proxy"
Apr 23 18:46:11.181707 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.181688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.183632 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.183608 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\""
Apr 23 18:46:11.183632 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.183626 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\""
Apr 23 18:46:11.192174 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.192153 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"]
Apr 23 18:46:11.354980 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.354943 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f0a7baf-f73f-43cb-9e60-197518d065da-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.355164 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.355008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f0a7baf-f73f-43cb-9e60-197518d065da-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.355164 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.355039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0a7baf-f73f-43cb-9e60-197518d065da-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.355164 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.355089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n5vs\" (UniqueName: \"kubernetes.io/projected/8f0a7baf-f73f-43cb-9e60-197518d065da-kube-api-access-4n5vs\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.455534 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.455501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f0a7baf-f73f-43cb-9e60-197518d065da-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.455710 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.455558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f0a7baf-f73f-43cb-9e60-197518d065da-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.455710 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.455581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0a7baf-f73f-43cb-9e60-197518d065da-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.455710 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.455620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n5vs\" (UniqueName: \"kubernetes.io/projected/8f0a7baf-f73f-43cb-9e60-197518d065da-kube-api-access-4n5vs\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.455974 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.455950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f0a7baf-f73f-43cb-9e60-197518d065da-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.456233 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.456207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f0a7baf-f73f-43cb-9e60-197518d065da-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.458180 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.458162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0a7baf-f73f-43cb-9e60-197518d065da-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.464874 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.464846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n5vs\" (UniqueName: \"kubernetes.io/projected/8f0a7baf-f73f-43cb-9e60-197518d065da-kube-api-access-4n5vs\") pod \"isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.467281 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.467259 2576 generic.go:358] "Generic (PLEG): container finished" podID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerID="1132c18a66d5c0283cc91f23e7b1e9277acef7d5b8132cc951e6fceaf6ee80ac" exitCode=2
Apr 23 18:46:11.467379 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.467328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" event={"ID":"31e95bf7-a82a-4935-84ed-9791d464d79e","Type":"ContainerDied","Data":"1132c18a66d5c0283cc91f23e7b1e9277acef7d5b8132cc951e6fceaf6ee80ac"}
Apr 23 18:46:11.493086 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.493038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:11.623078 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.623053 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"]
Apr 23 18:46:11.624911 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:46:11.624887 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f0a7baf_f73f_43cb_9e60_197518d065da.slice/crio-706ce859bad40e6b8a587f79ad17cc5e8f01591bdaaaa51954caee8e725c8959 WatchSource:0}: Error finding container 706ce859bad40e6b8a587f79ad17cc5e8f01591bdaaaa51954caee8e725c8959: Status 404 returned error can't find the container with id 706ce859bad40e6b8a587f79ad17cc5e8f01591bdaaaa51954caee8e725c8959
Apr 23 18:46:11.626839 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:11.626821 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:46:12.206221 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:12.206171 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.51:8643/healthz\": dial tcp 10.132.0.51:8643: connect: connection refused"
Apr 23 18:46:12.210803 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:12.210773 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 23 18:46:12.471442 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:12.471355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" event={"ID":"8f0a7baf-f73f-43cb-9e60-197518d065da","Type":"ContainerStarted","Data":"e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01"}
Apr 23 18:46:12.471442 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:12.471391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" event={"ID":"8f0a7baf-f73f-43cb-9e60-197518d065da","Type":"ContainerStarted","Data":"706ce859bad40e6b8a587f79ad17cc5e8f01591bdaaaa51954caee8e725c8959"}
Apr 23 18:46:15.484450 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.484414 2576 generic.go:358] "Generic (PLEG): container finished" podID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerID="721eaddc446ff5edb2c99b9df647a92e27ba6280ca050a8c58e22e2d83103c3d" exitCode=0
Apr 23 18:46:15.484974 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.484488 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" event={"ID":"31e95bf7-a82a-4935-84ed-9791d464d79e","Type":"ContainerDied","Data":"721eaddc446ff5edb2c99b9df647a92e27ba6280ca050a8c58e22e2d83103c3d"}
Apr 23 18:46:15.549045 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.548970 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"
Apr 23 18:46:15.690660 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.690626 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv4cn\" (UniqueName: \"kubernetes.io/projected/31e95bf7-a82a-4935-84ed-9791d464d79e-kube-api-access-tv4cn\") pod \"31e95bf7-a82a-4935-84ed-9791d464d79e\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") "
Apr 23 18:46:15.690660 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.690660 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31e95bf7-a82a-4935-84ed-9791d464d79e-proxy-tls\") pod \"31e95bf7-a82a-4935-84ed-9791d464d79e\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") "
Apr 23 18:46:15.690870 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.690688 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31e95bf7-a82a-4935-84ed-9791d464d79e-kserve-provision-location\") pod \"31e95bf7-a82a-4935-84ed-9791d464d79e\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") "
Apr 23 18:46:15.690870 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.690758 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31e95bf7-a82a-4935-84ed-9791d464d79e-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"31e95bf7-a82a-4935-84ed-9791d464d79e\" (UID: \"31e95bf7-a82a-4935-84ed-9791d464d79e\") "
Apr 23 18:46:15.691118 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.691085 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e95bf7-a82a-4935-84ed-9791d464d79e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "31e95bf7-a82a-4935-84ed-9791d464d79e" (UID: "31e95bf7-a82a-4935-84ed-9791d464d79e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:46:15.691234 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.691131 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e95bf7-a82a-4935-84ed-9791d464d79e-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "31e95bf7-a82a-4935-84ed-9791d464d79e" (UID: "31e95bf7-a82a-4935-84ed-9791d464d79e"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:46:15.692907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.692877 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e95bf7-a82a-4935-84ed-9791d464d79e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31e95bf7-a82a-4935-84ed-9791d464d79e" (UID: "31e95bf7-a82a-4935-84ed-9791d464d79e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:46:15.693020 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.692914 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e95bf7-a82a-4935-84ed-9791d464d79e-kube-api-access-tv4cn" (OuterVolumeSpecName: "kube-api-access-tv4cn") pod "31e95bf7-a82a-4935-84ed-9791d464d79e" (UID: "31e95bf7-a82a-4935-84ed-9791d464d79e"). InnerVolumeSpecName "kube-api-access-tv4cn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:46:15.791706 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.791667 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tv4cn\" (UniqueName: \"kubernetes.io/projected/31e95bf7-a82a-4935-84ed-9791d464d79e-kube-api-access-tv4cn\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:46:15.791706 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.791701 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31e95bf7-a82a-4935-84ed-9791d464d79e-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:46:15.791706 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.791712 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31e95bf7-a82a-4935-84ed-9791d464d79e-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:46:15.791944 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:15.791722 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31e95bf7-a82a-4935-84ed-9791d464d79e-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:46:16.489937 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:16.489898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw" event={"ID":"31e95bf7-a82a-4935-84ed-9791d464d79e","Type":"ContainerDied","Data":"aea24687e606dcb7adfcd3d272dc1dbf56d560334d061d318c8efead87204dc0"}
Apr 23 18:46:16.490411 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:16.489952 2576 scope.go:117] "RemoveContainer" containerID="1132c18a66d5c0283cc91f23e7b1e9277acef7d5b8132cc951e6fceaf6ee80ac"
Apr 23 18:46:16.490411 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:16.489950 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"
Apr 23 18:46:16.491572 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:16.491544 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerID="e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01" exitCode=0
Apr 23 18:46:16.491688 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:16.491616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" event={"ID":"8f0a7baf-f73f-43cb-9e60-197518d065da","Type":"ContainerDied","Data":"e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01"}
Apr 23 18:46:16.500038 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:16.500018 2576 scope.go:117] "RemoveContainer" containerID="721eaddc446ff5edb2c99b9df647a92e27ba6280ca050a8c58e22e2d83103c3d"
Apr 23 18:46:16.507657 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:16.507633 2576 scope.go:117] "RemoveContainer" containerID="5741379e77226e9d2a110a11739a7ff43ab64a39316a030bab6abbd162bbff77"
Apr 23 18:46:16.522867 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:16.522839 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"]
Apr 23 18:46:16.526798 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:16.526776 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7f7fb98cb5-2xrvw"]
Apr 23 18:46:17.415809 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:17.415776 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" path="/var/lib/kubelet/pods/31e95bf7-a82a-4935-84ed-9791d464d79e/volumes"
Apr 23 18:46:17.497821 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:17.497789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" event={"ID":"8f0a7baf-f73f-43cb-9e60-197518d065da","Type":"ContainerStarted","Data":"cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100"}
Apr 23 18:46:17.497821 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:17.497827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" event={"ID":"8f0a7baf-f73f-43cb-9e60-197518d065da","Type":"ContainerStarted","Data":"984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b"}
Apr 23 18:46:17.498216 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:17.498159 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:17.498293 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:17.498276 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"
Apr 23 18:46:17.499580 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:17.499554 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused"
Apr 23 18:46:17.517626 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:17.517581 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podStartSLOduration=6.517567575 podStartE2EDuration="6.517567575s" podCreationTimestamp="2026-04-23 18:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:46:17.515790675 +0000 UTC m=+2866.683117996"
watchObservedRunningTime="2026-04-23 18:46:17.517567575 +0000 UTC m=+2866.684894893" Apr 23 18:46:18.501684 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:18.501637 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 23 18:46:23.506720 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:23.506690 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" Apr 23 18:46:23.507271 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:23.507247 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 23 18:46:33.507222 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:33.507182 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 23 18:46:43.507441 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:43.507403 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 23 18:46:53.507880 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:46:53.507842 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 23 18:47:03.508053 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:03.508014 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 23 18:47:13.508034 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:13.507993 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 23 18:47:23.508631 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:23.508600 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" Apr 23 18:47:31.415856 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:31.415817 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"] Apr 23 18:47:31.416225 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:31.416133 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" containerID="cri-o://984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b" gracePeriod=30 Apr 23 18:47:31.416225 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:31.416154 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kube-rbac-proxy" containerID="cri-o://cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100" gracePeriod=30 Apr 23 18:47:31.748586 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:31.748501 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerID="cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100" exitCode=2 Apr 23 18:47:31.748586 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:31.748571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" event={"ID":"8f0a7baf-f73f-43cb-9e60-197518d065da","Type":"ContainerDied","Data":"cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100"} Apr 23 18:47:33.502045 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:33.502001 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 23 18:47:33.508030 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:33.507995 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 23 18:47:38.502012 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:38.501970 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 23 18:47:39.357644 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.357621 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" Apr 23 18:47:39.492594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.492492 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f0a7baf-f73f-43cb-9e60-197518d065da-kserve-provision-location\") pod \"8f0a7baf-f73f-43cb-9e60-197518d065da\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " Apr 23 18:47:39.492594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.492544 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n5vs\" (UniqueName: \"kubernetes.io/projected/8f0a7baf-f73f-43cb-9e60-197518d065da-kube-api-access-4n5vs\") pod \"8f0a7baf-f73f-43cb-9e60-197518d065da\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " Apr 23 18:47:39.492594 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.492578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0a7baf-f73f-43cb-9e60-197518d065da-proxy-tls\") pod \"8f0a7baf-f73f-43cb-9e60-197518d065da\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " Apr 23 18:47:39.492894 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.492631 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f0a7baf-f73f-43cb-9e60-197518d065da-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"8f0a7baf-f73f-43cb-9e60-197518d065da\" (UID: \"8f0a7baf-f73f-43cb-9e60-197518d065da\") " Apr 23 18:47:39.492894 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:47:39.492812 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0a7baf-f73f-43cb-9e60-197518d065da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f0a7baf-f73f-43cb-9e60-197518d065da" (UID: "8f0a7baf-f73f-43cb-9e60-197518d065da"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:47:39.493048 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.493026 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f0a7baf-f73f-43cb-9e60-197518d065da-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "8f0a7baf-f73f-43cb-9e60-197518d065da" (UID: "8f0a7baf-f73f-43cb-9e60-197518d065da"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:47:39.494922 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.494899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0a7baf-f73f-43cb-9e60-197518d065da-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8f0a7baf-f73f-43cb-9e60-197518d065da" (UID: "8f0a7baf-f73f-43cb-9e60-197518d065da"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:47:39.495006 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.494899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0a7baf-f73f-43cb-9e60-197518d065da-kube-api-access-4n5vs" (OuterVolumeSpecName: "kube-api-access-4n5vs") pod "8f0a7baf-f73f-43cb-9e60-197518d065da" (UID: "8f0a7baf-f73f-43cb-9e60-197518d065da"). InnerVolumeSpecName "kube-api-access-4n5vs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:47:39.593626 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.593587 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f0a7baf-f73f-43cb-9e60-197518d065da-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:47:39.593626 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.593618 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f0a7baf-f73f-43cb-9e60-197518d065da-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:47:39.593626 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.593629 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4n5vs\" (UniqueName: \"kubernetes.io/projected/8f0a7baf-f73f-43cb-9e60-197518d065da-kube-api-access-4n5vs\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:47:39.593626 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.593639 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0a7baf-f73f-43cb-9e60-197518d065da-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:47:39.775843 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.775714 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerID="984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b" exitCode=0 Apr 23 18:47:39.775843 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.775777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" 
event={"ID":"8f0a7baf-f73f-43cb-9e60-197518d065da","Type":"ContainerDied","Data":"984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b"} Apr 23 18:47:39.775843 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.775789 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" Apr 23 18:47:39.775843 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.775811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks" event={"ID":"8f0a7baf-f73f-43cb-9e60-197518d065da","Type":"ContainerDied","Data":"706ce859bad40e6b8a587f79ad17cc5e8f01591bdaaaa51954caee8e725c8959"} Apr 23 18:47:39.775843 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.775832 2576 scope.go:117] "RemoveContainer" containerID="cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100" Apr 23 18:47:39.784378 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.784355 2576 scope.go:117] "RemoveContainer" containerID="984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b" Apr 23 18:47:39.792104 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.792086 2576 scope.go:117] "RemoveContainer" containerID="e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01" Apr 23 18:47:39.800809 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.800782 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"] Apr 23 18:47:39.803023 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.802967 2576 scope.go:117] "RemoveContainer" containerID="cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100" Apr 23 18:47:39.803515 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:47:39.803480 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100\": container with ID starting with cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100 not found: ID does not exist" containerID="cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100" Apr 23 18:47:39.803660 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.803527 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100"} err="failed to get container status \"cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100\": rpc error: code = NotFound desc = could not find container \"cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100\": container with ID starting with cf4e2bb4c78ff4f2fef6d6b9599dc5bf1e891e9c9c317ae8c71dda57db126100 not found: ID does not exist" Apr 23 18:47:39.803660 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.803554 2576 scope.go:117] "RemoveContainer" containerID="984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b" Apr 23 18:47:39.804023 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:47:39.803995 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b\": container with ID starting with 984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b not found: ID does not exist" containerID="984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b" Apr 23 18:47:39.804151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.804029 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b"} err="failed to get container status \"984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b\": rpc error: code = NotFound desc = could not find container 
\"984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b\": container with ID starting with 984fb1fa745c2470fe04e9153799766368dcb2e12b5a8599fdf0badd7c6e988b not found: ID does not exist" Apr 23 18:47:39.804151 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.804052 2576 scope.go:117] "RemoveContainer" containerID="e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01" Apr 23 18:47:39.804336 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:47:39.804302 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01\": container with ID starting with e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01 not found: ID does not exist" containerID="e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01" Apr 23 18:47:39.804394 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.804340 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01"} err="failed to get container status \"e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01\": rpc error: code = NotFound desc = could not find container \"e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01\": container with ID starting with e9de4b709f035cf746a8e4db0893e1970269d648cfe6335b981fcff3aac3fa01 not found: ID does not exist" Apr 23 18:47:39.805500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:39.805478 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-dcfd46f6d-bh8ks"] Apr 23 18:47:41.414953 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:47:41.414920 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" path="/var/lib/kubelet/pods/8f0a7baf-f73f-43cb-9e60-197518d065da/volumes" Apr 23 18:48:13.345407 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345368 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"] Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345727 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kube-rbac-proxy" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345738 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kube-rbac-proxy" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345749 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="storage-initializer" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345755 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="storage-initializer" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345763 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kube-rbac-proxy" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345769 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kube-rbac-proxy" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345776 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345781 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" Apr 23 
18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345788 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="storage-initializer" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345793 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="storage-initializer" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345802 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345807 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345850 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kube-rbac-proxy" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345856 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kube-rbac-proxy" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345865 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f0a7baf-f73f-43cb-9e60-197518d065da" containerName="kserve-container" Apr 23 18:48:13.345925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.345871 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="31e95bf7-a82a-4935-84ed-9791d464d79e" containerName="kserve-container" Apr 23 18:48:13.349032 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.349008 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" Apr 23 18:48:13.351354 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.351327 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\"" Apr 23 18:48:13.351570 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.351550 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 23 18:48:13.351774 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.351475 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 23 18:48:13.351907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.351788 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:48:13.351971 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.351863 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:48:13.362371 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.362342 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"] Apr 23 18:48:13.369672 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.369646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b0d1dc9-f68f-4404-92f4-11f154259358-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" Apr 23 18:48:13.369807 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:48:13.369686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b0d1dc9-f68f-4404-92f4-11f154259358-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" Apr 23 18:48:13.369807 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.369754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkxmt\" (UniqueName: \"kubernetes.io/projected/5b0d1dc9-f68f-4404-92f4-11f154259358-kube-api-access-tkxmt\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" Apr 23 18:48:13.369929 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.369814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b0d1dc9-f68f-4404-92f4-11f154259358-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" Apr 23 18:48:13.471049 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.471011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b0d1dc9-f68f-4404-92f4-11f154259358-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" Apr 23 18:48:13.471049 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:48:13.471054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b0d1dc9-f68f-4404-92f4-11f154259358-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:13.471310 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.471089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkxmt\" (UniqueName: \"kubernetes.io/projected/5b0d1dc9-f68f-4404-92f4-11f154259358-kube-api-access-tkxmt\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:13.471310 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.471116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b0d1dc9-f68f-4404-92f4-11f154259358-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:13.471516 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.471450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b0d1dc9-f68f-4404-92f4-11f154259358-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:13.471812 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.471783 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b0d1dc9-f68f-4404-92f4-11f154259358-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:13.473927 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.473904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b0d1dc9-f68f-4404-92f4-11f154259358-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:13.479781 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.479760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkxmt\" (UniqueName: \"kubernetes.io/projected/5b0d1dc9-f68f-4404-92f4-11f154259358-kube-api-access-tkxmt\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:13.663907 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.663802 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:13.784826 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.784792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"]
Apr 23 18:48:13.788840 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:48:13.788812 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b0d1dc9_f68f_4404_92f4_11f154259358.slice/crio-f4318b354d0a9d6e99073e8a4185c6c6dec1c56d049c23f60b141f6ebe3d03b6 WatchSource:0}: Error finding container f4318b354d0a9d6e99073e8a4185c6c6dec1c56d049c23f60b141f6ebe3d03b6: Status 404 returned error can't find the container with id f4318b354d0a9d6e99073e8a4185c6c6dec1c56d049c23f60b141f6ebe3d03b6
Apr 23 18:48:13.893582 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.893549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" event={"ID":"5b0d1dc9-f68f-4404-92f4-11f154259358","Type":"ContainerStarted","Data":"669702cb1ac9d7c57c153ce2bb4d9f050453eb297c2368b40896d6ddd805ee10"}
Apr 23 18:48:13.893752 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:13.893594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" event={"ID":"5b0d1dc9-f68f-4404-92f4-11f154259358","Type":"ContainerStarted","Data":"f4318b354d0a9d6e99073e8a4185c6c6dec1c56d049c23f60b141f6ebe3d03b6"}
Apr 23 18:48:18.911624 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:18.911592 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerID="669702cb1ac9d7c57c153ce2bb4d9f050453eb297c2368b40896d6ddd805ee10" exitCode=0
Apr 23 18:48:18.912024 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:18.911633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" event={"ID":"5b0d1dc9-f68f-4404-92f4-11f154259358","Type":"ContainerDied","Data":"669702cb1ac9d7c57c153ce2bb4d9f050453eb297c2368b40896d6ddd805ee10"}
Apr 23 18:48:22.928246 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:22.928212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" event={"ID":"5b0d1dc9-f68f-4404-92f4-11f154259358","Type":"ContainerStarted","Data":"44a9918fc520458e4bfaa2b6a57c8e63d4935091b39ee72fac95d5e2a1347bba"}
Apr 23 18:48:22.928246 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:22.928252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" event={"ID":"5b0d1dc9-f68f-4404-92f4-11f154259358","Type":"ContainerStarted","Data":"69addebc84723aaab1da8f9a274a4268d198426bdb9ccbcfecfd6c359ea723fa"}
Apr 23 18:48:22.928717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:22.928552 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:22.928717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:22.928665 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:22.929982 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:22.929957 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 23 18:48:22.961865 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:22.961818 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podStartSLOduration=6.359346072 podStartE2EDuration="9.961804013s" podCreationTimestamp="2026-04-23 18:48:13 +0000 UTC" firstStartedPulling="2026-04-23 18:48:18.912749326 +0000 UTC m=+2988.080076622" lastFinishedPulling="2026-04-23 18:48:22.515207264 +0000 UTC m=+2991.682534563" observedRunningTime="2026-04-23 18:48:22.960550555 +0000 UTC m=+2992.127877873" watchObservedRunningTime="2026-04-23 18:48:22.961804013 +0000 UTC m=+2992.129131335"
Apr 23 18:48:23.931564 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:23.931525 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 23 18:48:28.936987 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:28.936934 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:28.937628 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:28.937597 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 23 18:48:31.539518 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:31.539492 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:48:31.549511 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:31.549488 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:48:38.938273 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:38.938242 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:48:54.006039 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:54.006000 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"]
Apr 23 18:48:54.006495 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:54.006333 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kserve-container" containerID="cri-o://69addebc84723aaab1da8f9a274a4268d198426bdb9ccbcfecfd6c359ea723fa" gracePeriod=30
Apr 23 18:48:54.006495 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:54.006380 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy" containerID="cri-o://44a9918fc520458e4bfaa2b6a57c8e63d4935091b39ee72fac95d5e2a1347bba" gracePeriod=30
Apr 23 18:48:55.035179 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:55.035147 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerID="44a9918fc520458e4bfaa2b6a57c8e63d4935091b39ee72fac95d5e2a1347bba" exitCode=2
Apr 23 18:48:55.035645 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:55.035222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" event={"ID":"5b0d1dc9-f68f-4404-92f4-11f154259358","Type":"ContainerDied","Data":"44a9918fc520458e4bfaa2b6a57c8e63d4935091b39ee72fac95d5e2a1347bba"}
Apr 23 18:48:58.932719 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:48:58.932680 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused"
Apr 23 18:49:03.932467 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:03.932428 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused"
Apr 23 18:49:08.931869 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:08.931825 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused"
Apr 23 18:49:08.932262 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:08.931957 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:49:13.932105 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:13.932005 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused"
Apr 23 18:49:18.932611 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:18.932574 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused"
Apr 23 18:49:23.932132 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:23.932088 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused"
Apr 23 18:49:24.128475 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.128429 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerID="69addebc84723aaab1da8f9a274a4268d198426bdb9ccbcfecfd6c359ea723fa" exitCode=137
Apr 23 18:49:24.128619 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.128486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" event={"ID":"5b0d1dc9-f68f-4404-92f4-11f154259358","Type":"ContainerDied","Data":"69addebc84723aaab1da8f9a274a4268d198426bdb9ccbcfecfd6c359ea723fa"}
Apr 23 18:49:24.645179 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.645158 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:49:24.731640 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.731604 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b0d1dc9-f68f-4404-92f4-11f154259358-kserve-provision-location\") pod \"5b0d1dc9-f68f-4404-92f4-11f154259358\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") "
Apr 23 18:49:24.731789 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.731668 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b0d1dc9-f68f-4404-92f4-11f154259358-proxy-tls\") pod \"5b0d1dc9-f68f-4404-92f4-11f154259358\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") "
Apr 23 18:49:24.731789 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.731725 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkxmt\" (UniqueName: \"kubernetes.io/projected/5b0d1dc9-f68f-4404-92f4-11f154259358-kube-api-access-tkxmt\") pod \"5b0d1dc9-f68f-4404-92f4-11f154259358\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") "
Apr 23 18:49:24.731789 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.731761 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b0d1dc9-f68f-4404-92f4-11f154259358-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"5b0d1dc9-f68f-4404-92f4-11f154259358\" (UID: \"5b0d1dc9-f68f-4404-92f4-11f154259358\") "
Apr 23 18:49:24.732161 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.732132 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b0d1dc9-f68f-4404-92f4-11f154259358-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "5b0d1dc9-f68f-4404-92f4-11f154259358" (UID: "5b0d1dc9-f68f-4404-92f4-11f154259358"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:49:24.733984 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.733948 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0d1dc9-f68f-4404-92f4-11f154259358-kube-api-access-tkxmt" (OuterVolumeSpecName: "kube-api-access-tkxmt") pod "5b0d1dc9-f68f-4404-92f4-11f154259358" (UID: "5b0d1dc9-f68f-4404-92f4-11f154259358"). InnerVolumeSpecName "kube-api-access-tkxmt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:49:24.733984 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.733976 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0d1dc9-f68f-4404-92f4-11f154259358-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5b0d1dc9-f68f-4404-92f4-11f154259358" (UID: "5b0d1dc9-f68f-4404-92f4-11f154259358"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:49:24.742690 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.742661 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0d1dc9-f68f-4404-92f4-11f154259358-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b0d1dc9-f68f-4404-92f4-11f154259358" (UID: "5b0d1dc9-f68f-4404-92f4-11f154259358"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:49:24.833283 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.833254 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b0d1dc9-f68f-4404-92f4-11f154259358-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:49:24.833283 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.833283 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkxmt\" (UniqueName: \"kubernetes.io/projected/5b0d1dc9-f68f-4404-92f4-11f154259358-kube-api-access-tkxmt\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:49:24.833495 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.833295 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b0d1dc9-f68f-4404-92f4-11f154259358-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:49:24.833495 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:24.833307 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b0d1dc9-f68f-4404-92f4-11f154259358-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:49:25.134324 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:25.134228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph" event={"ID":"5b0d1dc9-f68f-4404-92f4-11f154259358","Type":"ContainerDied","Data":"f4318b354d0a9d6e99073e8a4185c6c6dec1c56d049c23f60b141f6ebe3d03b6"}
Apr 23 18:49:25.134324 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:25.134283 2576 scope.go:117] "RemoveContainer" containerID="44a9918fc520458e4bfaa2b6a57c8e63d4935091b39ee72fac95d5e2a1347bba"
Apr 23 18:49:25.134324 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:25.134301 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"
Apr 23 18:49:25.143861 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:25.143838 2576 scope.go:117] "RemoveContainer" containerID="69addebc84723aaab1da8f9a274a4268d198426bdb9ccbcfecfd6c359ea723fa"
Apr 23 18:49:25.150992 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:25.150972 2576 scope.go:117] "RemoveContainer" containerID="669702cb1ac9d7c57c153ce2bb4d9f050453eb297c2368b40896d6ddd805ee10"
Apr 23 18:49:25.158352 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:25.158331 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"]
Apr 23 18:49:25.165384 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:25.165360 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-wp4ph"]
Apr 23 18:49:25.415437 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:49:25.415353 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" path="/var/lib/kubelet/pods/5b0d1dc9-f68f-4404-92f4-11f154259358/volumes"
Apr 23 18:52:46.063845 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.063755 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"]
Apr 23 18:52:46.064339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.064096 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kserve-container"
Apr 23 18:52:46.064339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.064106 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kserve-container"
Apr 23 18:52:46.064339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.064120 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy"
Apr 23 18:52:46.064339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.064126 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy"
Apr 23 18:52:46.064339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.064136 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="storage-initializer"
Apr 23 18:52:46.064339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.064141 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="storage-initializer"
Apr 23 18:52:46.064339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.064200 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kube-rbac-proxy"
Apr 23 18:52:46.064339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.064212 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b0d1dc9-f68f-4404-92f4-11f154259358" containerName="kserve-container"
Apr 23 18:52:46.067317 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.067298 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.069610 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.069590 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 18:52:46.069706 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.069615 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 18:52:46.069706 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.069594 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 23 18:52:46.070188 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.070172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\""
Apr 23 18:52:46.070235 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.070177 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\""
Apr 23 18:52:46.077896 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.077871 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"]
Apr 23 18:52:46.107034 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.107000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a536c176-fa7e-4e67-ab83-14b8dc651c9b-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.107190 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.107053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a536c176-fa7e-4e67-ab83-14b8dc651c9b-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.107190 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.107112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr8x4\" (UniqueName: \"kubernetes.io/projected/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kube-api-access-nr8x4\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.107290 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.107184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.208500 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.208441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a536c176-fa7e-4e67-ab83-14b8dc651c9b-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.208688 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.208513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr8x4\" (UniqueName: \"kubernetes.io/projected/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kube-api-access-nr8x4\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.208688 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.208551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.208688 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.208576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a536c176-fa7e-4e67-ab83-14b8dc651c9b-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.208965 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.208942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.209092 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.209074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a536c176-fa7e-4e67-ab83-14b8dc651c9b-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.211125 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.211108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a536c176-fa7e-4e67-ab83-14b8dc651c9b-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.217650 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.217628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr8x4\" (UniqueName: \"kubernetes.io/projected/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kube-api-access-nr8x4\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.379914 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.379879 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:46.506570 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.506546 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"]
Apr 23 18:52:46.508713 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:52:46.508678 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda536c176_fa7e_4e67_ab83_14b8dc651c9b.slice/crio-4d9cd00e20d866af413e70a67e98b1545e6830d9d01a0b60cfb17089334388b8 WatchSource:0}: Error finding container 4d9cd00e20d866af413e70a67e98b1545e6830d9d01a0b60cfb17089334388b8: Status 404 returned error can't find the container with id 4d9cd00e20d866af413e70a67e98b1545e6830d9d01a0b60cfb17089334388b8
Apr 23 18:52:46.510868 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.510852 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:52:46.796489 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.796372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" event={"ID":"a536c176-fa7e-4e67-ab83-14b8dc651c9b","Type":"ContainerStarted","Data":"d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb"}
Apr 23 18:52:46.796489 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:46.796421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" event={"ID":"a536c176-fa7e-4e67-ab83-14b8dc651c9b","Type":"ContainerStarted","Data":"4d9cd00e20d866af413e70a67e98b1545e6830d9d01a0b60cfb17089334388b8"}
Apr 23 18:52:50.810849 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:50.810812 2576 generic.go:358] "Generic (PLEG): container finished" podID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerID="d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb" exitCode=0
Apr 23 18:52:50.811283 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:50.810883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" event={"ID":"a536c176-fa7e-4e67-ab83-14b8dc651c9b","Type":"ContainerDied","Data":"d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb"}
Apr 23 18:52:51.816336 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:51.816296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" event={"ID":"a536c176-fa7e-4e67-ab83-14b8dc651c9b","Type":"ContainerStarted","Data":"582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f"}
Apr 23 18:52:51.816336 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:51.816340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" event={"ID":"a536c176-fa7e-4e67-ab83-14b8dc651c9b","Type":"ContainerStarted","Data":"67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e"}
Apr 23 18:52:51.816827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:51.816613 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:51.816827 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:51.816673 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:52:51.836665 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:51.836628 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" podStartSLOduration=5.836617352 podStartE2EDuration="5.836617352s" podCreationTimestamp="2026-04-23 18:52:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:52:51.835509724 +0000 UTC m=+3261.002837046" watchObservedRunningTime="2026-04-23 18:52:51.836617352 +0000 UTC m=+3261.003944671"
Apr 23 18:52:57.825657 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:52:57.825626 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:53:27.831107 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:27.831075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"
Apr 23 18:53:31.563817 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:31.563786 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:53:31.574748 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:31.574722 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log"
Apr 23 18:53:36.095074 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.095043 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"]
Apr 23 18:53:36.095612 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.095418 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="kube-rbac-proxy" containerID="cri-o://582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f" gracePeriod=30
Apr 23 18:53:36.095700 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.095670 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="kserve-container" containerID="cri-o://67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e" gracePeriod=30
Apr 23 18:53:36.154162 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.154132 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"]
Apr 23 18:53:36.162611 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.162582 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"
Apr 23 18:53:36.164768 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.164742 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\""
Apr 23 18:53:36.164922 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.164773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 23 18:53:36.170409 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.170371 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"]
Apr 23 18:53:36.224709 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.224676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0086951-1cde-4fcd-b96c-26425b3d2235-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"
Apr 23 18:53:36.224829 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.224725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0086951-1cde-4fcd-b96c-26425b3d2235-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"
Apr 23 18:53:36.224829 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.224753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hrxx\" (UniqueName: \"kubernetes.io/projected/c0086951-1cde-4fcd-b96c-26425b3d2235-kube-api-access-8hrxx\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"
Apr 23 18:53:36.224829 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.224790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0086951-1cde-4fcd-b96c-26425b3d2235-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"
Apr 23 18:53:36.325408 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.325374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0086951-1cde-4fcd-b96c-26425b3d2235-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"
Apr 23 18:53:36.325591 ip-10-0-137-157 kubenswrapper[2576]: I0423
18:53:36.325497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0086951-1cde-4fcd-b96c-26425b3d2235-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:36.325591 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.325528 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0086951-1cde-4fcd-b96c-26425b3d2235-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:36.325591 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.325550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrxx\" (UniqueName: \"kubernetes.io/projected/c0086951-1cde-4fcd-b96c-26425b3d2235-kube-api-access-8hrxx\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:36.325925 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.325904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0086951-1cde-4fcd-b96c-26425b3d2235-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:36.326145 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.326125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/c0086951-1cde-4fcd-b96c-26425b3d2235-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:36.328027 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.327994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0086951-1cde-4fcd-b96c-26425b3d2235-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:36.333714 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.333695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hrxx\" (UniqueName: \"kubernetes.io/projected/c0086951-1cde-4fcd-b96c-26425b3d2235-kube-api-access-8hrxx\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-h7m5v\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:36.474416 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.474311 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:36.595655 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.595627 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"] Apr 23 18:53:36.598126 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:53:36.598102 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0086951_1cde_4fcd_b96c_26425b3d2235.slice/crio-0fde7c28b98a2eb37994240af8509f88e1b1d2c77fd05869b4095fc8dc4d3493 WatchSource:0}: Error finding container 0fde7c28b98a2eb37994240af8509f88e1b1d2c77fd05869b4095fc8dc4d3493: Status 404 returned error can't find the container with id 0fde7c28b98a2eb37994240af8509f88e1b1d2c77fd05869b4095fc8dc4d3493 Apr 23 18:53:36.971874 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.971836 2576 generic.go:358] "Generic (PLEG): container finished" podID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerID="582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f" exitCode=2 Apr 23 18:53:36.972072 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.971912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" event={"ID":"a536c176-fa7e-4e67-ab83-14b8dc651c9b","Type":"ContainerDied","Data":"582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f"} Apr 23 18:53:36.973617 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.973588 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" event={"ID":"c0086951-1cde-4fcd-b96c-26425b3d2235","Type":"ContainerStarted","Data":"44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55"} Apr 23 18:53:36.973759 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:36.973623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" event={"ID":"c0086951-1cde-4fcd-b96c-26425b3d2235","Type":"ContainerStarted","Data":"0fde7c28b98a2eb37994240af8509f88e1b1d2c77fd05869b4095fc8dc4d3493"} Apr 23 18:53:37.819599 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:37.819554 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.54:8643/healthz\": dial tcp 10.132.0.54:8643: connect: connection refused" Apr 23 18:53:40.992811 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:40.992774 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerID="44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55" exitCode=0 Apr 23 18:53:40.993250 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:40.992848 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" event={"ID":"c0086951-1cde-4fcd-b96c-26425b3d2235","Type":"ContainerDied","Data":"44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55"} Apr 23 18:53:41.998097 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:41.998058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" event={"ID":"c0086951-1cde-4fcd-b96c-26425b3d2235","Type":"ContainerStarted","Data":"a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b"} Apr 23 18:53:41.998097 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:41.998099 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" 
event={"ID":"c0086951-1cde-4fcd-b96c-26425b3d2235","Type":"ContainerStarted","Data":"4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503"} Apr 23 18:53:41.998555 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:41.998349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:41.998555 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:41.998411 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:53:42.019412 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.019361 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" podStartSLOduration=6.019347248 podStartE2EDuration="6.019347248s" podCreationTimestamp="2026-04-23 18:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:53:42.017168441 +0000 UTC m=+3311.184495760" watchObservedRunningTime="2026-04-23 18:53:42.019347248 +0000 UTC m=+3311.186674566" Apr 23 18:53:42.641280 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.641256 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" Apr 23 18:53:42.679441 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.679354 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr8x4\" (UniqueName: \"kubernetes.io/projected/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kube-api-access-nr8x4\") pod \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " Apr 23 18:53:42.679638 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.679531 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a536c176-fa7e-4e67-ab83-14b8dc651c9b-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " Apr 23 18:53:42.679638 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.679558 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kserve-provision-location\") pod \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " Apr 23 18:53:42.679638 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.679576 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a536c176-fa7e-4e67-ab83-14b8dc651c9b-proxy-tls\") pod \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\" (UID: \"a536c176-fa7e-4e67-ab83-14b8dc651c9b\") " Apr 23 18:53:42.679923 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.679896 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"a536c176-fa7e-4e67-ab83-14b8dc651c9b" (UID: "a536c176-fa7e-4e67-ab83-14b8dc651c9b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:53:42.679923 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.679905 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a536c176-fa7e-4e67-ab83-14b8dc651c9b-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "a536c176-fa7e-4e67-ab83-14b8dc651c9b" (UID: "a536c176-fa7e-4e67-ab83-14b8dc651c9b"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:53:42.681792 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.681760 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a536c176-fa7e-4e67-ab83-14b8dc651c9b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a536c176-fa7e-4e67-ab83-14b8dc651c9b" (UID: "a536c176-fa7e-4e67-ab83-14b8dc651c9b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:53:42.682104 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.682077 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kube-api-access-nr8x4" (OuterVolumeSpecName: "kube-api-access-nr8x4") pod "a536c176-fa7e-4e67-ab83-14b8dc651c9b" (UID: "a536c176-fa7e-4e67-ab83-14b8dc651c9b"). InnerVolumeSpecName "kube-api-access-nr8x4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:53:42.780559 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.780474 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nr8x4\" (UniqueName: \"kubernetes.io/projected/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kube-api-access-nr8x4\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:53:42.780559 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.780502 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a536c176-fa7e-4e67-ab83-14b8dc651c9b-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:53:42.780559 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.780512 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a536c176-fa7e-4e67-ab83-14b8dc651c9b-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:53:42.780559 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:42.780521 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a536c176-fa7e-4e67-ab83-14b8dc651c9b-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:53:43.002894 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.002853 2576 generic.go:358] "Generic (PLEG): container finished" podID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerID="67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e" exitCode=0 Apr 23 18:53:43.003278 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.002935 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" Apr 23 18:53:43.003278 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.002934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" event={"ID":"a536c176-fa7e-4e67-ab83-14b8dc651c9b","Type":"ContainerDied","Data":"67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e"} Apr 23 18:53:43.003278 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.003054 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57" event={"ID":"a536c176-fa7e-4e67-ab83-14b8dc651c9b","Type":"ContainerDied","Data":"4d9cd00e20d866af413e70a67e98b1545e6830d9d01a0b60cfb17089334388b8"} Apr 23 18:53:43.003278 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.003080 2576 scope.go:117] "RemoveContainer" containerID="582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f" Apr 23 18:53:43.011804 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.011786 2576 scope.go:117] "RemoveContainer" containerID="67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e" Apr 23 18:53:43.019346 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.019329 2576 scope.go:117] "RemoveContainer" containerID="d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb" Apr 23 18:53:43.024625 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.024605 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"] Apr 23 18:53:43.027693 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.027671 2576 scope.go:117] "RemoveContainer" containerID="582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f" Apr 23 18:53:43.027987 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:53:43.027969 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f\": container with ID starting with 582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f not found: ID does not exist" containerID="582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f" Apr 23 18:53:43.028066 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.028000 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f"} err="failed to get container status \"582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f\": rpc error: code = NotFound desc = could not find container \"582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f\": container with ID starting with 582f8e77ad5ce8a44505565540ef7f88f97d60e2c24f7b791084358042c8de4f not found: ID does not exist" Apr 23 18:53:43.028066 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.028025 2576 scope.go:117] "RemoveContainer" containerID="67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e" Apr 23 18:53:43.028066 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.028029 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-wxq57"] Apr 23 18:53:43.028297 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:53:43.028277 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e\": container with ID starting with 67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e not found: ID does not exist" containerID="67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e" Apr 23 18:53:43.028339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.028305 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e"} err="failed to get container status \"67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e\": rpc error: code = NotFound desc = could not find container \"67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e\": container with ID starting with 67878dd41a7eff7bf253ee12d0e736c90895ee43b7d8085adff9251a24a4429e not found: ID does not exist" Apr 23 18:53:43.028339 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.028321 2576 scope.go:117] "RemoveContainer" containerID="d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb" Apr 23 18:53:43.028557 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:53:43.028535 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb\": container with ID starting with d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb not found: ID does not exist" containerID="d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb" Apr 23 18:53:43.028656 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.028565 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb"} err="failed to get container status \"d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb\": rpc error: code = NotFound desc = could not find container \"d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb\": container with ID starting with d6968c2be92aaf1e17a1f886efa8350c93ebc930a61d15403b0216ec3c1448bb not found: ID does not exist" Apr 23 18:53:43.415707 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:43.415675 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" 
path="/var/lib/kubelet/pods/a536c176-fa7e-4e67-ab83-14b8dc651c9b/volumes" Apr 23 18:53:48.009389 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:53:48.009360 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:54:18.013109 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:18.013079 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:54:26.229811 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:26.229779 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"] Apr 23 18:54:26.230252 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:26.230080 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="kserve-container" containerID="cri-o://4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503" gracePeriod=30 Apr 23 18:54:26.230252 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:26.230124 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="kube-rbac-proxy" containerID="cri-o://a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b" gracePeriod=30 Apr 23 18:54:27.185494 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:27.185438 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerID="a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b" exitCode=2 Apr 23 18:54:27.185673 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:27.185497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" event={"ID":"c0086951-1cde-4fcd-b96c-26425b3d2235","Type":"ContainerDied","Data":"a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b"} Apr 23 18:54:28.004009 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:28.003963 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.55:8643/healthz\": dial tcp 10.132.0.55:8643: connect: connection refused" Apr 23 18:54:32.672336 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.672312 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" Apr 23 18:54:32.710023 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.709990 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0086951-1cde-4fcd-b96c-26425b3d2235-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"c0086951-1cde-4fcd-b96c-26425b3d2235\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " Apr 23 18:54:32.710188 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.710044 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0086951-1cde-4fcd-b96c-26425b3d2235-kserve-provision-location\") pod \"c0086951-1cde-4fcd-b96c-26425b3d2235\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " Apr 23 18:54:32.710188 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.710072 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hrxx\" (UniqueName: \"kubernetes.io/projected/c0086951-1cde-4fcd-b96c-26425b3d2235-kube-api-access-8hrxx\") 
pod \"c0086951-1cde-4fcd-b96c-26425b3d2235\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " Apr 23 18:54:32.710188 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.710170 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0086951-1cde-4fcd-b96c-26425b3d2235-proxy-tls\") pod \"c0086951-1cde-4fcd-b96c-26425b3d2235\" (UID: \"c0086951-1cde-4fcd-b96c-26425b3d2235\") " Apr 23 18:54:32.710420 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.710396 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0086951-1cde-4fcd-b96c-26425b3d2235-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "c0086951-1cde-4fcd-b96c-26425b3d2235" (UID: "c0086951-1cde-4fcd-b96c-26425b3d2235"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:54:32.710489 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.710427 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0086951-1cde-4fcd-b96c-26425b3d2235-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c0086951-1cde-4fcd-b96c-26425b3d2235" (UID: "c0086951-1cde-4fcd-b96c-26425b3d2235"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:54:32.710553 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.710536 2576 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0086951-1cde-4fcd-b96c-26425b3d2235-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:54:32.712350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.712322 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0086951-1cde-4fcd-b96c-26425b3d2235-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c0086951-1cde-4fcd-b96c-26425b3d2235" (UID: "c0086951-1cde-4fcd-b96c-26425b3d2235"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:54:32.712493 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.712363 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0086951-1cde-4fcd-b96c-26425b3d2235-kube-api-access-8hrxx" (OuterVolumeSpecName: "kube-api-access-8hrxx") pod "c0086951-1cde-4fcd-b96c-26425b3d2235" (UID: "c0086951-1cde-4fcd-b96c-26425b3d2235"). InnerVolumeSpecName "kube-api-access-8hrxx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:54:32.811739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.811661 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0086951-1cde-4fcd-b96c-26425b3d2235-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:54:32.811739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.811687 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8hrxx\" (UniqueName: \"kubernetes.io/projected/c0086951-1cde-4fcd-b96c-26425b3d2235-kube-api-access-8hrxx\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:54:32.811739 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:32.811697 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0086951-1cde-4fcd-b96c-26425b3d2235-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:54:33.207921 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.207888 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerID="4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503" exitCode=0
Apr 23 18:54:33.208120 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.207960 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" event={"ID":"c0086951-1cde-4fcd-b96c-26425b3d2235","Type":"ContainerDied","Data":"4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503"}
Apr 23 18:54:33.208120 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.207986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v" event={"ID":"c0086951-1cde-4fcd-b96c-26425b3d2235","Type":"ContainerDied","Data":"0fde7c28b98a2eb37994240af8509f88e1b1d2c77fd05869b4095fc8dc4d3493"}
Apr 23 18:54:33.208120 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.207992 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"
Apr 23 18:54:33.208120 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.208003 2576 scope.go:117] "RemoveContainer" containerID="a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b"
Apr 23 18:54:33.216812 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.216793 2576 scope.go:117] "RemoveContainer" containerID="4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503"
Apr 23 18:54:33.224125 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.224108 2576 scope.go:117] "RemoveContainer" containerID="44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55"
Apr 23 18:54:33.228412 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.228385 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"]
Apr 23 18:54:33.231898 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.231876 2576 scope.go:117] "RemoveContainer" containerID="a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b"
Apr 23 18:54:33.232184 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:54:33.232164 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b\": container with ID starting with a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b not found: ID does not exist" containerID="a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b"
Apr 23 18:54:33.232278 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.232199 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b"} err="failed to get container status \"a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b\": rpc error: code = NotFound desc = could not find container \"a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b\": container with ID starting with a8ddc51371b91c9831069b38d7655b24a6b1ae4537b9e66a68cafd20f17d7e8b not found: ID does not exist"
Apr 23 18:54:33.232278 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.232226 2576 scope.go:117] "RemoveContainer" containerID="4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503"
Apr 23 18:54:33.232566 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:54:33.232541 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503\": container with ID starting with 4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503 not found: ID does not exist" containerID="4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503"
Apr 23 18:54:33.232666 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.232570 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503"} err="failed to get container status \"4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503\": rpc error: code = NotFound desc = could not find container \"4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503\": container with ID starting with 4bf1321e18b309eb04547c7bfbfe65bb7f50cfe7e061abe314e028b7decae503 not found: ID does not exist"
Apr 23 18:54:33.232666 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.232588 2576 scope.go:117] "RemoveContainer" containerID="44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55"
Apr 23 18:54:33.232814 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.232796 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-h7m5v"]
Apr 23 18:54:33.232854 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:54:33.232820 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55\": container with ID starting with 44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55 not found: ID does not exist" containerID="44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55"
Apr 23 18:54:33.232854 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.232844 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55"} err="failed to get container status \"44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55\": rpc error: code = NotFound desc = could not find container \"44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55\": container with ID starting with 44949470df135fe2611fc5663e896f42940fbefe4fafa31d7af0db2d66bf3e55 not found: ID does not exist"
Apr 23 18:54:33.415416 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:54:33.415379 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" path="/var/lib/kubelet/pods/c0086951-1cde-4fcd-b96c-26425b3d2235/volumes"
Apr 23 18:55:46.530961 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.530923 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"]
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531320 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="storage-initializer"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531332 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="storage-initializer"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531344 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="storage-initializer"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531352 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="storage-initializer"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531363 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="kserve-container"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531370 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="kserve-container"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531380 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="kube-rbac-proxy"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531386 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="kube-rbac-proxy"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531403 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="kserve-container"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531407 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="kserve-container"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531413 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="kube-rbac-proxy"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531418 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="kube-rbac-proxy"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531488 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="kube-rbac-proxy"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531498 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0086951-1cde-4fcd-b96c-26425b3d2235" containerName="kserve-container"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531505 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="kserve-container"
Apr 23 18:55:46.531558 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.531513 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a536c176-fa7e-4e67-ab83-14b8dc651c9b" containerName="kube-rbac-proxy"
Apr 23 18:55:46.535096 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.535078 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.537123 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.537094 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 18:55:46.537262 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.537092 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 23 18:55:46.537262 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.537254 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 18:55:46.537357 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.537301 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\""
Apr 23 18:55:46.537357 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.537100 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\""
Apr 23 18:55:46.547367 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.547344 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"]
Apr 23 18:55:46.625649 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.625613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdqsg\" (UniqueName: \"kubernetes.io/projected/cb52e804-2d33-41e1-9a34-a656ecca2773-kube-api-access-cdqsg\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.625649 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.625652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb52e804-2d33-41e1-9a34-a656ecca2773-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.625849 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.625687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb52e804-2d33-41e1-9a34-a656ecca2773-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.625849 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.625747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb52e804-2d33-41e1-9a34-a656ecca2773-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.726350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.726305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb52e804-2d33-41e1-9a34-a656ecca2773-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.726350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.726359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb52e804-2d33-41e1-9a34-a656ecca2773-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.726554 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.726384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb52e804-2d33-41e1-9a34-a656ecca2773-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.726598 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.726576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdqsg\" (UniqueName: \"kubernetes.io/projected/cb52e804-2d33-41e1-9a34-a656ecca2773-kube-api-access-cdqsg\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.726857 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.726820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb52e804-2d33-41e1-9a34-a656ecca2773-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.727142 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.727120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb52e804-2d33-41e1-9a34-a656ecca2773-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.728829 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.728811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb52e804-2d33-41e1-9a34-a656ecca2773-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.734214 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.734192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdqsg\" (UniqueName: \"kubernetes.io/projected/cb52e804-2d33-41e1-9a34-a656ecca2773-kube-api-access-cdqsg\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.845184 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.845153 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:46.967020 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:46.966899 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"]
Apr 23 18:55:46.969892 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:55:46.969860 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb52e804_2d33_41e1_9a34_a656ecca2773.slice/crio-2d1f45ec983c180230ce6535ae1a7306dba5d796dc145680e02f9d4a331e182d WatchSource:0}: Error finding container 2d1f45ec983c180230ce6535ae1a7306dba5d796dc145680e02f9d4a331e182d: Status 404 returned error can't find the container with id 2d1f45ec983c180230ce6535ae1a7306dba5d796dc145680e02f9d4a331e182d
Apr 23 18:55:47.453575 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:47.453538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" event={"ID":"cb52e804-2d33-41e1-9a34-a656ecca2773","Type":"ContainerStarted","Data":"c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d"}
Apr 23 18:55:47.453575 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:47.453573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" event={"ID":"cb52e804-2d33-41e1-9a34-a656ecca2773","Type":"ContainerStarted","Data":"2d1f45ec983c180230ce6535ae1a7306dba5d796dc145680e02f9d4a331e182d"}
Apr 23 18:55:51.468334 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:51.468301 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerID="c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d" exitCode=0
Apr 23 18:55:51.468786 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:51.468375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" event={"ID":"cb52e804-2d33-41e1-9a34-a656ecca2773","Type":"ContainerDied","Data":"c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d"}
Apr 23 18:55:52.474214 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:52.474173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" event={"ID":"cb52e804-2d33-41e1-9a34-a656ecca2773","Type":"ContainerStarted","Data":"3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65"}
Apr 23 18:55:52.474214 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:52.474213 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" event={"ID":"cb52e804-2d33-41e1-9a34-a656ecca2773","Type":"ContainerStarted","Data":"d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167"}
Apr 23 18:55:52.474782 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:52.474434 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:52.493599 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:52.493538 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" podStartSLOduration=6.493523719 podStartE2EDuration="6.493523719s" podCreationTimestamp="2026-04-23 18:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:55:52.491671685 +0000 UTC m=+3441.658999004" watchObservedRunningTime="2026-04-23 18:55:52.493523719 +0000 UTC m=+3441.660851038"
Apr 23 18:55:53.478253 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:53.478223 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:55:59.486557 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:55:59.486526 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:56:29.575144 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:29.575105 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 23 18:56:39.489668 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:39.489638 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:56:46.628285 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:46.628251 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"]
Apr 23 18:56:46.628902 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:46.628660 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kserve-container" containerID="cri-o://d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167" gracePeriod=30
Apr 23 18:56:46.629056 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:46.629035 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kube-rbac-proxy" containerID="cri-o://3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65" gracePeriod=30
Apr 23 18:56:47.661245 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:47.661213 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerID="3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65" exitCode=2
Apr 23 18:56:47.661245 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:47.661251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" event={"ID":"cb52e804-2d33-41e1-9a34-a656ecca2773","Type":"ContainerDied","Data":"3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65"}
Apr 23 18:56:49.482484 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:49.482428 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.56:8643/healthz\": dial tcp 10.132.0.56:8643: connect: connection refused"
Apr 23 18:56:53.970026 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:53.970003 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:56:54.093917 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.093885 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdqsg\" (UniqueName: \"kubernetes.io/projected/cb52e804-2d33-41e1-9a34-a656ecca2773-kube-api-access-cdqsg\") pod \"cb52e804-2d33-41e1-9a34-a656ecca2773\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") "
Apr 23 18:56:54.094077 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.093984 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb52e804-2d33-41e1-9a34-a656ecca2773-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"cb52e804-2d33-41e1-9a34-a656ecca2773\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") "
Apr 23 18:56:54.094077 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.094014 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb52e804-2d33-41e1-9a34-a656ecca2773-proxy-tls\") pod \"cb52e804-2d33-41e1-9a34-a656ecca2773\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") "
Apr 23 18:56:54.094077 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.094058 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb52e804-2d33-41e1-9a34-a656ecca2773-kserve-provision-location\") pod \"cb52e804-2d33-41e1-9a34-a656ecca2773\" (UID: \"cb52e804-2d33-41e1-9a34-a656ecca2773\") "
Apr 23 18:56:54.094398 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.094360 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb52e804-2d33-41e1-9a34-a656ecca2773-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "cb52e804-2d33-41e1-9a34-a656ecca2773" (UID: "cb52e804-2d33-41e1-9a34-a656ecca2773"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 18:56:54.094537 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.094405 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb52e804-2d33-41e1-9a34-a656ecca2773-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cb52e804-2d33-41e1-9a34-a656ecca2773" (UID: "cb52e804-2d33-41e1-9a34-a656ecca2773"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:56:54.096161 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.096138 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb52e804-2d33-41e1-9a34-a656ecca2773-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cb52e804-2d33-41e1-9a34-a656ecca2773" (UID: "cb52e804-2d33-41e1-9a34-a656ecca2773"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:56:54.096255 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.096142 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb52e804-2d33-41e1-9a34-a656ecca2773-kube-api-access-cdqsg" (OuterVolumeSpecName: "kube-api-access-cdqsg") pod "cb52e804-2d33-41e1-9a34-a656ecca2773" (UID: "cb52e804-2d33-41e1-9a34-a656ecca2773"). InnerVolumeSpecName "kube-api-access-cdqsg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:56:54.195611 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.195571 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb52e804-2d33-41e1-9a34-a656ecca2773-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:56:54.195611 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.195607 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cdqsg\" (UniqueName: \"kubernetes.io/projected/cb52e804-2d33-41e1-9a34-a656ecca2773-kube-api-access-cdqsg\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:56:54.195611 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.195619 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb52e804-2d33-41e1-9a34-a656ecca2773-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:56:54.195837 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.195629 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb52e804-2d33-41e1-9a34-a656ecca2773-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 18:56:54.685137 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.685095 2576 generic.go:358] "Generic (PLEG): container finished" podID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerID="d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167" exitCode=0
Apr 23 18:56:54.685348 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.685153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" event={"ID":"cb52e804-2d33-41e1-9a34-a656ecca2773","Type":"ContainerDied","Data":"d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167"}
Apr 23 18:56:54.685348 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.685184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr" event={"ID":"cb52e804-2d33-41e1-9a34-a656ecca2773","Type":"ContainerDied","Data":"2d1f45ec983c180230ce6535ae1a7306dba5d796dc145680e02f9d4a331e182d"}
Apr 23 18:56:54.685348 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.685199 2576 scope.go:117] "RemoveContainer" containerID="3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65"
Apr 23 18:56:54.685348 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.685211 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"
Apr 23 18:56:54.694562 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.694543 2576 scope.go:117] "RemoveContainer" containerID="d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167"
Apr 23 18:56:54.702220 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.702200 2576 scope.go:117] "RemoveContainer" containerID="c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d"
Apr 23 18:56:54.707374 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.707351 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"]
Apr 23 18:56:54.710061 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.710040 2576 scope.go:117] "RemoveContainer" containerID="3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65"
Apr 23 18:56:54.710359 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:56:54.710341 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65\": container with ID starting with 3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65 not found: ID does not exist" containerID="3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65"
Apr 23 18:56:54.710409 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.710369 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65"} err="failed to get container status \"3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65\": rpc error: code = NotFound desc = could not find container \"3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65\": container with ID starting with 3fc54cc8fa38f1684f8a91d4fbc033aaba7874970535f5d9ef3d8440bec3bc65 not found: ID does not exist"
Apr 23 18:56:54.710409 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.710388 2576 scope.go:117] "RemoveContainer" containerID="d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167"
Apr 23 18:56:54.710523 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.710416 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wwgnr"]
Apr 23 18:56:54.710671 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:56:54.710652 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167\": container with ID starting with d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167 not found: ID does not exist" containerID="d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167"
Apr 23 18:56:54.710734 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.710682 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167"} err="failed to get container status \"d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167\": rpc error: code = NotFound desc = could not find container \"d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167\": container with ID starting with d335058af2b258823826d34a5caa3a293ae6683c133864f817f688477c9be167 not found: ID does not exist"
Apr 23 18:56:54.710734 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.710706 2576 scope.go:117] "RemoveContainer" containerID="c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d"
Apr 23 18:56:54.710954 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:56:54.710936 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d\": container with ID starting with c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d not found: ID does not exist" containerID="c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d"
Apr 23 18:56:54.711006 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:54.710960 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d"} err="failed to get container status \"c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d\": rpc error: code = NotFound desc = could not find container \"c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d\": container with ID starting with c3d2a0f448685e570515694195a34f8703d06047037599ec72aaa153c1972d7d not found: ID does not exist"
Apr 23 18:56:55.415440 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:56:55.415397 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" path="/var/lib/kubelet/pods/cb52e804-2d33-41e1-9a34-a656ecca2773/volumes"
Apr 23 18:58:06.887518 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.887480 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch"]
Apr 23 18:58:06.887952 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.887826 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kserve-container"
Apr 23 18:58:06.887952 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.887836 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kserve-container"
Apr 23 18:58:06.887952 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.887847 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kube-rbac-proxy"
Apr 23 18:58:06.887952 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.887853 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kube-rbac-proxy"
Apr 23 18:58:06.887952 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.887868 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="storage-initializer"
Apr 23 18:58:06.887952 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.887873 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="storage-initializer"
Apr 23 18:58:06.887952 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.887928 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kserve-container"
Apr 23 18:58:06.887952 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.887938 2576 memory_manager.go:356] "RemoveStaleState removing
state" podUID="cb52e804-2d33-41e1-9a34-a656ecca2773" containerName="kube-rbac-proxy" Apr 23 18:58:06.891075 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.891053 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:06.893195 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.893174 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 23 18:58:06.893195 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.893192 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 23 18:58:06.893381 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.893192 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t4fg7\"" Apr 23 18:58:06.893626 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.893611 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 23 18:58:06.893852 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.893835 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:58:06.893912 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.893871 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:58:06.902136 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.902114 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch"] Apr 23 18:58:06.969070 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.969033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9x9sg\" (UniqueName: \"kubernetes.io/projected/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kube-api-access-9x9sg\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:06.969070 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.969072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:06.969324 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.969179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:06.969324 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:06.969223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.070600 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.070556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.070600 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.070604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.070852 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.070665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x9sg\" (UniqueName: \"kubernetes.io/projected/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kube-api-access-9x9sg\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.070852 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.070699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.071029 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.071010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: 
\"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.071287 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.071264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.073393 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.073365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.077836 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.077813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x9sg\" (UniqueName: \"kubernetes.io/projected/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kube-api-access-9x9sg\") pod \"isvc-sklearn-s3-predictor-79699d6c76-65jch\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.204867 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.204782 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:07.330007 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.329980 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch"] Apr 23 18:58:07.332820 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:58:07.332788 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d11e07a_3fe1_4608_95b7_c49fa50c8c1e.slice/crio-d110d40be81c2542c87636f16fd2b4a753f7f4223b9aca7fbfd5de7c3635798f WatchSource:0}: Error finding container d110d40be81c2542c87636f16fd2b4a753f7f4223b9aca7fbfd5de7c3635798f: Status 404 returned error can't find the container with id d110d40be81c2542c87636f16fd2b4a753f7f4223b9aca7fbfd5de7c3635798f Apr 23 18:58:07.334682 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.334662 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:58:07.927504 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.927446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" event={"ID":"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e","Type":"ContainerStarted","Data":"882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19"} Apr 23 18:58:07.927504 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:07.927506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" event={"ID":"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e","Type":"ContainerStarted","Data":"d110d40be81c2542c87636f16fd2b4a753f7f4223b9aca7fbfd5de7c3635798f"} Apr 23 18:58:08.935736 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:08.935699 2576 generic.go:358] "Generic (PLEG): container finished" podID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" 
containerID="882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19" exitCode=0 Apr 23 18:58:08.936128 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:08.935787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" event={"ID":"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e","Type":"ContainerDied","Data":"882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19"} Apr 23 18:58:09.941279 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:09.941191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" event={"ID":"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e","Type":"ContainerStarted","Data":"fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905"} Apr 23 18:58:09.941279 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:09.941226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" event={"ID":"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e","Type":"ContainerStarted","Data":"b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2"} Apr 23 18:58:09.941833 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:09.941406 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:09.960962 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:09.960911 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podStartSLOduration=3.960898793 podStartE2EDuration="3.960898793s" podCreationTimestamp="2026-04-23 18:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:58:09.958894176 +0000 UTC m=+3579.126221495" watchObservedRunningTime="2026-04-23 18:58:09.960898793 +0000 UTC m=+3579.128226111" 
Apr 23 18:58:10.945186 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:10.945144 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:10.946410 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:10.946380 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 23 18:58:11.948958 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:11.948915 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 23 18:58:16.953105 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:16.953074 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:58:16.953697 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:16.953674 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 23 18:58:26.953871 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:26.953831 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 23 18:58:31.589481 ip-10-0-137-157 kubenswrapper[2576]: 
I0423 18:58:31.589438 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:58:31.600544 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:31.600520 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 18:58:36.953807 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:36.953766 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 23 18:58:46.954178 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:46.954137 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 23 18:58:56.954127 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:58:56.954087 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 23 18:59:06.954578 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:06.954531 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 23 18:59:16.954652 ip-10-0-137-157 kubenswrapper[2576]: I0423 
18:59:16.954625 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:59:27.005931 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.005899 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch"] Apr 23 18:59:27.006337 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.006229 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" containerID="cri-o://b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2" gracePeriod=30 Apr 23 18:59:27.006337 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.006280 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kube-rbac-proxy" containerID="cri-o://fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905" gracePeriod=30 Apr 23 18:59:27.126606 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.126578 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l"] Apr 23 18:59:27.130101 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.130082 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.132161 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.132136 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 23 18:59:27.132262 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.132184 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 23 18:59:27.132262 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.132146 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 18:59:27.140318 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.140286 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l"] Apr 23 18:59:27.213240 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.213211 2576 generic.go:358] "Generic (PLEG): container finished" podID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerID="fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905" exitCode=2 Apr 23 18:59:27.213391 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.213279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" event={"ID":"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e","Type":"ContainerDied","Data":"fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905"} Apr 23 18:59:27.231733 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.231706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j4qq\" (UniqueName: \"kubernetes.io/projected/71cdc5b9-6181-45ca-aad3-0a09090f532f-kube-api-access-8j4qq\") pod 
\"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.231831 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.231740 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71cdc5b9-6181-45ca-aad3-0a09090f532f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.231831 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.231759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.231831 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.231809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71cdc5b9-6181-45ca-aad3-0a09090f532f-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.231937 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.231870 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.332679 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.332652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.332848 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.332712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j4qq\" (UniqueName: \"kubernetes.io/projected/71cdc5b9-6181-45ca-aad3-0a09090f532f-kube-api-access-8j4qq\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.332848 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.332741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71cdc5b9-6181-45ca-aad3-0a09090f532f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.332848 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.332765 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.332848 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.332800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71cdc5b9-6181-45ca-aad3-0a09090f532f-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.333223 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.333199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71cdc5b9-6181-45ca-aad3-0a09090f532f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.333533 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.333512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.333589 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.333512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.335366 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.335347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71cdc5b9-6181-45ca-aad3-0a09090f532f-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.340415 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.340385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j4qq\" (UniqueName: \"kubernetes.io/projected/71cdc5b9-6181-45ca-aad3-0a09090f532f-kube-api-access-8j4qq\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.441525 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.441496 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:27.565350 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:27.565307 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l"] Apr 23 18:59:27.568187 ip-10-0-137-157 kubenswrapper[2576]: W0423 18:59:27.568158 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71cdc5b9_6181_45ca_aad3_0a09090f532f.slice/crio-c0ac4d6cdfef3c5c76f8b16700043d5d90ee069fb1407215f073a83ec9085d4e WatchSource:0}: Error finding container c0ac4d6cdfef3c5c76f8b16700043d5d90ee069fb1407215f073a83ec9085d4e: Status 404 returned error can't find the container with id c0ac4d6cdfef3c5c76f8b16700043d5d90ee069fb1407215f073a83ec9085d4e Apr 23 18:59:28.218717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:28.218681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" event={"ID":"71cdc5b9-6181-45ca-aad3-0a09090f532f","Type":"ContainerStarted","Data":"c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58"} Apr 23 18:59:28.218717 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:28.218721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" event={"ID":"71cdc5b9-6181-45ca-aad3-0a09090f532f","Type":"ContainerStarted","Data":"c0ac4d6cdfef3c5c76f8b16700043d5d90ee069fb1407215f073a83ec9085d4e"} Apr 23 18:59:29.223041 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:29.223001 2576 generic.go:358] "Generic (PLEG): container finished" podID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerID="c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58" exitCode=0 Apr 23 18:59:29.223426 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:29.223090 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" event={"ID":"71cdc5b9-6181-45ca-aad3-0a09090f532f","Type":"ContainerDied","Data":"c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58"} Apr 23 18:59:30.228532 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:30.228498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" event={"ID":"71cdc5b9-6181-45ca-aad3-0a09090f532f","Type":"ContainerStarted","Data":"2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3"} Apr 23 18:59:30.228532 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:30.228537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" event={"ID":"71cdc5b9-6181-45ca-aad3-0a09090f532f","Type":"ContainerStarted","Data":"f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8"} Apr 23 18:59:30.228955 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:30.228641 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:30.228955 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:30.228732 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:30.230138 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:30.230109 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 23 18:59:30.249593 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:30.249551 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podStartSLOduration=3.249538224 podStartE2EDuration="3.249538224s" podCreationTimestamp="2026-04-23 18:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:59:30.247404301 +0000 UTC m=+3659.414731621" watchObservedRunningTime="2026-04-23 18:59:30.249538224 +0000 UTC m=+3659.416865542" Apr 23 18:59:31.232159 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.232117 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 23 18:59:31.554933 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.554911 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:59:31.672316 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.672283 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " Apr 23 18:59:31.672316 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.672318 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-proxy-tls\") pod \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " Apr 23 18:59:31.672592 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.672390 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x9sg\" (UniqueName: \"kubernetes.io/projected/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kube-api-access-9x9sg\") pod \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " Apr 23 18:59:31.672592 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.672489 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kserve-provision-location\") pod \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\" (UID: \"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e\") " Apr 23 18:59:31.672846 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.672705 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") 
pod "4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" (UID: "4d11e07a-3fe1-4608-95b7-c49fa50c8c1e"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:59:31.672988 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.672855 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" (UID: "4d11e07a-3fe1-4608-95b7-c49fa50c8c1e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:59:31.674791 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.674762 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" (UID: "4d11e07a-3fe1-4608-95b7-c49fa50c8c1e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:59:31.674913 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.674864 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kube-api-access-9x9sg" (OuterVolumeSpecName: "kube-api-access-9x9sg") pod "4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" (UID: "4d11e07a-3fe1-4608-95b7-c49fa50c8c1e"). InnerVolumeSpecName "kube-api-access-9x9sg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:59:31.773636 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.773608 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9x9sg\" (UniqueName: \"kubernetes.io/projected/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kube-api-access-9x9sg\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:59:31.773636 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.773631 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:59:31.773636 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.773643 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:59:31.773860 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:31.773654 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 18:59:32.237218 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.237182 2576 generic.go:358] "Generic (PLEG): container finished" podID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerID="b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2" exitCode=0 Apr 23 18:59:32.237695 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.237266 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" Apr 23 18:59:32.237695 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.237273 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" event={"ID":"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e","Type":"ContainerDied","Data":"b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2"} Apr 23 18:59:32.237695 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.237313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch" event={"ID":"4d11e07a-3fe1-4608-95b7-c49fa50c8c1e","Type":"ContainerDied","Data":"d110d40be81c2542c87636f16fd2b4a753f7f4223b9aca7fbfd5de7c3635798f"} Apr 23 18:59:32.237695 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.237330 2576 scope.go:117] "RemoveContainer" containerID="fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905" Apr 23 18:59:32.250564 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.250542 2576 scope.go:117] "RemoveContainer" containerID="b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2" Apr 23 18:59:32.258496 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.258450 2576 scope.go:117] "RemoveContainer" containerID="882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19" Apr 23 18:59:32.265753 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.265735 2576 scope.go:117] "RemoveContainer" containerID="fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905" Apr 23 18:59:32.266018 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:59:32.265998 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905\": container with ID starting with fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905 not found: ID does not exist" 
containerID="fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905" Apr 23 18:59:32.266086 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.266023 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905"} err="failed to get container status \"fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905\": rpc error: code = NotFound desc = could not find container \"fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905\": container with ID starting with fd61c7fcc453af61aa3cebe883eb9d35441d73553fab590a1acd3d435329f905 not found: ID does not exist" Apr 23 18:59:32.266086 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.266039 2576 scope.go:117] "RemoveContainer" containerID="b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2" Apr 23 18:59:32.266281 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:59:32.266256 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2\": container with ID starting with b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2 not found: ID does not exist" containerID="b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2" Apr 23 18:59:32.266393 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.266286 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2"} err="failed to get container status \"b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2\": rpc error: code = NotFound desc = could not find container \"b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2\": container with ID starting with b7141349ad39e7f0ffb64a3c1b118301239ce7778478ae62503a6b2d5ff1f8e2 not found: ID does not exist" Apr 23 
18:59:32.266393 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.266302 2576 scope.go:117] "RemoveContainer" containerID="882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19" Apr 23 18:59:32.266580 ip-10-0-137-157 kubenswrapper[2576]: E0423 18:59:32.266557 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19\": container with ID starting with 882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19 not found: ID does not exist" containerID="882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19" Apr 23 18:59:32.266622 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.266587 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19"} err="failed to get container status \"882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19\": rpc error: code = NotFound desc = could not find container \"882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19\": container with ID starting with 882c297191e801da67687e639863c07203c9280a698c6001a9db5d2c1568cb19 not found: ID does not exist" Apr 23 18:59:32.267073 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.267052 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch"] Apr 23 18:59:32.270016 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:32.269995 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-79699d6c76-65jch"] Apr 23 18:59:33.415252 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:33.415219 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" path="/var/lib/kubelet/pods/4d11e07a-3fe1-4608-95b7-c49fa50c8c1e/volumes" Apr 23 18:59:36.236344 
ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:36.236315 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 18:59:36.236940 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:36.236917 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 23 18:59:46.237661 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:46.237580 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 23 18:59:56.237412 ip-10-0-137-157 kubenswrapper[2576]: I0423 18:59:56.237362 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 23 19:00:06.237082 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:06.237043 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 23 19:00:16.237800 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:16.237760 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" 
podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 23 19:00:26.237488 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:26.237425 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 23 19:00:36.238249 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:36.238217 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 19:00:37.180560 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:37.180529 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l"] Apr 23 19:00:37.180870 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:37.180823 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" containerID="cri-o://f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8" gracePeriod=30 Apr 23 19:00:37.180947 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:37.180881 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kube-rbac-proxy" containerID="cri-o://2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3" gracePeriod=30 Apr 23 19:00:37.466379 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:37.466283 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerID="2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3" exitCode=2 Apr 23 19:00:37.466379 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:37.466362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" event={"ID":"71cdc5b9-6181-45ca-aad3-0a09090f532f","Type":"ContainerDied","Data":"2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3"} Apr 23 19:00:38.248105 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.248069 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f"] Apr 23 19:00:38.248420 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.248408 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kube-rbac-proxy" Apr 23 19:00:38.248484 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.248422 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kube-rbac-proxy" Apr 23 19:00:38.248484 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.248439 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" Apr 23 19:00:38.248484 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.248444 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" Apr 23 19:00:38.248484 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.248471 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="storage-initializer" Apr 23 19:00:38.248484 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.248477 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="storage-initializer" Apr 23 19:00:38.248648 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.248538 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kube-rbac-proxy" Apr 23 19:00:38.248648 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.248547 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d11e07a-3fe1-4608-95b7-c49fa50c8c1e" containerName="kserve-container" Apr 23 19:00:38.251677 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.251660 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.253623 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.253602 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 23 19:00:38.253808 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.253789 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 23 19:00:38.260551 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.260532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f"] Apr 23 19:00:38.416922 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.416890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7833386-b80b-43b8-aad1-69b11977af03-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.417101 
ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.416937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7833386-b80b-43b8-aad1-69b11977af03-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.417101 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.416962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzsrb\" (UniqueName: \"kubernetes.io/projected/a7833386-b80b-43b8-aad1-69b11977af03-kube-api-access-tzsrb\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.417101 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.416988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7833386-b80b-43b8-aad1-69b11977af03-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.517787 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.517691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7833386-b80b-43b8-aad1-69b11977af03-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.517787 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.517743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7833386-b80b-43b8-aad1-69b11977af03-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.517787 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.517781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzsrb\" (UniqueName: \"kubernetes.io/projected/a7833386-b80b-43b8-aad1-69b11977af03-kube-api-access-tzsrb\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.518332 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.517807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7833386-b80b-43b8-aad1-69b11977af03-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.518415 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.518387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7833386-b80b-43b8-aad1-69b11977af03-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" 
(UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.518545 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.518527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7833386-b80b-43b8-aad1-69b11977af03-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.520304 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.520288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7833386-b80b-43b8-aad1-69b11977af03-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.525847 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.525822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzsrb\" (UniqueName: \"kubernetes.io/projected/a7833386-b80b-43b8-aad1-69b11977af03-kube-api-access-tzsrb\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.563345 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.563316 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:38.686344 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:38.686124 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f"] Apr 23 19:00:38.690237 ip-10-0-137-157 kubenswrapper[2576]: W0423 19:00:38.690201 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7833386_b80b_43b8_aad1_69b11977af03.slice/crio-f5a359259df7d0a4b8357a19e7ccc2d0081543d895e0fcb1d8a28b4f54b772ea WatchSource:0}: Error finding container f5a359259df7d0a4b8357a19e7ccc2d0081543d895e0fcb1d8a28b4f54b772ea: Status 404 returned error can't find the container with id f5a359259df7d0a4b8357a19e7ccc2d0081543d895e0fcb1d8a28b4f54b772ea Apr 23 19:00:39.475181 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:39.475140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" event={"ID":"a7833386-b80b-43b8-aad1-69b11977af03","Type":"ContainerStarted","Data":"3d674b319acbc37f0c96e608373fbd1d632c28f7f50e8163cf55939f8ff1f682"} Apr 23 19:00:39.475181 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:39.475185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" event={"ID":"a7833386-b80b-43b8-aad1-69b11977af03","Type":"ContainerStarted","Data":"f5a359259df7d0a4b8357a19e7ccc2d0081543d895e0fcb1d8a28b4f54b772ea"} Apr 23 19:00:41.232377 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.232330 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.132.0.58:8643/healthz\": dial tcp 10.132.0.58:8643: connect: connection refused" Apr 23 19:00:41.433586 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.433565 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 19:00:41.483378 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.483285 2576 generic.go:358] "Generic (PLEG): container finished" podID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerID="f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8" exitCode=0 Apr 23 19:00:41.483378 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.483341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" event={"ID":"71cdc5b9-6181-45ca-aad3-0a09090f532f","Type":"ContainerDied","Data":"f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8"} Apr 23 19:00:41.483378 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.483362 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" Apr 23 19:00:41.483378 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.483373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l" event={"ID":"71cdc5b9-6181-45ca-aad3-0a09090f532f","Type":"ContainerDied","Data":"c0ac4d6cdfef3c5c76f8b16700043d5d90ee069fb1407215f073a83ec9085d4e"} Apr 23 19:00:41.483699 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.483393 2576 scope.go:117] "RemoveContainer" containerID="2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3" Apr 23 19:00:41.491477 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.491439 2576 scope.go:117] "RemoveContainer" containerID="f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8" Apr 23 19:00:41.499033 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.499017 2576 scope.go:117] "RemoveContainer" containerID="c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58" Apr 23 19:00:41.506539 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.506522 2576 scope.go:117] "RemoveContainer" containerID="2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3" Apr 23 19:00:41.506785 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:00:41.506765 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3\": container with ID starting with 2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3 not found: ID does not exist" containerID="2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3" Apr 23 19:00:41.506837 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.506792 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3"} err="failed to get container status \"2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3\": rpc error: code = NotFound desc = could not find container \"2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3\": container with ID starting with 2d03651e3b91db7708047ecd8f671d35bee24474ceb4de594dc47472acfe71f3 not found: ID does not exist" Apr 23 19:00:41.506837 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.506809 2576 scope.go:117] "RemoveContainer" containerID="f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8" Apr 23 19:00:41.506990 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:00:41.506973 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8\": container with ID starting with f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8 not found: ID does not exist" containerID="f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8" Apr 23 19:00:41.507037 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.506997 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8"} err="failed to get container status \"f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8\": rpc error: code = NotFound desc = could not find container \"f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8\": container with ID starting with f95467326b7de8cc0756e951199dffc035b9216d80df14ab1d1d217f5e8f92a8 not found: ID does not exist" Apr 23 19:00:41.507037 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.507014 2576 scope.go:117] "RemoveContainer" containerID="c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58" Apr 23 19:00:41.507215 ip-10-0-137-157 
kubenswrapper[2576]: E0423 19:00:41.507195 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58\": container with ID starting with c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58 not found: ID does not exist" containerID="c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58" Apr 23 19:00:41.507255 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.507221 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58"} err="failed to get container status \"c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58\": rpc error: code = NotFound desc = could not find container \"c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58\": container with ID starting with c7ce5819c5ca0ba9ab5cd88159b91dbc04fe55ae016edbf9c650ffd34255cb58 not found: ID does not exist" Apr 23 19:00:41.542596 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.542567 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-cabundle-cert\") pod \"71cdc5b9-6181-45ca-aad3-0a09090f532f\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " Apr 23 19:00:41.542715 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.542642 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71cdc5b9-6181-45ca-aad3-0a09090f532f-kserve-provision-location\") pod \"71cdc5b9-6181-45ca-aad3-0a09090f532f\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " Apr 23 19:00:41.542759 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.542735 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8j4qq\" (UniqueName: \"kubernetes.io/projected/71cdc5b9-6181-45ca-aad3-0a09090f532f-kube-api-access-8j4qq\") pod \"71cdc5b9-6181-45ca-aad3-0a09090f532f\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " Apr 23 19:00:41.542808 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.542784 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"71cdc5b9-6181-45ca-aad3-0a09090f532f\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " Apr 23 19:00:41.542873 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.542857 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71cdc5b9-6181-45ca-aad3-0a09090f532f-proxy-tls\") pod \"71cdc5b9-6181-45ca-aad3-0a09090f532f\" (UID: \"71cdc5b9-6181-45ca-aad3-0a09090f532f\") " Apr 23 19:00:41.542976 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.542957 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71cdc5b9-6181-45ca-aad3-0a09090f532f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "71cdc5b9-6181-45ca-aad3-0a09090f532f" (UID: "71cdc5b9-6181-45ca-aad3-0a09090f532f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:41.543019 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.542977 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "71cdc5b9-6181-45ca-aad3-0a09090f532f" (UID: "71cdc5b9-6181-45ca-aad3-0a09090f532f"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 19:00:41.543208 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.543179 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "71cdc5b9-6181-45ca-aad3-0a09090f532f" (UID: "71cdc5b9-6181-45ca-aad3-0a09090f532f"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 19:00:41.543332 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.543229 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71cdc5b9-6181-45ca-aad3-0a09090f532f-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:00:41.543332 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.543244 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-cabundle-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:00:41.545019 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.544993 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71cdc5b9-6181-45ca-aad3-0a09090f532f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "71cdc5b9-6181-45ca-aad3-0a09090f532f" (UID: "71cdc5b9-6181-45ca-aad3-0a09090f532f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:00:41.545101 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.545049 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71cdc5b9-6181-45ca-aad3-0a09090f532f-kube-api-access-8j4qq" (OuterVolumeSpecName: "kube-api-access-8j4qq") pod "71cdc5b9-6181-45ca-aad3-0a09090f532f" (UID: "71cdc5b9-6181-45ca-aad3-0a09090f532f"). InnerVolumeSpecName "kube-api-access-8j4qq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:00:41.644179 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.644141 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8j4qq\" (UniqueName: \"kubernetes.io/projected/71cdc5b9-6181-45ca-aad3-0a09090f532f-kube-api-access-8j4qq\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:00:41.644179 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.644173 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/71cdc5b9-6181-45ca-aad3-0a09090f532f-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:00:41.644179 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.644187 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71cdc5b9-6181-45ca-aad3-0a09090f532f-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:00:41.804778 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.804745 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l"] Apr 23 19:00:41.809187 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:41.809163 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-66cfc9f5d4-b8r7l"] Apr 23 19:00:42.488145 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:42.488119 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_a7833386-b80b-43b8-aad1-69b11977af03/storage-initializer/0.log" Apr 23 19:00:42.488529 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:42.488154 2576 generic.go:358] "Generic (PLEG): container finished" podID="a7833386-b80b-43b8-aad1-69b11977af03" containerID="3d674b319acbc37f0c96e608373fbd1d632c28f7f50e8163cf55939f8ff1f682" exitCode=1 Apr 23 19:00:42.488529 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:42.488197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" event={"ID":"a7833386-b80b-43b8-aad1-69b11977af03","Type":"ContainerDied","Data":"3d674b319acbc37f0c96e608373fbd1d632c28f7f50e8163cf55939f8ff1f682"} Apr 23 19:00:43.415534 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:43.415499 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" path="/var/lib/kubelet/pods/71cdc5b9-6181-45ca-aad3-0a09090f532f/volumes" Apr 23 19:00:43.493411 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:43.493384 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_a7833386-b80b-43b8-aad1-69b11977af03/storage-initializer/0.log" Apr 23 19:00:43.493800 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:43.493433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" event={"ID":"a7833386-b80b-43b8-aad1-69b11977af03","Type":"ContainerStarted","Data":"ca86aebd08fd905781acc0e398eb3150f5734b83740bf80a66443ec7fc2db07a"} Apr 23 19:00:47.508120 ip-10-0-137-157 
kubenswrapper[2576]: I0423 19:00:47.508045 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_a7833386-b80b-43b8-aad1-69b11977af03/storage-initializer/1.log" Apr 23 19:00:47.508494 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:47.508418 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_a7833386-b80b-43b8-aad1-69b11977af03/storage-initializer/0.log" Apr 23 19:00:47.508494 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:47.508451 2576 generic.go:358] "Generic (PLEG): container finished" podID="a7833386-b80b-43b8-aad1-69b11977af03" containerID="ca86aebd08fd905781acc0e398eb3150f5734b83740bf80a66443ec7fc2db07a" exitCode=1 Apr 23 19:00:47.508607 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:47.508500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" event={"ID":"a7833386-b80b-43b8-aad1-69b11977af03","Type":"ContainerDied","Data":"ca86aebd08fd905781acc0e398eb3150f5734b83740bf80a66443ec7fc2db07a"} Apr 23 19:00:47.508607 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:47.508542 2576 scope.go:117] "RemoveContainer" containerID="3d674b319acbc37f0c96e608373fbd1d632c28f7f50e8163cf55939f8ff1f682" Apr 23 19:00:47.508939 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:47.508921 2576 scope.go:117] "RemoveContainer" containerID="3d674b319acbc37f0c96e608373fbd1d632c28f7f50e8163cf55939f8ff1f682" Apr 23 19:00:47.519370 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:00:47.519344 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_kserve-ci-e2e-test_a7833386-b80b-43b8-aad1-69b11977af03_0 in pod sandbox 
f5a359259df7d0a4b8357a19e7ccc2d0081543d895e0fcb1d8a28b4f54b772ea from index: no such id: '3d674b319acbc37f0c96e608373fbd1d632c28f7f50e8163cf55939f8ff1f682'" containerID="3d674b319acbc37f0c96e608373fbd1d632c28f7f50e8163cf55939f8ff1f682" Apr 23 19:00:47.519432 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:00:47.519386 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_kserve-ci-e2e-test_a7833386-b80b-43b8-aad1-69b11977af03_0 in pod sandbox f5a359259df7d0a4b8357a19e7ccc2d0081543d895e0fcb1d8a28b4f54b772ea from index: no such id: '3d674b319acbc37f0c96e608373fbd1d632c28f7f50e8163cf55939f8ff1f682'; Skipping pod \"isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_kserve-ci-e2e-test(a7833386-b80b-43b8-aad1-69b11977af03)\"" logger="UnhandledError" Apr 23 19:00:47.520711 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:00:47.520691 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_kserve-ci-e2e-test(a7833386-b80b-43b8-aad1-69b11977af03)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" podUID="a7833386-b80b-43b8-aad1-69b11977af03" Apr 23 19:00:48.231584 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.231553 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f"] Apr 23 19:00:48.513370 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.513289 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_a7833386-b80b-43b8-aad1-69b11977af03/storage-initializer/1.log" Apr 23 19:00:48.644590 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.644567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_a7833386-b80b-43b8-aad1-69b11977af03/storage-initializer/1.log" Apr 23 19:00:48.644727 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.644644 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:48.700313 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.700278 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7833386-b80b-43b8-aad1-69b11977af03-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"a7833386-b80b-43b8-aad1-69b11977af03\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " Apr 23 19:00:48.700523 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.700350 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7833386-b80b-43b8-aad1-69b11977af03-kserve-provision-location\") pod \"a7833386-b80b-43b8-aad1-69b11977af03\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " Apr 23 19:00:48.700523 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.700371 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7833386-b80b-43b8-aad1-69b11977af03-proxy-tls\") pod \"a7833386-b80b-43b8-aad1-69b11977af03\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " Apr 23 19:00:48.700523 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.700390 
2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzsrb\" (UniqueName: \"kubernetes.io/projected/a7833386-b80b-43b8-aad1-69b11977af03-kube-api-access-tzsrb\") pod \"a7833386-b80b-43b8-aad1-69b11977af03\" (UID: \"a7833386-b80b-43b8-aad1-69b11977af03\") " Apr 23 19:00:48.700713 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.700658 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7833386-b80b-43b8-aad1-69b11977af03-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "a7833386-b80b-43b8-aad1-69b11977af03" (UID: "a7833386-b80b-43b8-aad1-69b11977af03"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 19:00:48.700713 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.700700 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7833386-b80b-43b8-aad1-69b11977af03-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a7833386-b80b-43b8-aad1-69b11977af03" (UID: "a7833386-b80b-43b8-aad1-69b11977af03"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:48.702700 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.702672 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7833386-b80b-43b8-aad1-69b11977af03-kube-api-access-tzsrb" (OuterVolumeSpecName: "kube-api-access-tzsrb") pod "a7833386-b80b-43b8-aad1-69b11977af03" (UID: "a7833386-b80b-43b8-aad1-69b11977af03"). InnerVolumeSpecName "kube-api-access-tzsrb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:00:48.702804 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.702704 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7833386-b80b-43b8-aad1-69b11977af03-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a7833386-b80b-43b8-aad1-69b11977af03" (UID: "a7833386-b80b-43b8-aad1-69b11977af03"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:00:48.801351 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.801257 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7833386-b80b-43b8-aad1-69b11977af03-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:00:48.801351 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.801288 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7833386-b80b-43b8-aad1-69b11977af03-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:00:48.801351 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.801297 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7833386-b80b-43b8-aad1-69b11977af03-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:00:48.801351 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:48.801307 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzsrb\" (UniqueName: \"kubernetes.io/projected/a7833386-b80b-43b8-aad1-69b11977af03-kube-api-access-tzsrb\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:00:49.296137 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296107 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn"] Apr 23 19:00:49.296507 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296493 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" Apr 23 19:00:49.296568 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296509 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" Apr 23 19:00:49.296568 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296519 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7833386-b80b-43b8-aad1-69b11977af03" containerName="storage-initializer" Apr 23 19:00:49.296568 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296524 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7833386-b80b-43b8-aad1-69b11977af03" containerName="storage-initializer" Apr 23 19:00:49.296568 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296538 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="storage-initializer" Apr 23 19:00:49.296568 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296543 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="storage-initializer" Apr 23 19:00:49.296568 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296555 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7833386-b80b-43b8-aad1-69b11977af03" containerName="storage-initializer" Apr 23 19:00:49.296568 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296562 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7833386-b80b-43b8-aad1-69b11977af03" containerName="storage-initializer" Apr 23 19:00:49.296568 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296569 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kube-rbac-proxy" Apr 23 19:00:49.296809 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296575 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kube-rbac-proxy" Apr 23 19:00:49.296809 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296626 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7833386-b80b-43b8-aad1-69b11977af03" containerName="storage-initializer" Apr 23 19:00:49.296809 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296635 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kserve-container" Apr 23 19:00:49.296809 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296641 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="71cdc5b9-6181-45ca-aad3-0a09090f532f" containerName="kube-rbac-proxy" Apr 23 19:00:49.296809 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.296745 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7833386-b80b-43b8-aad1-69b11977af03" containerName="storage-initializer" Apr 23 19:00:49.301178 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.301155 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.303596 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.303571 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 23 19:00:49.303723 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.303618 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 23 19:00:49.303723 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.303628 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 19:00:49.305193 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.305158 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0cedf86-36ef-472e-8850-3ba887b542ff-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.305285 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.305198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.305348 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.305288 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.305348 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.305315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6crg\" (UniqueName: \"kubernetes.io/projected/c0cedf86-36ef-472e-8850-3ba887b542ff-kube-api-access-t6crg\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.305348 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.305342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0cedf86-36ef-472e-8850-3ba887b542ff-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.312352 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.312326 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn"] Apr 23 19:00:49.406221 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.406182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" 
(UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.406221 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.406223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6crg\" (UniqueName: \"kubernetes.io/projected/c0cedf86-36ef-472e-8850-3ba887b542ff-kube-api-access-t6crg\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.406500 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.406248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0cedf86-36ef-472e-8850-3ba887b542ff-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.406500 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.406275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0cedf86-36ef-472e-8850-3ba887b542ff-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.406500 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.406292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.406500 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:00:49.406400 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 23 19:00:49.406500 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:00:49.406492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0cedf86-36ef-472e-8850-3ba887b542ff-proxy-tls podName:c0cedf86-36ef-472e-8850-3ba887b542ff nodeName:}" failed. No retries permitted until 2026-04-23 19:00:49.906443177 +0000 UTC m=+3739.073770475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c0cedf86-36ef-472e-8850-3ba887b542ff-proxy-tls") pod "isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" (UID: "c0cedf86-36ef-472e-8850-3ba887b542ff") : secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 23 19:00:49.406809 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.406784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0cedf86-36ef-472e-8850-3ba887b542ff-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.407031 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.407013 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-cabundle-cert\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.407068 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.407026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.415182 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.415156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6crg\" (UniqueName: \"kubernetes.io/projected/c0cedf86-36ef-472e-8850-3ba887b542ff-kube-api-access-t6crg\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.517918 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.517888 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f_a7833386-b80b-43b8-aad1-69b11977af03/storage-initializer/1.log" Apr 23 19:00:49.518374 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.518016 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" Apr 23 19:00:49.518374 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.518015 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f" event={"ID":"a7833386-b80b-43b8-aad1-69b11977af03","Type":"ContainerDied","Data":"f5a359259df7d0a4b8357a19e7ccc2d0081543d895e0fcb1d8a28b4f54b772ea"} Apr 23 19:00:49.518374 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.518060 2576 scope.go:117] "RemoveContainer" containerID="ca86aebd08fd905781acc0e398eb3150f5734b83740bf80a66443ec7fc2db07a" Apr 23 19:00:49.552503 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.552412 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f"] Apr 23 19:00:49.556498 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.556451 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5dbb8bb846-q9k7f"] Apr 23 19:00:49.911207 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.911166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0cedf86-36ef-472e-8850-3ba887b542ff-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:49.913768 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:49.913743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0cedf86-36ef-472e-8850-3ba887b542ff-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:50.212508 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:50.212403 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:50.338261 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:50.338234 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn"] Apr 23 19:00:50.340509 ip-10-0-137-157 kubenswrapper[2576]: W0423 19:00:50.340474 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0cedf86_36ef_472e_8850_3ba887b542ff.slice/crio-f3502cac1c69ef361b5a9c41055b3805f6a3a8fc5af7c43caeddcc6888e5f54e WatchSource:0}: Error finding container f3502cac1c69ef361b5a9c41055b3805f6a3a8fc5af7c43caeddcc6888e5f54e: Status 404 returned error can't find the container with id f3502cac1c69ef361b5a9c41055b3805f6a3a8fc5af7c43caeddcc6888e5f54e Apr 23 19:00:50.523477 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:50.523376 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" event={"ID":"c0cedf86-36ef-472e-8850-3ba887b542ff","Type":"ContainerStarted","Data":"42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac"} Apr 23 19:00:50.523477 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:50.523420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" event={"ID":"c0cedf86-36ef-472e-8850-3ba887b542ff","Type":"ContainerStarted","Data":"f3502cac1c69ef361b5a9c41055b3805f6a3a8fc5af7c43caeddcc6888e5f54e"} Apr 23 19:00:51.415225 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:51.415193 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a7833386-b80b-43b8-aad1-69b11977af03" path="/var/lib/kubelet/pods/a7833386-b80b-43b8-aad1-69b11977af03/volumes" Apr 23 19:00:51.527757 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:51.527713 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerID="42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac" exitCode=0 Apr 23 19:00:51.528105 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:51.527767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" event={"ID":"c0cedf86-36ef-472e-8850-3ba887b542ff","Type":"ContainerDied","Data":"42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac"} Apr 23 19:00:52.532845 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:52.532810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" event={"ID":"c0cedf86-36ef-472e-8850-3ba887b542ff","Type":"ContainerStarted","Data":"f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934"} Apr 23 19:00:52.532845 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:52.532845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" event={"ID":"c0cedf86-36ef-472e-8850-3ba887b542ff","Type":"ContainerStarted","Data":"5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80"} Apr 23 19:00:52.533284 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:52.532988 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:52.554751 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:52.554703 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" 
podStartSLOduration=3.554689716 podStartE2EDuration="3.554689716s" podCreationTimestamp="2026-04-23 19:00:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:00:52.552417151 +0000 UTC m=+3741.719744471" watchObservedRunningTime="2026-04-23 19:00:52.554689716 +0000 UTC m=+3741.722017034" Apr 23 19:00:53.536625 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:53.536583 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:53.537853 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:53.537827 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 23 19:00:54.540185 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:54.540139 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 23 19:00:59.544152 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:59.544123 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:00:59.544714 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:00:59.544611 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.60:8080: connect: connection refused" Apr 23 19:01:09.544908 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:01:09.544859 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 23 19:01:19.545486 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:01:19.545424 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 23 19:01:29.544672 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:01:29.544631 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 23 19:01:39.544590 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:01:39.544553 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 23 19:01:49.545349 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:01:49.545311 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 23 
19:01:59.546009 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:01:59.545978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:02:09.355085 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:09.355048 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn"] Apr 23 19:02:09.355609 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:09.355350 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" containerID="cri-o://5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80" gracePeriod=30 Apr 23 19:02:09.355609 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:09.355394 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kube-rbac-proxy" containerID="cri-o://f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934" gracePeriod=30 Apr 23 19:02:09.540839 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:09.540797 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.60:8643/healthz\": dial tcp 10.132.0.60:8643: connect: connection refused" Apr 23 19:02:09.545603 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:09.545573 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 23 19:02:09.811024 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:09.810938 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerID="f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934" exitCode=2 Apr 23 19:02:09.811024 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:09.810983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" event={"ID":"c0cedf86-36ef-472e-8850-3ba887b542ff","Type":"ContainerDied","Data":"f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934"} Apr 23 19:02:10.441490 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.441433 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl"] Apr 23 19:02:10.445147 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.445130 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.446933 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.446911 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 23 19:02:10.447063 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.447045 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 23 19:02:10.454888 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.454861 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl"] Apr 23 19:02:10.486930 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.486897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf315-d7dc-4d94-967e-725730590812-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.487159 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.486950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abaaf315-d7dc-4d94-967e-725730590812-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.487159 ip-10-0-137-157 kubenswrapper[2576]: I0423 
19:02:10.487011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w8mj\" (UniqueName: \"kubernetes.io/projected/abaaf315-d7dc-4d94-967e-725730590812-kube-api-access-6w8mj\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.487159 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.487055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf315-d7dc-4d94-967e-725730590812-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.588263 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.588225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf315-d7dc-4d94-967e-725730590812-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.588445 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.588280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf315-d7dc-4d94-967e-725730590812-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.588445 ip-10-0-137-157 
kubenswrapper[2576]: I0423 19:02:10.588311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abaaf315-d7dc-4d94-967e-725730590812-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.588445 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.588342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w8mj\" (UniqueName: \"kubernetes.io/projected/abaaf315-d7dc-4d94-967e-725730590812-kube-api-access-6w8mj\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.588445 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:02:10.588384 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 23 19:02:10.588672 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:02:10.588484 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abaaf315-d7dc-4d94-967e-725730590812-proxy-tls podName:abaaf315-d7dc-4d94-967e-725730590812 nodeName:}" failed. No retries permitted until 2026-04-23 19:02:11.088440535 +0000 UTC m=+3820.255767833 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/abaaf315-d7dc-4d94-967e-725730590812-proxy-tls") pod "isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" (UID: "abaaf315-d7dc-4d94-967e-725730590812") : secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 23 19:02:10.588838 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.588820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abaaf315-d7dc-4d94-967e-725730590812-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.589181 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.589159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf315-d7dc-4d94-967e-725730590812-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:10.599850 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:10.599822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w8mj\" (UniqueName: \"kubernetes.io/projected/abaaf315-d7dc-4d94-967e-725730590812-kube-api-access-6w8mj\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:11.092892 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:11.092840 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf315-d7dc-4d94-967e-725730590812-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:11.095438 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:11.095417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf315-d7dc-4d94-967e-725730590812-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:11.357353 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:11.357261 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:11.483870 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:11.483831 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl"] Apr 23 19:02:11.485050 ip-10-0-137-157 kubenswrapper[2576]: W0423 19:02:11.485023 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabaaf315_d7dc_4d94_967e_725730590812.slice/crio-5caac00a0bdef528a9f71a0e9409844fec8b549743b1b7693c88f6b6a4fcd5c3 WatchSource:0}: Error finding container 5caac00a0bdef528a9f71a0e9409844fec8b549743b1b7693c88f6b6a4fcd5c3: Status 404 returned error can't find the container with id 5caac00a0bdef528a9f71a0e9409844fec8b549743b1b7693c88f6b6a4fcd5c3 Apr 23 19:02:11.819415 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:11.819374 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" event={"ID":"abaaf315-d7dc-4d94-967e-725730590812","Type":"ContainerStarted","Data":"24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51"} Apr 23 19:02:11.819627 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:11.819423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" event={"ID":"abaaf315-d7dc-4d94-967e-725730590812","Type":"ContainerStarted","Data":"5caac00a0bdef528a9f71a0e9409844fec8b549743b1b7693c88f6b6a4fcd5c3"} Apr 23 19:02:13.702285 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.702261 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:02:13.816347 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.816316 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0cedf86-36ef-472e-8850-3ba887b542ff-proxy-tls\") pod \"c0cedf86-36ef-472e-8850-3ba887b542ff\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " Apr 23 19:02:13.816555 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.816371 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0cedf86-36ef-472e-8850-3ba887b542ff-kserve-provision-location\") pod \"c0cedf86-36ef-472e-8850-3ba887b542ff\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " Apr 23 19:02:13.816555 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.816430 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod 
\"c0cedf86-36ef-472e-8850-3ba887b542ff\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " Apr 23 19:02:13.816555 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.816495 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-cabundle-cert\") pod \"c0cedf86-36ef-472e-8850-3ba887b542ff\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " Apr 23 19:02:13.816555 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.816526 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6crg\" (UniqueName: \"kubernetes.io/projected/c0cedf86-36ef-472e-8850-3ba887b542ff-kube-api-access-t6crg\") pod \"c0cedf86-36ef-472e-8850-3ba887b542ff\" (UID: \"c0cedf86-36ef-472e-8850-3ba887b542ff\") " Apr 23 19:02:13.816769 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.816731 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0cedf86-36ef-472e-8850-3ba887b542ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c0cedf86-36ef-472e-8850-3ba887b542ff" (UID: "c0cedf86-36ef-472e-8850-3ba887b542ff"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:02:13.816835 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.816821 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0cedf86-36ef-472e-8850-3ba887b542ff-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:02:13.816901 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.816877 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c0cedf86-36ef-472e-8850-3ba887b542ff" (UID: "c0cedf86-36ef-472e-8850-3ba887b542ff"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 19:02:13.816954 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.816894 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "c0cedf86-36ef-472e-8850-3ba887b542ff" (UID: "c0cedf86-36ef-472e-8850-3ba887b542ff"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 19:02:13.818750 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.818729 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0cedf86-36ef-472e-8850-3ba887b542ff-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c0cedf86-36ef-472e-8850-3ba887b542ff" (UID: "c0cedf86-36ef-472e-8850-3ba887b542ff"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:02:13.818750 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.818742 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0cedf86-36ef-472e-8850-3ba887b542ff-kube-api-access-t6crg" (OuterVolumeSpecName: "kube-api-access-t6crg") pod "c0cedf86-36ef-472e-8850-3ba887b542ff" (UID: "c0cedf86-36ef-472e-8850-3ba887b542ff"). InnerVolumeSpecName "kube-api-access-t6crg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:02:13.828736 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.828711 2576 generic.go:358] "Generic (PLEG): container finished" podID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerID="5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80" exitCode=0 Apr 23 19:02:13.828863 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.828782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" event={"ID":"c0cedf86-36ef-472e-8850-3ba887b542ff","Type":"ContainerDied","Data":"5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80"} Apr 23 19:02:13.828863 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.828794 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" Apr 23 19:02:13.828863 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.828809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn" event={"ID":"c0cedf86-36ef-472e-8850-3ba887b542ff","Type":"ContainerDied","Data":"f3502cac1c69ef361b5a9c41055b3805f6a3a8fc5af7c43caeddcc6888e5f54e"} Apr 23 19:02:13.828863 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.828824 2576 scope.go:117] "RemoveContainer" containerID="f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934" Apr 23 19:02:13.843087 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.843067 2576 scope.go:117] "RemoveContainer" containerID="5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80" Apr 23 19:02:13.850374 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.850354 2576 scope.go:117] "RemoveContainer" containerID="42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac" Apr 23 19:02:13.854289 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.854268 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn"] Apr 23 19:02:13.857841 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.857825 2576 scope.go:117] "RemoveContainer" containerID="f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934" Apr 23 19:02:13.858092 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:02:13.858071 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934\": container with ID starting with f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934 not found: ID does not exist" containerID="f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934" Apr 23 19:02:13.858151 
ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.858100 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934"} err="failed to get container status \"f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934\": rpc error: code = NotFound desc = could not find container \"f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934\": container with ID starting with f229b6c6d194d8b2641dad940c3575bf8ab9d2d8953935b59b4d59c33412b934 not found: ID does not exist" Apr 23 19:02:13.858151 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.858119 2576 scope.go:117] "RemoveContainer" containerID="5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80" Apr 23 19:02:13.858322 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:02:13.858307 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80\": container with ID starting with 5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80 not found: ID does not exist" containerID="5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80" Apr 23 19:02:13.858360 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.858327 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80"} err="failed to get container status \"5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80\": rpc error: code = NotFound desc = could not find container \"5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80\": container with ID starting with 5451a034e8768b68d4ec1eed11c82473f06b4a731f65309618c9d40bd71a5c80 not found: ID does not exist" Apr 23 19:02:13.858360 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.858339 2576 scope.go:117] 
"RemoveContainer" containerID="42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac" Apr 23 19:02:13.858591 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:02:13.858575 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac\": container with ID starting with 42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac not found: ID does not exist" containerID="42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac" Apr 23 19:02:13.858655 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.858598 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac"} err="failed to get container status \"42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac\": rpc error: code = NotFound desc = could not find container \"42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac\": container with ID starting with 42c9088f8f92b80f8b7060ddca37d0d69c1e716895b28ea02459ec05df4d56ac not found: ID does not exist" Apr 23 19:02:13.865093 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.865071 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-5c8446c4d8-cn6cn"] Apr 23 19:02:13.918267 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.918234 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0cedf86-36ef-472e-8850-3ba887b542ff-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:02:13.918267 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.918258 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:02:13.918267 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.918269 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c0cedf86-36ef-472e-8850-3ba887b542ff-cabundle-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:02:13.918601 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:13.918281 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t6crg\" (UniqueName: \"kubernetes.io/projected/c0cedf86-36ef-472e-8850-3ba887b542ff-kube-api-access-t6crg\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:02:15.416052 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:15.416021 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" path="/var/lib/kubelet/pods/c0cedf86-36ef-472e-8850-3ba887b542ff/volumes" Apr 23 19:02:17.844828 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:17.844796 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl_abaaf315-d7dc-4d94-967e-725730590812/storage-initializer/0.log" Apr 23 19:02:17.845237 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:17.844836 2576 generic.go:358] "Generic (PLEG): container finished" podID="abaaf315-d7dc-4d94-967e-725730590812" containerID="24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51" exitCode=1 Apr 23 19:02:17.845237 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:17.844919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" 
event={"ID":"abaaf315-d7dc-4d94-967e-725730590812","Type":"ContainerDied","Data":"24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51"} Apr 23 19:02:18.849507 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:18.849478 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl_abaaf315-d7dc-4d94-967e-725730590812/storage-initializer/0.log" Apr 23 19:02:18.849907 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:18.849581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" event={"ID":"abaaf315-d7dc-4d94-967e-725730590812","Type":"ContainerStarted","Data":"ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0"} Apr 23 19:02:20.416619 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:20.416585 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl"] Apr 23 19:02:20.417078 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:20.416860 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" podUID="abaaf315-d7dc-4d94-967e-725730590812" containerName="storage-initializer" containerID="cri-o://ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0" gracePeriod=30 Apr 23 19:02:21.550536 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.550493 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv"] Apr 23 19:02:21.551032 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.551010 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kube-rbac-proxy" Apr 23 19:02:21.551108 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.551037 
2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kube-rbac-proxy" Apr 23 19:02:21.551108 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.551053 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="storage-initializer" Apr 23 19:02:21.551108 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.551062 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="storage-initializer" Apr 23 19:02:21.551108 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.551091 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" Apr 23 19:02:21.551108 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.551101 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" Apr 23 19:02:21.551362 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.551196 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kube-rbac-proxy" Apr 23 19:02:21.551362 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.551211 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0cedf86-36ef-472e-8850-3ba887b542ff" containerName="kserve-container" Apr 23 19:02:21.554480 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.554442 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.556501 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.556476 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 23 19:02:21.556501 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.556486 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 19:02:21.556677 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.556580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 23 19:02:21.567077 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.567053 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv"] Apr 23 19:02:21.686112 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.686078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e26bb39-0027-4c53-99b4-2f0a00518bf2-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.686256 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.686122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: 
\"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.686256 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.686156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.686256 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.686223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.686438 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.686255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ghw\" (UniqueName: \"kubernetes.io/projected/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kube-api-access-p6ghw\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.760422 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.760401 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl_abaaf315-d7dc-4d94-967e-725730590812/storage-initializer/1.log" Apr 23 19:02:21.760747 ip-10-0-137-157 kubenswrapper[2576]: 
I0423 19:02:21.760733 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl_abaaf315-d7dc-4d94-967e-725730590812/storage-initializer/0.log" Apr 23 19:02:21.760801 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.760793 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:21.786803 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.786776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ghw\" (UniqueName: \"kubernetes.io/projected/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kube-api-access-p6ghw\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.786942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.786816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e26bb39-0027-4c53-99b4-2f0a00518bf2-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.786942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.786837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" 
Apr 23 19:02:21.786942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.786868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.786942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.786905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.787285 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.787265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.787511 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.787491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.787591 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.787535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.789404 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.789386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e26bb39-0027-4c53-99b4-2f0a00518bf2-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.794507 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.794486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ghw\" (UniqueName: \"kubernetes.io/projected/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kube-api-access-p6ghw\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.863436 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.863412 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl_abaaf315-d7dc-4d94-967e-725730590812/storage-initializer/1.log" Apr 23 19:02:21.863838 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.863823 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl_abaaf315-d7dc-4d94-967e-725730590812/storage-initializer/0.log" Apr 23 19:02:21.863898 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.863864 2576 generic.go:358] "Generic (PLEG): container finished" podID="abaaf315-d7dc-4d94-967e-725730590812" containerID="ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0" exitCode=1 Apr 23 19:02:21.863935 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.863897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" event={"ID":"abaaf315-d7dc-4d94-967e-725730590812","Type":"ContainerDied","Data":"ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0"} Apr 23 19:02:21.863935 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.863922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" event={"ID":"abaaf315-d7dc-4d94-967e-725730590812","Type":"ContainerDied","Data":"5caac00a0bdef528a9f71a0e9409844fec8b549743b1b7693c88f6b6a4fcd5c3"} Apr 23 19:02:21.864001 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.863938 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl" Apr 23 19:02:21.864071 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.863942 2576 scope.go:117] "RemoveContainer" containerID="ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0" Apr 23 19:02:21.866089 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.866070 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:21.874088 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.874069 2576 scope.go:117] "RemoveContainer" containerID="24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51" Apr 23 19:02:21.883525 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.883504 2576 scope.go:117] "RemoveContainer" containerID="ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0" Apr 23 19:02:21.883820 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:02:21.883801 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0\": container with ID starting with ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0 not found: ID does not exist" containerID="ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0" Apr 23 19:02:21.883877 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.883827 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0"} err="failed to get container status \"ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0\": rpc error: code = NotFound desc = could not find container \"ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0\": container with ID starting with ca6ff68be1678ab2a55897d30e51f9ff685f6f564fdaaa28066bb53c8819e4a0 not found: ID does not exist" Apr 23 19:02:21.883877 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.883846 2576 scope.go:117] "RemoveContainer" containerID="24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51" Apr 23 19:02:21.884108 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:02:21.884089 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51\": container with ID starting with 24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51 not found: ID does not exist" containerID="24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51" Apr 23 19:02:21.884168 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.884116 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51"} err="failed to get container status \"24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51\": rpc error: code = NotFound desc = could not find container \"24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51\": container with ID starting with 24d85ba5ad6aa303a39fa34125ac5f9eba86a80dc89d569621bc86e6ed6dbd51 not found: ID does not exist" Apr 23 19:02:21.887439 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.887420 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abaaf315-d7dc-4d94-967e-725730590812-kserve-provision-location\") pod \"abaaf315-d7dc-4d94-967e-725730590812\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " Apr 23 19:02:21.887526 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.887505 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf315-d7dc-4d94-967e-725730590812-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"abaaf315-d7dc-4d94-967e-725730590812\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " Apr 23 19:02:21.887569 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.887533 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w8mj\" (UniqueName: 
\"kubernetes.io/projected/abaaf315-d7dc-4d94-967e-725730590812-kube-api-access-6w8mj\") pod \"abaaf315-d7dc-4d94-967e-725730590812\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " Apr 23 19:02:21.887569 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.887550 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf315-d7dc-4d94-967e-725730590812-proxy-tls\") pod \"abaaf315-d7dc-4d94-967e-725730590812\" (UID: \"abaaf315-d7dc-4d94-967e-725730590812\") " Apr 23 19:02:21.887703 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.887678 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abaaf315-d7dc-4d94-967e-725730590812-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "abaaf315-d7dc-4d94-967e-725730590812" (UID: "abaaf315-d7dc-4d94-967e-725730590812"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:02:21.887844 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.887819 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abaaf315-d7dc-4d94-967e-725730590812-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:02:21.887948 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.887923 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abaaf315-d7dc-4d94-967e-725730590812-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "abaaf315-d7dc-4d94-967e-725730590812" (UID: "abaaf315-d7dc-4d94-967e-725730590812"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 19:02:21.890013 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.889985 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abaaf315-d7dc-4d94-967e-725730590812-kube-api-access-6w8mj" (OuterVolumeSpecName: "kube-api-access-6w8mj") pod "abaaf315-d7dc-4d94-967e-725730590812" (UID: "abaaf315-d7dc-4d94-967e-725730590812"). InnerVolumeSpecName "kube-api-access-6w8mj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:02:21.890106 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.890082 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abaaf315-d7dc-4d94-967e-725730590812-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "abaaf315-d7dc-4d94-967e-725730590812" (UID: "abaaf315-d7dc-4d94-967e-725730590812"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:02:21.988960 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.988930 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf315-d7dc-4d94-967e-725730590812-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:02:21.988960 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.988958 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6w8mj\" (UniqueName: \"kubernetes.io/projected/abaaf315-d7dc-4d94-967e-725730590812-kube-api-access-6w8mj\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:02:21.989150 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.988968 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf315-d7dc-4d94-967e-725730590812-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath 
\"\"" Apr 23 19:02:21.997524 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:21.997492 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv"] Apr 23 19:02:22.000193 ip-10-0-137-157 kubenswrapper[2576]: W0423 19:02:22.000169 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e26bb39_0027_4c53_99b4_2f0a00518bf2.slice/crio-ce037f5dc5131e16df2fc26c2aa353642239be90c8ad7fc0e9d40029e2f83e92 WatchSource:0}: Error finding container ce037f5dc5131e16df2fc26c2aa353642239be90c8ad7fc0e9d40029e2f83e92: Status 404 returned error can't find the container with id ce037f5dc5131e16df2fc26c2aa353642239be90c8ad7fc0e9d40029e2f83e92 Apr 23 19:02:22.198527 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:22.198491 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl"] Apr 23 19:02:22.202741 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:22.202716 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7b5c766dc8-rdtvl"] Apr 23 19:02:22.868803 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:22.868715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" event={"ID":"5e26bb39-0027-4c53-99b4-2f0a00518bf2","Type":"ContainerStarted","Data":"5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01"} Apr 23 19:02:22.868803 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:22.868756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" event={"ID":"5e26bb39-0027-4c53-99b4-2f0a00518bf2","Type":"ContainerStarted","Data":"ce037f5dc5131e16df2fc26c2aa353642239be90c8ad7fc0e9d40029e2f83e92"} Apr 23 19:02:23.415366 
ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:23.415334 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abaaf315-d7dc-4d94-967e-725730590812" path="/var/lib/kubelet/pods/abaaf315-d7dc-4d94-967e-725730590812/volumes" Apr 23 19:02:23.874411 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:23.874374 2576 generic.go:358] "Generic (PLEG): container finished" podID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerID="5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01" exitCode=0 Apr 23 19:02:23.874826 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:23.874486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" event={"ID":"5e26bb39-0027-4c53-99b4-2f0a00518bf2","Type":"ContainerDied","Data":"5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01"} Apr 23 19:02:24.881995 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:24.881951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" event={"ID":"5e26bb39-0027-4c53-99b4-2f0a00518bf2","Type":"ContainerStarted","Data":"7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579"} Apr 23 19:02:24.881995 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:24.881995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" event={"ID":"5e26bb39-0027-4c53-99b4-2f0a00518bf2","Type":"ContainerStarted","Data":"d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571"} Apr 23 19:02:24.882533 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:24.882157 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:24.901007 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:24.900951 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podStartSLOduration=3.900936749 podStartE2EDuration="3.900936749s" podCreationTimestamp="2026-04-23 19:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:02:24.899224533 +0000 UTC m=+3834.066551863" watchObservedRunningTime="2026-04-23 19:02:24.900936749 +0000 UTC m=+3834.068264067" Apr 23 19:02:25.885415 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:25.885380 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:25.886701 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:25.886675 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 23 19:02:26.888596 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:26.888561 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 23 19:02:31.893612 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:31.893579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:02:31.894188 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:31.894158 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 23 19:02:41.895017 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:41.894930 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 23 19:02:51.894271 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:02:51.894224 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 23 19:03:01.894592 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:01.894546 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 23 19:03:11.894190 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:11.894149 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 23 19:03:21.894346 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:21.894307 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 23 19:03:31.618216 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:31.618180 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 19:03:31.629781 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:31.629760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 19:03:31.894770 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:31.894690 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:03:41.533275 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:41.533245 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv"] Apr 23 19:03:41.533759 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:41.533573 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" containerID="cri-o://d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571" gracePeriod=30 Apr 23 19:03:41.533759 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:41.533653 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kube-rbac-proxy" 
containerID="cri-o://7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579" gracePeriod=30 Apr 23 19:03:41.889829 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:41.889783 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.62:8643/healthz\": dial tcp 10.132.0.62:8643: connect: connection refused" Apr 23 19:03:41.894129 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:41.894104 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 23 19:03:42.145514 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.145405 2576 generic.go:358] "Generic (PLEG): container finished" podID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerID="7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579" exitCode=2 Apr 23 19:03:42.145514 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.145488 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" event={"ID":"5e26bb39-0027-4c53-99b4-2f0a00518bf2","Type":"ContainerDied","Data":"7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579"} Apr 23 19:03:42.612616 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.612586 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn"] Apr 23 19:03:42.613063 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.613047 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abaaf315-d7dc-4d94-967e-725730590812" 
containerName="storage-initializer" Apr 23 19:03:42.613110 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.613065 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="abaaf315-d7dc-4d94-967e-725730590812" containerName="storage-initializer" Apr 23 19:03:42.613110 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.613087 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abaaf315-d7dc-4d94-967e-725730590812" containerName="storage-initializer" Apr 23 19:03:42.613110 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.613096 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="abaaf315-d7dc-4d94-967e-725730590812" containerName="storage-initializer" Apr 23 19:03:42.613209 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.613181 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="abaaf315-d7dc-4d94-967e-725730590812" containerName="storage-initializer" Apr 23 19:03:42.613209 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.613193 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="abaaf315-d7dc-4d94-967e-725730590812" containerName="storage-initializer" Apr 23 19:03:42.616447 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.616425 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.618508 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.618481 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 23 19:03:42.618637 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.618487 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 23 19:03:42.626640 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.626611 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn"] Apr 23 19:03:42.759667 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.759627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de31943c-7163-4096-81ba-43c04c925e7e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.759667 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.759671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de31943c-7163-4096-81ba-43c04c925e7e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.759875 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.759711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de31943c-7163-4096-81ba-43c04c925e7e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.759875 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.759768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfb2f\" (UniqueName: \"kubernetes.io/projected/de31943c-7163-4096-81ba-43c04c925e7e-kube-api-access-kfb2f\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.861019 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.860965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de31943c-7163-4096-81ba-43c04c925e7e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.861311 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.861042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de31943c-7163-4096-81ba-43c04c925e7e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 
19:03:42.861311 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.861085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfb2f\" (UniqueName: \"kubernetes.io/projected/de31943c-7163-4096-81ba-43c04c925e7e-kube-api-access-kfb2f\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.861311 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.861136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de31943c-7163-4096-81ba-43c04c925e7e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.861311 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:03:42.861242 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 23 19:03:42.861311 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:03:42.861304 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de31943c-7163-4096-81ba-43c04c925e7e-proxy-tls podName:de31943c-7163-4096-81ba-43c04c925e7e nodeName:}" failed. No retries permitted until 2026-04-23 19:03:43.361287444 +0000 UTC m=+3912.528614741 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/de31943c-7163-4096-81ba-43c04c925e7e-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" (UID: "de31943c-7163-4096-81ba-43c04c925e7e") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 23 19:03:42.861653 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.861452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de31943c-7163-4096-81ba-43c04c925e7e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.861864 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.861840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de31943c-7163-4096-81ba-43c04c925e7e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:42.869712 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:42.869658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfb2f\" (UniqueName: \"kubernetes.io/projected/de31943c-7163-4096-81ba-43c04c925e7e-kube-api-access-kfb2f\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:43.366410 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:43.366373 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de31943c-7163-4096-81ba-43c04c925e7e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:43.369026 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:43.368993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de31943c-7163-4096-81ba-43c04c925e7e-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:43.528822 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:43.528784 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" Apr 23 19:03:43.655366 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:43.655342 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn"] Apr 23 19:03:43.659586 ip-10-0-137-157 kubenswrapper[2576]: W0423 19:03:43.659552 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde31943c_7163_4096_81ba_43c04c925e7e.slice/crio-25b747619c56ce37a3c06bd1615eb51e4fa0e836318543f8b027b0370689e44e WatchSource:0}: Error finding container 25b747619c56ce37a3c06bd1615eb51e4fa0e836318543f8b027b0370689e44e: Status 404 returned error can't find the container with id 25b747619c56ce37a3c06bd1615eb51e4fa0e836318543f8b027b0370689e44e Apr 23 19:03:43.661708 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:43.661690 2576 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 23 19:03:44.153801 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:44.153764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" event={"ID":"de31943c-7163-4096-81ba-43c04c925e7e","Type":"ContainerStarted","Data":"b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357"} Apr 23 19:03:44.153801 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:44.153802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" event={"ID":"de31943c-7163-4096-81ba-43c04c925e7e","Type":"ContainerStarted","Data":"25b747619c56ce37a3c06bd1615eb51e4fa0e836318543f8b027b0370689e44e"} Apr 23 19:03:45.879842 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.879817 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:03:45.991020 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.990922 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kserve-provision-location\") pod \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " Apr 23 19:03:45.991020 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.990983 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6ghw\" (UniqueName: \"kubernetes.io/projected/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kube-api-access-p6ghw\") pod \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " Apr 23 19:03:45.991263 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.991154 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " Apr 23 19:03:45.991263 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.991188 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-cabundle-cert\") pod \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " Apr 23 19:03:45.991263 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.991222 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e26bb39-0027-4c53-99b4-2f0a00518bf2-proxy-tls\") pod \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\" (UID: \"5e26bb39-0027-4c53-99b4-2f0a00518bf2\") " Apr 23 19:03:45.991499 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.991366 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e26bb39-0027-4c53-99b4-2f0a00518bf2" (UID: "5e26bb39-0027-4c53-99b4-2f0a00518bf2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:03:45.991600 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.991576 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "5e26bb39-0027-4c53-99b4-2f0a00518bf2" (UID: "5e26bb39-0027-4c53-99b4-2f0a00518bf2"). 
InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 19:03:45.991673 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.991599 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "5e26bb39-0027-4c53-99b4-2f0a00518bf2" (UID: "5e26bb39-0027-4c53-99b4-2f0a00518bf2"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 19:03:45.993423 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.993396 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kube-api-access-p6ghw" (OuterVolumeSpecName: "kube-api-access-p6ghw") pod "5e26bb39-0027-4c53-99b4-2f0a00518bf2" (UID: "5e26bb39-0027-4c53-99b4-2f0a00518bf2"). InnerVolumeSpecName "kube-api-access-p6ghw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:03:45.993521 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:45.993430 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e26bb39-0027-4c53-99b4-2f0a00518bf2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5e26bb39-0027-4c53-99b4-2f0a00518bf2" (UID: "5e26bb39-0027-4c53-99b4-2f0a00518bf2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:03:46.092208 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.092171 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:03:46.092208 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.092202 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5e26bb39-0027-4c53-99b4-2f0a00518bf2-cabundle-cert\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:03:46.092208 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.092213 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e26bb39-0027-4c53-99b4-2f0a00518bf2-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:03:46.092439 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.092222 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:03:46.092439 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.092232 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p6ghw\" (UniqueName: \"kubernetes.io/projected/5e26bb39-0027-4c53-99b4-2f0a00518bf2-kube-api-access-p6ghw\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\"" Apr 23 19:03:46.163628 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.163594 2576 generic.go:358] "Generic (PLEG): container finished" podID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerID="d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571" exitCode=0 
Apr 23 19:03:46.163796 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.163680 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" Apr 23 19:03:46.163796 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.163684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" event={"ID":"5e26bb39-0027-4c53-99b4-2f0a00518bf2","Type":"ContainerDied","Data":"d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571"} Apr 23 19:03:46.163916 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.163797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv" event={"ID":"5e26bb39-0027-4c53-99b4-2f0a00518bf2","Type":"ContainerDied","Data":"ce037f5dc5131e16df2fc26c2aa353642239be90c8ad7fc0e9d40029e2f83e92"} Apr 23 19:03:46.163916 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.163821 2576 scope.go:117] "RemoveContainer" containerID="7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579" Apr 23 19:03:46.172740 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.172722 2576 scope.go:117] "RemoveContainer" containerID="d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571" Apr 23 19:03:46.179941 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.179922 2576 scope.go:117] "RemoveContainer" containerID="5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01" Apr 23 19:03:46.184738 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.184716 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv"] Apr 23 19:03:46.187380 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.187256 2576 scope.go:117] "RemoveContainer" 
containerID="7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579"
Apr 23 19:03:46.187683 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:03:46.187654 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579\": container with ID starting with 7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579 not found: ID does not exist" containerID="7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579"
Apr 23 19:03:46.187777 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.187693 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579"} err="failed to get container status \"7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579\": rpc error: code = NotFound desc = could not find container \"7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579\": container with ID starting with 7e12788440f71cfbf2b70673af6175582cea097a635267b13a91f975dbdc7579 not found: ID does not exist"
Apr 23 19:03:46.187777 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.187719 2576 scope.go:117] "RemoveContainer" containerID="d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571"
Apr 23 19:03:46.188014 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:03:46.187996 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571\": container with ID starting with d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571 not found: ID does not exist" containerID="d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571"
Apr 23 19:03:46.188060 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.188020 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571"} err="failed to get container status \"d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571\": rpc error: code = NotFound desc = could not find container \"d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571\": container with ID starting with d9140e9e25ab29b53e4a91a689c631329c4fd8caf259d5e65e28325a334e5571 not found: ID does not exist"
Apr 23 19:03:46.188060 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.188037 2576 scope.go:117] "RemoveContainer" containerID="5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01"
Apr 23 19:03:46.188352 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:03:46.188332 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01\": container with ID starting with 5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01 not found: ID does not exist" containerID="5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01"
Apr 23 19:03:46.188430 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.188357 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01"} err="failed to get container status \"5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01\": rpc error: code = NotFound desc = could not find container \"5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01\": container with ID starting with 5d8a4e928c4805b6594dbf4981a0addd96f39f2e60a2fbe2c6561595fb5bfe01 not found: ID does not exist"
Apr 23 19:03:46.188430 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:46.188362 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-9d8f6fcff-xjjgv"]
Apr 23 19:03:47.169998 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:47.169961 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn_de31943c-7163-4096-81ba-43c04c925e7e/storage-initializer/0.log"
Apr 23 19:03:47.169998 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:47.170000 2576 generic.go:358] "Generic (PLEG): container finished" podID="de31943c-7163-4096-81ba-43c04c925e7e" containerID="b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357" exitCode=1
Apr 23 19:03:47.170415 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:47.170078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" event={"ID":"de31943c-7163-4096-81ba-43c04c925e7e","Type":"ContainerDied","Data":"b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357"}
Apr 23 19:03:47.415349 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:47.415317 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" path="/var/lib/kubelet/pods/5e26bb39-0027-4c53-99b4-2f0a00518bf2/volumes"
Apr 23 19:03:48.174992 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:48.174962 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn_de31943c-7163-4096-81ba-43c04c925e7e/storage-initializer/0.log"
Apr 23 19:03:48.175589 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:48.175025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" event={"ID":"de31943c-7163-4096-81ba-43c04c925e7e","Type":"ContainerStarted","Data":"55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f"}
Apr 23 19:03:52.610621 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:52.610580 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn"]
Apr 23 19:03:52.611085 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:52.610976 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" podUID="de31943c-7163-4096-81ba-43c04c925e7e" containerName="storage-initializer" containerID="cri-o://55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f" gracePeriod=30
Apr 23 19:03:53.557595 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.557572 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn_de31943c-7163-4096-81ba-43c04c925e7e/storage-initializer/1.log"
Apr 23 19:03:53.557957 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.557941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn_de31943c-7163-4096-81ba-43c04c925e7e/storage-initializer/0.log"
Apr 23 19:03:53.558019 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.558007 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn"
Apr 23 19:03:53.655025 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.654992 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de31943c-7163-4096-81ba-43c04c925e7e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"de31943c-7163-4096-81ba-43c04c925e7e\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") "
Apr 23 19:03:53.655438 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.655087 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfb2f\" (UniqueName: \"kubernetes.io/projected/de31943c-7163-4096-81ba-43c04c925e7e-kube-api-access-kfb2f\") pod \"de31943c-7163-4096-81ba-43c04c925e7e\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") "
Apr 23 19:03:53.655438 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.655126 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de31943c-7163-4096-81ba-43c04c925e7e-kserve-provision-location\") pod \"de31943c-7163-4096-81ba-43c04c925e7e\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") "
Apr 23 19:03:53.655438 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.655161 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de31943c-7163-4096-81ba-43c04c925e7e-proxy-tls\") pod \"de31943c-7163-4096-81ba-43c04c925e7e\" (UID: \"de31943c-7163-4096-81ba-43c04c925e7e\") "
Apr 23 19:03:53.655438 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.655364 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de31943c-7163-4096-81ba-43c04c925e7e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "de31943c-7163-4096-81ba-43c04c925e7e" (UID: "de31943c-7163-4096-81ba-43c04c925e7e"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 19:03:53.655649 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.655483 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de31943c-7163-4096-81ba-43c04c925e7e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "de31943c-7163-4096-81ba-43c04c925e7e" (UID: "de31943c-7163-4096-81ba-43c04c925e7e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:03:53.657269 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.657242 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de31943c-7163-4096-81ba-43c04c925e7e-kube-api-access-kfb2f" (OuterVolumeSpecName: "kube-api-access-kfb2f") pod "de31943c-7163-4096-81ba-43c04c925e7e" (UID: "de31943c-7163-4096-81ba-43c04c925e7e"). InnerVolumeSpecName "kube-api-access-kfb2f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 19:03:53.657367 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.657279 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de31943c-7163-4096-81ba-43c04c925e7e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "de31943c-7163-4096-81ba-43c04c925e7e" (UID: "de31943c-7163-4096-81ba-43c04c925e7e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 19:03:53.756687 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.756598 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kfb2f\" (UniqueName: \"kubernetes.io/projected/de31943c-7163-4096-81ba-43c04c925e7e-kube-api-access-kfb2f\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 19:03:53.756687 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.756628 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de31943c-7163-4096-81ba-43c04c925e7e-kserve-provision-location\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 19:03:53.756687 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.756638 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de31943c-7163-4096-81ba-43c04c925e7e-proxy-tls\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 19:03:53.756687 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:53.756649 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/de31943c-7163-4096-81ba-43c04c925e7e-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-157.ec2.internal\" DevicePath \"\""
Apr 23 19:03:54.196418 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.196386 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn_de31943c-7163-4096-81ba-43c04c925e7e/storage-initializer/1.log"
Apr 23 19:03:54.196782 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.196766 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn_de31943c-7163-4096-81ba-43c04c925e7e/storage-initializer/0.log"
Apr 23 19:03:54.196851 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.196804 2576 generic.go:358] "Generic (PLEG): container finished" podID="de31943c-7163-4096-81ba-43c04c925e7e" containerID="55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f" exitCode=1
Apr 23 19:03:54.196922 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.196899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" event={"ID":"de31943c-7163-4096-81ba-43c04c925e7e","Type":"ContainerDied","Data":"55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f"}
Apr 23 19:03:54.196970 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.196940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn" event={"ID":"de31943c-7163-4096-81ba-43c04c925e7e","Type":"ContainerDied","Data":"25b747619c56ce37a3c06bd1615eb51e4fa0e836318543f8b027b0370689e44e"}
Apr 23 19:03:54.196970 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.196955 2576 scope.go:117] "RemoveContainer" containerID="55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f"
Apr 23 19:03:54.197075 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.196910 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn"
Apr 23 19:03:54.205810 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.205790 2576 scope.go:117] "RemoveContainer" containerID="b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357"
Apr 23 19:03:54.213106 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.213090 2576 scope.go:117] "RemoveContainer" containerID="55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f"
Apr 23 19:03:54.213365 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:03:54.213341 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f\": container with ID starting with 55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f not found: ID does not exist" containerID="55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f"
Apr 23 19:03:54.213444 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.213378 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f"} err="failed to get container status \"55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f\": rpc error: code = NotFound desc = could not find container \"55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f\": container with ID starting with 55ff08f825d66e60214015776aa847105245d7bb2c33bf2d93a35402016ba08f not found: ID does not exist"
Apr 23 19:03:54.213444 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.213407 2576 scope.go:117] "RemoveContainer" containerID="b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357"
Apr 23 19:03:54.213671 ip-10-0-137-157 kubenswrapper[2576]: E0423 19:03:54.213648 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357\": container with ID starting with b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357 not found: ID does not exist" containerID="b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357"
Apr 23 19:03:54.213726 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.213679 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357"} err="failed to get container status \"b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357\": rpc error: code = NotFound desc = could not find container \"b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357\": container with ID starting with b4a810d7b853d9c09e4d5f506f3697494f0ea88c92e115978e2a616938f82357 not found: ID does not exist"
Apr 23 19:03:54.242317 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.238520 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn"]
Apr 23 19:03:54.245239 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:54.245215 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-fc57cf4df-25wgn"]
Apr 23 19:03:55.415287 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:03:55.415247 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de31943c-7163-4096-81ba-43c04c925e7e" path="/var/lib/kubelet/pods/de31943c-7163-4096-81ba-43c04c925e7e/volumes"
Apr 23 19:04:24.514500 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514398 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-25xv6/must-gather-zv4lh"]
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514749 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de31943c-7163-4096-81ba-43c04c925e7e" containerName="storage-initializer"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514760 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="de31943c-7163-4096-81ba-43c04c925e7e" containerName="storage-initializer"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514768 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="storage-initializer"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514774 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="storage-initializer"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514781 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kube-rbac-proxy"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514786 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kube-rbac-proxy"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514798 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514804 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514864 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="de31943c-7163-4096-81ba-43c04c925e7e" containerName="storage-initializer"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514875 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kserve-container"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514881 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e26bb39-0027-4c53-99b4-2f0a00518bf2" containerName="kube-rbac-proxy"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514922 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de31943c-7163-4096-81ba-43c04c925e7e" containerName="storage-initializer"
Apr 23 19:04:24.514942 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.514928 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="de31943c-7163-4096-81ba-43c04c925e7e" containerName="storage-initializer"
Apr 23 19:04:24.515334 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.515002 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="de31943c-7163-4096-81ba-43c04c925e7e" containerName="storage-initializer"
Apr 23 19:04:24.518041 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.518022 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-25xv6/must-gather-zv4lh"
Apr 23 19:04:24.520311 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.520285 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-25xv6\"/\"kube-root-ca.crt\""
Apr 23 19:04:24.520420 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.520286 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-25xv6\"/\"openshift-service-ca.crt\""
Apr 23 19:04:24.520784 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.520766 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-25xv6\"/\"default-dockercfg-mzws8\""
Apr 23 19:04:24.524923 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.524900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-25xv6/must-gather-zv4lh"]
Apr 23 19:04:24.588771 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.588737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/da09b9c2-5abd-4242-9c00-3de0e32487d7-must-gather-output\") pod \"must-gather-zv4lh\" (UID: \"da09b9c2-5abd-4242-9c00-3de0e32487d7\") " pod="openshift-must-gather-25xv6/must-gather-zv4lh"
Apr 23 19:04:24.588941 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.588798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj9h8\" (UniqueName: \"kubernetes.io/projected/da09b9c2-5abd-4242-9c00-3de0e32487d7-kube-api-access-kj9h8\") pod \"must-gather-zv4lh\" (UID: \"da09b9c2-5abd-4242-9c00-3de0e32487d7\") " pod="openshift-must-gather-25xv6/must-gather-zv4lh"
Apr 23 19:04:24.690155 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.690106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj9h8\" (UniqueName: \"kubernetes.io/projected/da09b9c2-5abd-4242-9c00-3de0e32487d7-kube-api-access-kj9h8\") pod \"must-gather-zv4lh\" (UID: \"da09b9c2-5abd-4242-9c00-3de0e32487d7\") " pod="openshift-must-gather-25xv6/must-gather-zv4lh"
Apr 23 19:04:24.690327 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.690197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/da09b9c2-5abd-4242-9c00-3de0e32487d7-must-gather-output\") pod \"must-gather-zv4lh\" (UID: \"da09b9c2-5abd-4242-9c00-3de0e32487d7\") " pod="openshift-must-gather-25xv6/must-gather-zv4lh"
Apr 23 19:04:24.690533 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.690517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/da09b9c2-5abd-4242-9c00-3de0e32487d7-must-gather-output\") pod \"must-gather-zv4lh\" (UID: \"da09b9c2-5abd-4242-9c00-3de0e32487d7\") " pod="openshift-must-gather-25xv6/must-gather-zv4lh"
Apr 23 19:04:24.698353 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.698322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj9h8\" (UniqueName: \"kubernetes.io/projected/da09b9c2-5abd-4242-9c00-3de0e32487d7-kube-api-access-kj9h8\") pod \"must-gather-zv4lh\" (UID: \"da09b9c2-5abd-4242-9c00-3de0e32487d7\") " pod="openshift-must-gather-25xv6/must-gather-zv4lh"
Apr 23 19:04:24.838493 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.838442 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-25xv6/must-gather-zv4lh"
Apr 23 19:04:24.960595 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:24.960570 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-25xv6/must-gather-zv4lh"]
Apr 23 19:04:24.963098 ip-10-0-137-157 kubenswrapper[2576]: W0423 19:04:24.963072 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda09b9c2_5abd_4242_9c00_3de0e32487d7.slice/crio-62231c3ccc28fa5475eb12e9a2ab68cdccc1f17ecd188a1d97ae8277b4ed6455 WatchSource:0}: Error finding container 62231c3ccc28fa5475eb12e9a2ab68cdccc1f17ecd188a1d97ae8277b4ed6455: Status 404 returned error can't find the container with id 62231c3ccc28fa5475eb12e9a2ab68cdccc1f17ecd188a1d97ae8277b4ed6455
Apr 23 19:04:25.310123 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:25.310032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-25xv6/must-gather-zv4lh" event={"ID":"da09b9c2-5abd-4242-9c00-3de0e32487d7","Type":"ContainerStarted","Data":"62231c3ccc28fa5475eb12e9a2ab68cdccc1f17ecd188a1d97ae8277b4ed6455"}
Apr 23 19:04:26.316632 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:26.316592 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-25xv6/must-gather-zv4lh" event={"ID":"da09b9c2-5abd-4242-9c00-3de0e32487d7","Type":"ContainerStarted","Data":"3e0093f16386a22f635f031dc57b382e21aa9dc02c0cc01fdcf083f949d01dc1"}
Apr 23 19:04:26.316632 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:26.316635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-25xv6/must-gather-zv4lh" event={"ID":"da09b9c2-5abd-4242-9c00-3de0e32487d7","Type":"ContainerStarted","Data":"a30b10ccb98913c3bfdb14e229f2195e14604ead06b2f4d73ba86d085596d0c8"}
Apr 23 19:04:26.333345 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:26.333281 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-25xv6/must-gather-zv4lh" podStartSLOduration=1.435908822 podStartE2EDuration="2.33326308s" podCreationTimestamp="2026-04-23 19:04:24 +0000 UTC" firstStartedPulling="2026-04-23 19:04:24.964958834 +0000 UTC m=+3954.132286134" lastFinishedPulling="2026-04-23 19:04:25.862313094 +0000 UTC m=+3955.029640392" observedRunningTime="2026-04-23 19:04:26.331453399 +0000 UTC m=+3955.498780720" watchObservedRunningTime="2026-04-23 19:04:26.33326308 +0000 UTC m=+3955.500590400"
Apr 23 19:04:27.438813 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:27.438781 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-62sfd_a3fad012-152e-4084-a80b-63e1ff5a0998/global-pull-secret-syncer/0.log"
Apr 23 19:04:27.615817 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:27.615778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7kq2r_60dd6cf9-cd33-4a4c-bc5c-61d0fcd7b08a/konnectivity-agent/0.log"
Apr 23 19:04:27.761391 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:27.761305 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-157.ec2.internal_e0be1704db5e6b55a9c3869587898cb5/haproxy/0.log"
Apr 23 19:04:31.078832 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.078791 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_30ffb75d-23b2-49d1-912a-c5a2bfb24cbc/alertmanager/0.log"
Apr 23 19:04:31.104979 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.104940 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_30ffb75d-23b2-49d1-912a-c5a2bfb24cbc/config-reloader/0.log"
Apr 23 19:04:31.130523 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.130497 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_30ffb75d-23b2-49d1-912a-c5a2bfb24cbc/kube-rbac-proxy-web/0.log"
Apr 23 19:04:31.155277 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.155244 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_30ffb75d-23b2-49d1-912a-c5a2bfb24cbc/kube-rbac-proxy/0.log"
Apr 23 19:04:31.184005 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.183913 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_30ffb75d-23b2-49d1-912a-c5a2bfb24cbc/kube-rbac-proxy-metric/0.log"
Apr 23 19:04:31.211345 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.211315 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_30ffb75d-23b2-49d1-912a-c5a2bfb24cbc/prom-label-proxy/0.log"
Apr 23 19:04:31.240295 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.240261 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_30ffb75d-23b2-49d1-912a-c5a2bfb24cbc/init-config-reloader/0.log"
Apr 23 19:04:31.308499 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.300377 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-f8b7t_275ce576-f50c-4c52-aa60-875645871e66/cluster-monitoring-operator/0.log"
Apr 23 19:04:31.483847 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.483752 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmn5h_03b2d537-2af8-4205-8fec-7db579b30694/node-exporter/0.log"
Apr 23 19:04:31.508582 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.508547 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmn5h_03b2d537-2af8-4205-8fec-7db579b30694/kube-rbac-proxy/0.log"
Apr 23 19:04:31.533835 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:31.533805 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmn5h_03b2d537-2af8-4205-8fec-7db579b30694/init-textfile/0.log"
Apr 23 19:04:32.170486 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:32.170429 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c874b45d-sjk2w_944258a9-3a02-4539-a1c0-d605fae95404/thanos-query/0.log"
Apr 23 19:04:32.195217 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:32.195186 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c874b45d-sjk2w_944258a9-3a02-4539-a1c0-d605fae95404/kube-rbac-proxy-web/0.log"
Apr 23 19:04:32.219438 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:32.219402 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c874b45d-sjk2w_944258a9-3a02-4539-a1c0-d605fae95404/kube-rbac-proxy/0.log"
Apr 23 19:04:32.243401 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:32.243371 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c874b45d-sjk2w_944258a9-3a02-4539-a1c0-d605fae95404/prom-label-proxy/0.log"
Apr 23 19:04:32.268364 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:32.268323 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c874b45d-sjk2w_944258a9-3a02-4539-a1c0-d605fae95404/kube-rbac-proxy-rules/0.log"
Apr 23 19:04:32.301441 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:32.301405 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7c874b45d-sjk2w_944258a9-3a02-4539-a1c0-d605fae95404/kube-rbac-proxy-metrics/0.log"
Apr 23 19:04:33.481100 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:33.481068 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-mw95w_ed805c43-a1a5-4865-9b28-5ecd8393eece/networking-console-plugin/0.log"
Apr 23 19:04:34.336475 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.336433 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-rcfgz_5a9ac9a5-f676-42f2-9af1-a148cd6302d8/download-server/0.log"
Apr 23 19:04:34.620813 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.620724 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"]
Apr 23 19:04:34.625286 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.625259 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.637587 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.637556 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"]
Apr 23 19:04:34.686276 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.686232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krl6j\" (UniqueName: \"kubernetes.io/projected/3f9578eb-d985-4d78-b45e-848ab09f3d2a-kube-api-access-krl6j\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.686481 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.686296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-lib-modules\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.686481 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.686333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-podres\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.686481 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.686372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-proc\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.686481 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.686416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-sys\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.787221 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.787189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-podres\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.787396 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.787229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-proc\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.787396 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.787258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-sys\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.787396 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.787327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krl6j\" (UniqueName: \"kubernetes.io/projected/3f9578eb-d985-4d78-b45e-848ab09f3d2a-kube-api-access-krl6j\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.787396 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.787342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-proc\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.787396 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.787357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-lib-modules\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.787396 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.787377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-podres\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.787638 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.787412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-sys\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.787638 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.787473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f9578eb-d985-4d78-b45e-848ab09f3d2a-lib-modules\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.795363 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.795338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krl6j\" (UniqueName: \"kubernetes.io/projected/3f9578eb-d985-4d78-b45e-848ab09f3d2a-kube-api-access-krl6j\") pod \"perf-node-gather-daemonset-drjd5\" (UID: \"3f9578eb-d985-4d78-b45e-848ab09f3d2a\") " pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"
Apr 23 19:04:34.938633 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:34.938539 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5" Apr 23 19:04:35.086199 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:35.085799 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5"] Apr 23 19:04:35.089227 ip-10-0-137-157 kubenswrapper[2576]: W0423 19:04:35.089196 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3f9578eb_d985_4d78_b45e_848ab09f3d2a.slice/crio-a6dec375469afb009b4e6f24121fd58cb92563c652f7d62ad04da6f366c3c6c8 WatchSource:0}: Error finding container a6dec375469afb009b4e6f24121fd58cb92563c652f7d62ad04da6f366c3c6c8: Status 404 returned error can't find the container with id a6dec375469afb009b4e6f24121fd58cb92563c652f7d62ad04da6f366c3c6c8 Apr 23 19:04:35.358266 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:35.358231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5" event={"ID":"3f9578eb-d985-4d78-b45e-848ab09f3d2a","Type":"ContainerStarted","Data":"75db63ea944d40aeaacb897b1c21de41541e7349eb28b281b0147d98f577d29b"} Apr 23 19:04:35.358551 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:35.358529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5" event={"ID":"3f9578eb-d985-4d78-b45e-848ab09f3d2a","Type":"ContainerStarted","Data":"a6dec375469afb009b4e6f24121fd58cb92563c652f7d62ad04da6f366c3c6c8"} Apr 23 19:04:35.358740 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:35.358715 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5" Apr 23 19:04:35.375375 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:35.375331 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5" 
podStartSLOduration=1.375318634 podStartE2EDuration="1.375318634s" podCreationTimestamp="2026-04-23 19:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:04:35.373185921 +0000 UTC m=+3964.540513261" watchObservedRunningTime="2026-04-23 19:04:35.375318634 +0000 UTC m=+3964.542645981" Apr 23 19:04:35.496882 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:35.496802 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6xhj9_bbb38a01-f704-4497-b3c1-20236e4e4f23/dns/0.log" Apr 23 19:04:35.522804 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:35.522770 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6xhj9_bbb38a01-f704-4497-b3c1-20236e4e4f23/kube-rbac-proxy/0.log" Apr 23 19:04:35.692278 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:35.692248 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-x7kw5_3ce8a6d4-d062-4813-b21e-b06b4d147b13/dns-node-resolver/0.log" Apr 23 19:04:36.181769 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:36.181737 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-k8ncf_365c5763-fac0-4121-9de2-0a669a25bc8c/node-ca/0.log" Apr 23 19:04:37.284212 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:37.284161 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6pv4g_e929e6ba-de02-4dcb-affd-2772d869c2e0/serve-healthcheck-canary/0.log" Apr 23 19:04:37.692110 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:37.692072 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-85zc7_23e40766-68ea-4cb5-b83f-5a64e8740c67/insights-operator/0.log" Apr 23 19:04:37.692720 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:37.692700 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-85zc7_23e40766-68ea-4cb5-b83f-5a64e8740c67/insights-operator/1.log" Apr 23 19:04:37.781726 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:37.781697 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9gqzb_881b7aa3-8be8-4f22-9712-03c2de339ea4/kube-rbac-proxy/0.log" Apr 23 19:04:37.809934 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:37.809900 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9gqzb_881b7aa3-8be8-4f22-9712-03c2de339ea4/exporter/0.log" Apr 23 19:04:37.836581 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:37.836550 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9gqzb_881b7aa3-8be8-4f22-9712-03c2de339ea4/extractor/0.log" Apr 23 19:04:40.215735 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:40.215707 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-x89r9_95f618ae-2c6a-441e-9f67-a46e8ff85d44/server/0.log" Apr 23 19:04:40.680991 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:40.680945 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-z6n4h_cf3159ad-fdc1-42eb-9c5c-d7876a8bc0db/s3-tls-init-serving/0.log" Apr 23 19:04:40.712379 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:40.712347 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-ghlhs_56921661-c4b9-4336-8078-2ef8305fc3e8/seaweedfs/0.log" Apr 23 19:04:40.738961 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:40.738933 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-cpgb7_a91c2e04-906f-4720-b338-fb7867f30f39/seaweedfs-tls-custom/0.log" Apr 23 19:04:41.379042 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:41.379002 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-must-gather-25xv6/perf-node-gather-daemonset-drjd5" Apr 23 19:04:44.821387 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:44.821361 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d4pdb_68703c38-8262-4cea-8aa5-15f64ae66fa1/migrator/0.log" Apr 23 19:04:44.844788 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:44.844753 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-d4pdb_68703c38-8262-4cea-8aa5-15f64ae66fa1/graceful-termination/0.log" Apr 23 19:04:46.517494 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.517444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xznrt_f66c5d99-c18e-417a-b05d-439bf68ddbff/kube-multus-additional-cni-plugins/0.log" Apr 23 19:04:46.546245 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.546217 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xznrt_f66c5d99-c18e-417a-b05d-439bf68ddbff/egress-router-binary-copy/0.log" Apr 23 19:04:46.578415 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.578383 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xznrt_f66c5d99-c18e-417a-b05d-439bf68ddbff/cni-plugins/0.log" Apr 23 19:04:46.609652 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.609626 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xznrt_f66c5d99-c18e-417a-b05d-439bf68ddbff/bond-cni-plugin/0.log" Apr 23 19:04:46.647623 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.647595 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xznrt_f66c5d99-c18e-417a-b05d-439bf68ddbff/routeoverride-cni/0.log" Apr 23 19:04:46.675343 
ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.675264 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xznrt_f66c5d99-c18e-417a-b05d-439bf68ddbff/whereabouts-cni-bincopy/0.log" Apr 23 19:04:46.702008 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.701974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xznrt_f66c5d99-c18e-417a-b05d-439bf68ddbff/whereabouts-cni/0.log" Apr 23 19:04:46.795351 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.795319 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vrgrj_f6a1e63d-4ac5-4843-8b2f-4842157bdd00/kube-multus/0.log" Apr 23 19:04:46.878022 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.877989 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q98mx_ab8ad387-1bfb-42ab-ad18-c8ea20362f8f/network-metrics-daemon/0.log" Apr 23 19:04:46.904994 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:46.904958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q98mx_ab8ad387-1bfb-42ab-ad18-c8ea20362f8f/kube-rbac-proxy/0.log" Apr 23 19:04:47.881380 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:47.881349 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-controller/0.log" Apr 23 19:04:47.904728 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:47.904701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/0.log" Apr 23 19:04:47.942560 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:47.942524 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovn-acl-logging/1.log" Apr 
23 19:04:47.971066 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:47.971040 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/kube-rbac-proxy-node/0.log" Apr 23 19:04:48.000300 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:48.000273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 19:04:48.041855 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:48.041830 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/northd/0.log" Apr 23 19:04:48.078739 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:48.078712 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/nbdb/0.log" Apr 23 19:04:48.110992 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:48.110963 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/sbdb/0.log" Apr 23 19:04:48.315196 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:48.315162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7xtlp_07f7ffd8-67f5-4806-a74c-8c1b4961ac85/ovnkube-controller/0.log" Apr 23 19:04:50.057190 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:50.057159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-58nsw_8f6f3666-ed32-45c7-8804-c3a7951d4815/network-check-target-container/0.log" Apr 23 19:04:51.003620 ip-10-0-137-157 kubenswrapper[2576]: I0423 19:04:51.003591 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-operator_iptables-alerter-fhvhd_b7dafa22-aab9-4274-a348-a27afa11e470/iptables-alerter/0.log"